mirror of
https://github.com/elastic/elasticsearch.git
synced 2025-04-17 20:05:09 -04:00
[Build] Remove mention of ./gradlew assemble for building all distributions (#116400)
This causes problems because `./gradlew assemble` also builds our Wolfi image, which is based on a base image that is not publicly available, resulting in build failures for users without access to the Docker registry. We should avoid suggesting `./gradlew assemble`, as it builds a ton of stuff unrelated to any distribution. Instead, people should be pointed to `./gradlew localDistro`.
This commit is contained in:
parent
e38aa7e8f1
commit
35c6b60c77
2 changed files with 45 additions and 39 deletions
@@ -4,7 +4,7 @@ Elasticsearch is a distributed search and analytics engine, scalable data store

Use cases enabled by Elasticsearch include:

* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
@@ -17,7 +17,7 @@ Use cases enabled by Elasticsearch include:
To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

For https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], see https://www.elastic.co/search-labs[Search Labs].

[[get-started]]
== Get started
@@ -27,20 +27,20 @@ https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally

////
IMPORTANT: This content is replicated in the Elasticsearch repo. See `run-elasticsearch-locally.asciidoc`.
Ensure both files are in sync.

https://github.com/elastic/start-local is the source of truth.
////

[WARNING]
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.

This setup is intended for local development and testing only.
@@ -93,7 +93,7 @@ Use this key to connect to Elasticsearch with a https://www.elastic.co/guide/en/
From the `elastic-start-local` folder, check the connection to Elasticsearch using `curl`:

[source,sh]
----
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
----
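As an aside (not part of the commit): the `curl` check above can be mirrored in plain Python. The `ES_LOCAL_URL` and `ES_LOCAL_API_KEY` names come from the snippet's `.env` file; the parsing here is a simplified sketch, not how start-local actually reads the file.

```python
# Aside (not part of the commit): the curl check above, mirrored in plain
# Python. The variable names ES_LOCAL_URL / ES_LOCAL_API_KEY come from the
# start-local .env file; this parser is a deliberately simplified sketch.
from urllib.request import Request

def load_env(text):
    """Parse simple KEY=value lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

def build_request(env):
    # Same header shape the curl command sends: "Authorization: ApiKey <key>"
    return Request(
        env["ES_LOCAL_URL"],
        headers={"Authorization": "ApiKey " + env["ES_LOCAL_API_KEY"]},
    )
```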
@@ -101,12 +101,12 @@ curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"

=== Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].

==== Using curl

Here's an example curl command to create a new Elasticsearch index, using basic auth:
@@ -149,19 +149,19 @@ print(client.info())

==== Using the Dev Tools Console

Kibana's developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.

To add a single document to an index, submit an HTTP POST request that targets the index.

----
POST /customer/_doc/1
@@ -171,11 +171,11 @@ POST /customer/_doc/1
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
@@ -183,7 +183,7 @@ GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.
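As an aside (not part of the diff): the NDJSON rule above can be sketched in Python. The `customer` index and the example documents are illustrative assumptions, not taken from the commit.

```python
import json

def to_bulk_ndjson(index, docs):
    """Build a _bulk request body: one action line plus one source line per
    document, newline-delimited, with a newline after the last line too."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # the final line must also end in \n
```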
----
@@ -200,15 +200,15 @@ PUT customer/_bulk

**Search**

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.

----
GET customer/_search
{
  "query" : {
    "match" : { "firstname": "Jennifer" }
  }
}
----
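An illustrative aside, not part of the diff: a drastically simplified, in-memory approximation of what this match query asks Elasticsearch to do. Real matching runs through analyzers, an inverted index, and relevance scoring.

```python
# Aside, illustration only: a toy, in-memory version of the match query
# above. Real Elasticsearch matching uses analyzers, an inverted index,
# and relevance scoring; this sketch only lowercases and splits on spaces.
def match(docs, field, value):
    wanted = value.lower()
    return [d for d in docs if wanted in str(d.get(field, "")).lower().split()]
```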
@@ -223,9 +223,9 @@ data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.
@@ -254,11 +254,6 @@ To build a distribution for another platform, run the related command:
./gradlew :distribution:archives:windows-zip:assemble
----
-
-To build distributions for all supported platforms, run:
-----
-./gradlew assemble
-----

Distributions are output to `distribution/archives`.

To run the test suite, see xref:TESTING.asciidoc[TESTING].
@@ -281,7 +276,7 @@ The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo con
[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].

[[questions]]
== Questions? Problems? Suggestions?
25
build.gradle
@@ -13,14 +13,13 @@ import com.avast.gradle.dockercompose.tasks.ComposePull
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.ObjectMapper

import org.elasticsearch.gradle.DistributionDownloadPlugin
import org.elasticsearch.gradle.Version
import org.elasticsearch.gradle.internal.BaseInternalPluginBuildPlugin
import org.elasticsearch.gradle.internal.ResolveAllDependencies
import org.elasticsearch.gradle.internal.info.BuildParams
import org.elasticsearch.gradle.util.GradleUtils
import org.gradle.plugins.ide.eclipse.model.AccessRule
import org.gradle.plugins.ide.eclipse.model.ProjectDependency
-import org.elasticsearch.gradle.DistributionDownloadPlugin

import java.nio.file.Files
@@ -89,7 +88,7 @@ class ListExpansion {

// Filters out intermediate patch releases to reduce the load of CI testing
def filterIntermediatePatches = { List<Version> versions ->
-  versions.groupBy {"${it.major}.${it.minor}"}.values().collect {it.max()}
+  versions.groupBy { "${it.major}.${it.minor}" }.values().collect { it.max() }
}

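An aside for illustration (not part of the build script): the closure's group-by-`major.minor`, keep-the-max logic, mirrored in Python with a version modeled as a `(major, minor, patch)` tuple, which is an assumption of this sketch.

```python
# Aside (not part of the build script): the Groovy closure's logic in
# Python, with a Version modeled as a (major, minor, patch) tuple -- an
# assumption made for this sketch.
from collections import defaultdict

def filter_intermediate_patches(versions):
    """Group versions by (major, minor) and keep only the highest patch in
    each group, like groupBy { "major.minor" } ... collect { it.max() }."""
    by_minor = defaultdict(list)
    for v in versions:
        by_minor[(v[0], v[1])].append(v)
    return sorted(max(group) for group in by_minor.values())
```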
tasks.register("updateCIBwcVersions") {
@@ -101,7 +100,10 @@ tasks.register("updateCIBwcVersions") {
  }
}

-def writeBuildkitePipeline = { String outputFilePath, String pipelineTemplatePath, List<ListExpansion> listExpansions, List<StepExpansion> stepExpansions = [] ->
+def writeBuildkitePipeline = { String outputFilePath,
+                               String pipelineTemplatePath,
+                               List<ListExpansion> listExpansions,
+                               List<StepExpansion> stepExpansions = [] ->
  def outputFile = file(outputFilePath)
  def pipelineTemplate = file(pipelineTemplatePath)
@@ -132,7 +134,12 @@ tasks.register("updateCIBwcVersions") {
  // Writes a Buildkite pipeline from a template, and replaces $BWC_STEPS with a list of steps, one for each version
  // Useful when you need to configure more versions than are allowed in a matrix configuration
  def expandBwcSteps = { String outputFilePath, String pipelineTemplatePath, String stepTemplatePath, List<Version> versions ->
-    writeBuildkitePipeline(outputFilePath, pipelineTemplatePath, [], [new StepExpansion(templatePath: stepTemplatePath, versions: versions, variable: "BWC_STEPS")])
+    writeBuildkitePipeline(
+      outputFilePath,
+      pipelineTemplatePath,
+      [],
+      [new StepExpansion(templatePath: stepTemplatePath, versions: versions, variable: "BWC_STEPS")]
+    )
  }

  doLast {
@ -150,7 +157,11 @@ tasks.register("updateCIBwcVersions") {
|
|||
new ListExpansion(versions: filterIntermediatePatches(BuildParams.bwcVersions.unreleasedIndexCompatible), variable: "BWC_LIST"),
|
||||
],
|
||||
[
|
||||
new StepExpansion(templatePath: ".buildkite/pipelines/periodic.bwc.template.yml", versions: filterIntermediatePatches(BuildParams.bwcVersions.indexCompatible), variable: "BWC_STEPS"),
|
||||
new StepExpansion(
|
||||
templatePath: ".buildkite/pipelines/periodic.bwc.template.yml",
|
||||
versions: filterIntermediatePatches(BuildParams.bwcVersions.indexCompatible),
|
||||
variable: "BWC_STEPS"
|
||||
),
|
||||
]
|
||||
)
|
||||
|
||||
|
@@ -302,7 +313,7 @@ allprojects {
      if (project.path.startsWith(":x-pack:")) {
        if (project.path.contains("security") || project.path.contains(":ml")) {
          tasks.register('checkPart4') { dependsOn 'check' }
        } else if (project.path == ":x-pack:plugin" || project.path.contains("ql") || project.path.contains("smoke-test")) {
          tasks.register('checkPart3') { dependsOn 'check' }
        } else if (project.path.contains("multi-node")) {
          tasks.register('checkPart5') { dependsOn 'check' }