[DOCS] Add local dev setup instructions (#107913)
* Replace existing Run ES in Docker locally page, with simpler no-security local dev setup
* Move this file into Quickstart folder, along with existing quickstart guide
* Update self-managed instructions in Quickstart guide to use local dev approach
This commit is contained in:
parent 6e7afa04b4
commit d0f4966431
12 changed files with 202 additions and 254 deletions
284
docs/reference/quickstart/getting-started.asciidoc
Normal file
@@ -0,0 +1,284 @@
[[getting-started]]
== Quick start guide

This guide helps you learn how to:

* run {es} and {kib} (using {ecloud} or in a local Docker dev environment),
* add a simple (non-timestamped) dataset to {es},
* run basic searches.

[TIP]
====
If you're interested in using {es} with Python, check out Elastic Search Labs. This is the best place to explore AI-powered search use cases, such as working with embeddings, vector search, and retrieval augmented generation (RAG).

* https://www.elastic.co/search-labs/tutorials/search-tutorial/welcome[Tutorial]: walks you through building a complete search solution with {es}, from the ground up.
* https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs` repository]: contains a range of Python https://github.com/elastic/elasticsearch-labs/tree/main/notebooks[notebooks] and https://github.com/elastic/elasticsearch-labs/tree/main/example-apps[example apps].
====

[discrete]
[[run-elasticsearch]]
=== Run {es}

The simplest way to set up {es} is to create a managed deployment with {ess} on {ecloud}. If you prefer to manage your own test environment, install and run {es} using Docker.

include::{es-ref-dir}/tab-widgets/code.asciidoc[]
include::{es-ref-dir}/tab-widgets/quick-start-install-widget.asciidoc[]

[discrete]
[[send-requests-to-elasticsearch]]
=== Send requests to {es}

You send data and other requests to {es} using REST APIs. This lets you interact with {es} using any client that sends HTTP requests, such as https://curl.se[curl]. You can also use {kib}'s Console to send requests to {es}.

include::{es-ref-dir}/tab-widgets/api-call-widget.asciidoc[]

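For example, if you're running the <<run-elasticsearch-locally,local Docker dev setup>> (HTTP on `localhost:9200`, basic authentication with the `elastic` user), a quick way to confirm the cluster is reachable from the command line might look like the sketch below. The endpoint and credentials are assumptions; adjust them for your own deployment, and on {ecloud} use your deployment's endpoint and credentials instead.

[source,sh]
----
# Ask the cluster for basic version and cluster name information.
# Assumes ELASTIC_PASSWORD holds the password for the `elastic` user.
curl -u elastic:$ELASTIC_PASSWORD http://localhost:9200/
----
// NOTCONSOLE
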
[discrete]
[[add-data]]
=== Add data

You add data to {es} as JSON objects called documents. {es} stores these documents in searchable indices.

[discrete]
[[add-single-document]]
==== Add a single document

Submit the following indexing request to add a single document to the `books` index.
The request automatically creates the index.

////
[source,console]
----
PUT books
----
// TESTSETUP
////

[source,console]
----
POST books/_doc
{"name": "Snow Crash", "author": "Neal Stephenson", "release_date": "1992-06-01", "page_count": 470}
----
// TEST[s/_doc/_doc?refresh=wait_for/]

The response includes metadata that {es} generates for the document, including a unique `_id` for the document within the index.

.Expand to see example response
[%collapsible]
===============
[source,console-result]
----
{
  "_index": "books",
  "_id": "O0lG2IsBaSa7VYx_rEia",
  "_version": 1,
  "result": "created",
  "_shards": {
    "total": 2,
    "successful": 2,
    "failed": 0
  },
  "_seq_no": 0,
  "_primary_term": 1
}
----
// TEST[skip:TODO]
===============

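If you're working from the command line rather than {kib}'s Console, the same indexing request can be sent with curl. This sketch assumes the <<run-elasticsearch-locally,local Docker dev setup>> (HTTP on `localhost:9200`, `elastic` user with the password in `$ELASTIC_PASSWORD`); adjust the endpoint and credentials for your own deployment.

[source,sh]
----
# Index one document into the `books` index; the index is created on first use.
curl -u elastic:$ELASTIC_PASSWORD \
  -X POST \
  http://localhost:9200/books/_doc \
  -H 'Content-Type: application/json' \
  -d '{"name": "Snow Crash", "author": "Neal Stephenson", "release_date": "1992-06-01", "page_count": 470}'
----
// NOTCONSOLE
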
[discrete]
[[add-multiple-documents]]
==== Add multiple documents

Use the `_bulk` endpoint to add multiple documents in one request. Bulk data must be newline-delimited JSON (NDJSON). Each line must end in a newline character (`\n`), including the last line.

[source,console]
----
POST /_bulk
{ "index" : { "_index" : "books" } }
{"name": "Revelation Space", "author": "Alastair Reynolds", "release_date": "2000-03-15", "page_count": 585}
{ "index" : { "_index" : "books" } }
{"name": "1984", "author": "George Orwell", "release_date": "1985-06-01", "page_count": 328}
{ "index" : { "_index" : "books" } }
{"name": "Fahrenheit 451", "author": "Ray Bradbury", "release_date": "1953-10-15", "page_count": 227}
{ "index" : { "_index" : "books" } }
{"name": "Brave New World", "author": "Aldous Huxley", "release_date": "1932-06-01", "page_count": 268}
{ "index" : { "_index" : "books" } }
{"name": "The Handmaids Tale", "author": "Margaret Atwood", "release_date": "1985-06-01", "page_count": 311}
----
// TEST[continued]

You should receive a response indicating there were no errors.

.Expand to see example response
[%collapsible]
===============
[source,console-result]
----
{
  "errors": false,
  "took": 29,
  "items": [
    {
      "index": {
        "_index": "books",
        "_id": "QklI2IsBaSa7VYx_Qkh-",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 1,
        "_primary_term": 1,
        "status": 201
      }
    },
    {
      "index": {
        "_index": "books",
        "_id": "Q0lI2IsBaSa7VYx_Qkh-",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 2,
        "_primary_term": 1,
        "status": 201
      }
    },
    {
      "index": {
        "_index": "books",
        "_id": "RElI2IsBaSa7VYx_Qkh-",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 3,
        "_primary_term": 1,
        "status": 201
      }
    },
    {
      "index": {
        "_index": "books",
        "_id": "RUlI2IsBaSa7VYx_Qkh-",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 4,
        "_primary_term": 1,
        "status": 201
      }
    },
    {
      "index": {
        "_index": "books",
        "_id": "RklI2IsBaSa7VYx_Qkh-",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 5,
        "_primary_term": 1,
        "status": 201
      }
    }
  ]
}
----
// TEST[skip:TODO]
===============

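To send the same bulk request with curl, it's usually easiest to save the NDJSON body to a file so the trailing newline is preserved. This sketch assumes the <<run-elasticsearch-locally,local Docker dev setup>> and a hypothetical `books.ndjson` file containing the action and document lines shown above.

[source,sh]
----
# Bulk-index the documents from a newline-delimited JSON file.
# --data-binary preserves the newlines that the _bulk API requires;
# books.ndjson must end with a newline character.
curl -u elastic:$ELASTIC_PASSWORD \
  -X POST \
  http://localhost:9200/_bulk \
  -H 'Content-Type: application/x-ndjson' \
  --data-binary @books.ndjson
----
// NOTCONSOLE
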
[discrete]
[[qs-search-data]]
=== Search data

Indexed documents are available for search in near real-time.

[discrete]
[[search-all-documents]]
==== Search all documents

Run the following command to search the `books` index for all documents:

[source,console]
----
GET books/_search
----
// TEST[continued]

The `_source` of each hit contains the original JSON object submitted during indexing.

[discrete]
[[qs-match-query]]
==== `match` query

You can use the `match` query to search for documents that contain a specific value in a specific field. This is the standard query for performing full-text search, including fuzzy matching and phrase searches.

Run the following command to search the `books` index for documents containing `brave` in the `name` field:

[source,console]
----
GET books/_search
{
  "query": {
    "match": {
      "name": "brave"
    }
  }
}
----
// TEST[continued]

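The same search can be run outside the Console with curl by passing the query as a JSON body. As above, this sketch assumes the <<run-elasticsearch-locally,local Docker dev setup>>; swap in your own endpoint and credentials as needed.

[source,sh]
----
# Full-text search for documents whose `name` field matches "brave".
curl -u elastic:$ELASTIC_PASSWORD \
  -X GET \
  http://localhost:9200/books/_search \
  -H 'Content-Type: application/json' \
  -d '{"query": {"match": {"name": "brave"}}}'
----
// NOTCONSOLE
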
[discrete]
[[whats-next]]
=== Next steps

Now that {es} is up and running and you've learned the basics, you'll probably want to test out larger datasets or index your own data.

[discrete]
[[whats-next-search-learn-more]]
==== Learn more about search queries

* <<search-with-elasticsearch>>. Jump here to learn about exact value search, full-text search, vector search, and more, using the <<search-search,search API>>.

[discrete]
[[whats-next-more-data]]
==== Add more data

* Learn how to {kibana-ref}/sample-data.html[install sample data] using {kib}. This is a quick way to test out {es} on larger workloads.
* Learn how to use the {kibana-ref}/connect-to-elasticsearch.html#upload-data-kibana[upload data UI] in {kib} to add your own CSV, TSV, or JSON files.
* Use the https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html[bulk API] to ingest your own datasets to {es}.

[discrete]
[[whats-next-client-libraries]]
==== {es} programming language clients

* Check out our https://www.elastic.co/guide/en/elasticsearch/client/index.html[client library] to work with your {es} instance in your preferred programming language.
* If you're using Python, check out https://www.elastic.co/search-labs[Elastic Search Labs] for a range of examples that use the {es} Python client. This is the best place to explore AI-powered search use cases, such as working with embeddings, vector search, and retrieval augmented generation (RAG).
** This extensive, hands-on https://www.elastic.co/search-labs/tutorials/search-tutorial/welcome[tutorial] walks you through building a complete search solution with {es}, from the ground up.
** https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] contains a range of executable Python https://github.com/elastic/elasticsearch-labs/tree/main/notebooks[notebooks] and https://github.com/elastic/elasticsearch-labs/tree/main/example-apps[example apps].
10
docs/reference/quickstart/index.asciidoc
Normal file
@@ -0,0 +1,10 @@
[[quickstart]]
= Quickstart

Get started quickly with {es}.

* Learn how to run {es} (and {kib}) for <<run-elasticsearch-locally,local development>>.
* Follow our <<getting-started,Quickstart guide>> to add data to {es} and query it.

include::run-elasticsearch-locally.asciidoc[]
include::getting-started.asciidoc[]
177
docs/reference/quickstart/run-elasticsearch-locally.asciidoc
Normal file
@@ -0,0 +1,177 @@
[[run-elasticsearch-locally]]
== Run {es} locally in Docker (without security)
++++
<titleabbrev>Local dev setup (Docker)</titleabbrev>
++++

[WARNING]
====
*DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS*

The instructions on this page are for *local development only*. Do not use these instructions for production deployments, because they are not secure. While this approach is convenient for experimenting and learning, you should never run the service in this way in a production environment.

Refer to https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html[Install {es}] to learn about the various options for installing {es} in a production environment, including using Docker.
====

The following commands help you very quickly spin up a single-node {es} cluster, together with {kib}, in Docker.
If you don't need the {kib} UI, you can skip the {kib} steps.

[discrete]
[[local-dev-why]]
=== When would I use this setup?

Use this setup if you want to quickly spin up {es} (and {kib}) for local development or testing.

For example, you might:

* Want to run a quick test to see how a feature works.
* Follow a tutorial or guide that requires an {es} cluster, like our <<getting-started,quick start guide>>.
* Experiment with the {es} APIs using different tools, like the Dev Tools Console, cURL, or an Elastic programming language client.
* Quickly spin up an {es} cluster to test an executable https://github.com/elastic/elasticsearch-labs/tree/main/notebooks#readme[Python notebook] locally.

[discrete]
[[local-dev-prerequisites]]
=== Prerequisites

If you don't have Docker installed, https://www.docker.com/products/docker-desktop[download and install Docker Desktop] for your operating system.

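As an optional sanity check before continuing, you can confirm that the Docker daemon is up and reachable; any recent Docker Desktop or Docker Engine installation should work.

[source,sh]
----
# Prints client and server versions; fails if the Docker daemon isn't running.
docker version
----
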
[discrete]
[[local-dev-env-vars]]
=== Set environment variables

Configure the following environment variables.

[source,sh]
----
export ELASTIC_PASSWORD="<ES_PASSWORD>" # password for "elastic" username
export KIBANA_PASSWORD="<KIB_PASSWORD>" # Used _internally_ by Kibana, must be at least 6 characters long
----

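The placeholder values are up to you. If you'd rather not invent passwords by hand, one option (an illustration, not part of the required steps) is to generate random values:

[source,sh]
----
# Generate throwaway passwords for local development only.
export ELASTIC_PASSWORD="$(openssl rand -base64 16)"
export KIBANA_PASSWORD="$(openssl rand -base64 16)" # well above the 6-character minimum
----
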
[discrete]
[[local-dev-create-docker-network]]
=== Create a Docker network

To run both {es} and {kib}, you'll need to create a Docker network:

[source,sh]
----
docker network create elastic-net
----

[discrete]
[[local-dev-run-es]]
=== Run {es}

Start the {es} container with the following command:

ifeval::["{release-state}"=="unreleased"]
WARNING: Version {version} has not yet been released.
No Docker image is currently available for {es} {version}.
endif::[]

[source,sh,subs="attributes"]
----
docker run -p 127.0.0.1:9200:9200 -d --name elasticsearch --network elastic-net \
  -e ELASTIC_PASSWORD=$ELASTIC_PASSWORD \
  -e "discovery.type=single-node" \
  -e "xpack.security.http.ssl.enabled=false" \
  -e "xpack.license.self_generated.type=trial" \
  {docker-image}
----

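Once the container is running, you can verify that {es} is responding with a quick request. This check is a suggestion rather than part of the setup; it assumes the endpoint and password configured above.

[source,sh]
----
# The cluster may take a few seconds to start accepting connections.
curl -u elastic:$ELASTIC_PASSWORD http://localhost:9200/
----
// NOTCONSOLE
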
[discrete]
[[local-dev-run-kib]]
=== Run {kib} (optional)

To run {kib}, you must first set the `kibana_system` password in the {es} container.

[source,sh,subs="attributes"]
----
# configure the Kibana password in the ES container
curl -u elastic:$ELASTIC_PASSWORD \
  -X POST \
  http://localhost:9200/_security/user/kibana_system/_password \
  -d '{"password":"'"$KIBANA_PASSWORD"'"}' \
  -H 'Content-Type: application/json'
----
// NOTCONSOLE

Start the {kib} container with the following command:

ifeval::["{release-state}"=="unreleased"]
WARNING: Version {version} has not yet been released.
No Docker image is currently available for {es} {version}.
endif::[]

[source,sh,subs="attributes"]
----
docker run -p 127.0.0.1:5601:5601 -d --name kibana --network elastic-net \
  -e ELASTICSEARCH_URL=http://elasticsearch:9200 \
  -e ELASTICSEARCH_HOSTS=http://elasticsearch:9200 \
  -e ELASTICSEARCH_USERNAME=kibana_system \
  -e ELASTICSEARCH_PASSWORD=$KIBANA_PASSWORD \
  -e "xpack.security.enabled=false" \
  -e "xpack.license.self_generated.type=trial" \
  {kib-docker-image}
----

[NOTE]
====
The service is started with a trial license. The trial license enables all features of {es} for a trial period of 30 days. After the trial period expires, the license is downgraded to a basic license, which is free forever. If you prefer to skip the trial and use the basic license, set the value of the `xpack.license.self_generated.type` variable to `basic` instead. For a detailed feature comparison between the different licenses, refer to our https://www.elastic.co/subscriptions[subscriptions page].
====

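When the {kib} container finishes starting (this can take a minute or so), it listens on `http://localhost:5601`. A quick way to check that it's responding is sketched below; you can then open that URL in your browser and, if prompted to log in, use the `elastic` user with the value of `$ELASTIC_PASSWORD`.

[source,sh]
----
# Returns an HTTP status line once Kibana is ready to serve requests.
curl -s -I http://localhost:5601 | head -n 1
----
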
[discrete]
[[local-dev-connecting-clients]]
=== Connecting to {es} with language clients

To connect to the {es} cluster from a language client, you can use basic authentication with the `elastic` username and the password you set in the environment variable.

You'll use the following connection details:

* **{es} endpoint**: `http://localhost:9200`
* **Username**: `elastic`
* **Password**: `$ELASTIC_PASSWORD` (the value you set in the environment variable)

For example, to connect with the Python `elasticsearch` client:

[source,python]
----
import os
from elasticsearch import Elasticsearch

username = 'elastic'
password = os.getenv('ELASTIC_PASSWORD') # Value you set in the environment variable

client = Elasticsearch(
    "http://localhost:9200",
    basic_auth=(username, password)
)

print(client.info())
----

Here's an example curl command using basic authentication:

[source,sh,subs="attributes"]
----
# Create an index named `my-new-index`
curl -u elastic:$ELASTIC_PASSWORD \
  -X PUT \
  http://localhost:9200/my-new-index \
  -H 'Content-Type: application/json'
----
// NOTCONSOLE

[discrete]
[[local-dev-next-steps]]
=== Next steps

Use our <<getting-started,quick start guide>> to learn the basics of {es}: how to add data and query it.

[discrete]
[[local-dev-production]]
=== Moving to production

This setup is not suitable for production use. For production deployments, we recommend using our managed service on Elastic Cloud. https://cloud.elastic.co/registration[Sign up for a free trial] (no credit card required).

Otherwise, refer to https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html[Install {es}] to learn about the various options for installing {es} in a self-managed production environment, including using Docker.