[DOCS] Add local dev setup instructions (#107913)

* [DOCS] Add local dev setup instructions

- Replace the existing "Run ES in Docker locally" page with a simpler, no-security local dev setup
- Move this file into the Quickstart folder, alongside the existing quickstart guide
- Update the self-managed instructions in the Quickstart guide to use the local dev approach
Liam Thompson 2024-05-07 18:10:48 +02:00 committed by GitHub
parent 6e7afa04b4
commit d0f4966431
12 changed files with 202 additions and 254 deletions

@@ -20,7 +20,7 @@ If you want to install and manage {es} yourself, you can:
* Run {es} in a <<elasticsearch-docker-images,Docker container>>.
* Set up and manage {es}, {kib}, {agent}, and the rest of the Elastic Stack on Kubernetes with {eck-ref}[{eck}].
TIP: To try out Elasticsearch on your own machine, we recommend using Docker and running both Elasticsearch and Kibana. For more information, see <<run-elasticsearch-locally,Run Elasticsearch locally>>.
TIP: To try out Elasticsearch on your own machine, we recommend using Docker and running both Elasticsearch and Kibana. For more information, see <<run-elasticsearch-locally,Run Elasticsearch locally>>. Please note that this setup is *not suitable for production use*.
[discrete]
[[elasticsearch-install-packages]]

@@ -8,6 +8,12 @@ https://github.com/elastic/elasticsearch/blob/{branch}/distribution/docker[GitHu
include::license.asciidoc[]
[TIP]
====
If you just want to test {es} in local development, refer to <<run-elasticsearch-locally>>.
Please note that this setup is not suitable for production environments.
====
[[docker-cli-run-dev-mode]]
==== Run {es} in Docker

@@ -1,183 +0,0 @@
[[run-elasticsearch-locally]]
== Run Elasticsearch locally
////
IMPORTANT: This content is replicated in the Elasticsearch repo
README.asciidoc file. If you make changes, you must also update the
Elasticsearch README.
+
GitHub renders the tagged region directives when you view the README,
so it's not possible to just include the content from the README. Darn.
+
Also note that there are similar instructions in the Kibana guide:
https://www.elastic.co/guide/en/kibana/current/docker.html
////
To try out Elasticsearch on your own machine, we recommend using Docker
and running both Elasticsearch and Kibana.
Docker images are available from the https://www.docker.elastic.co[Elastic Docker registry].
NOTE: Starting in Elasticsearch 8.0, security is enabled by default.
The first time you start Elasticsearch, TLS encryption is configured automatically,
a password is generated for the `elastic` user,
and a Kibana enrollment token is created so you can connect Kibana to your secured cluster.
For other installation options, see the
https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html[Elasticsearch installation documentation].
[discrete]
=== Start Elasticsearch
. Install and start https://www.docker.com/products/docker-desktop[Docker
Desktop]. Go to **Preferences > Resources > Advanced** and set Memory to at least 4GB.
. Start an Elasticsearch container:
ifeval::["{release-state}"=="unreleased"]
+
WARNING: Version {version} of {es} has not yet been released, so no
Docker image is currently available for this version.
endif::[]
+
[source,sh,subs="attributes"]
----
docker network create elastic
docker pull docker.elastic.co/elasticsearch/elasticsearch:{version}
docker run --name elasticsearch --net elastic -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" -t docker.elastic.co/elasticsearch/elasticsearch:{version}
----
+
When you start Elasticsearch for the first time, the generated `elastic` user password and
Kibana enrollment token are output to the terminal.
+
NOTE: You might need to scroll back a bit in the terminal to view the password
and enrollment token.
. Copy the generated password and enrollment token and save them in a secure
location. These values are shown only when you start Elasticsearch for the first time.
You'll use these to enroll Kibana with your Elasticsearch cluster and log in.
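If you lose the generated password, you can reset it from inside the running container. A minimal sketch using the `elasticsearch-reset-password` tool, assuming the container name `elasticsearch` from the `docker run` command above:
[source,sh]
----
# Prints a freshly generated password for the elastic user
docker exec -it elasticsearch /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic
----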
[discrete]
=== Start Kibana
Kibana enables you to easily send requests to Elasticsearch and analyze, visualize, and manage data interactively.
. In a new terminal session, start Kibana and connect it to your Elasticsearch container:
ifeval::["{release-state}"=="unreleased"]
+
WARNING: Version {version} of {kib} has not yet been released, so no
Docker image is currently available for this version.
endif::[]
+
[source,sh,subs="attributes"]
----
docker pull docker.elastic.co/kibana/kibana:{version}
docker run --name kibana --net elastic -p 5601:5601 docker.elastic.co/kibana/kibana:{version}
----
+
When you start Kibana, a unique URL is output to your terminal.
. To access Kibana, open the generated URL in your browser.
.. Paste the enrollment token that you copied when starting
Elasticsearch and click the button to connect your Kibana instance with Elasticsearch.
.. Log in to Kibana as the `elastic` user with the password that was generated
when you started Elasticsearch.
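Enrollment tokens are only valid for a limited time, so the token printed at startup may have expired by the time you enroll Kibana. A sketch for generating a fresh one, again assuming the `elasticsearch` container name used above:
[source,sh]
----
# Create a new enrollment token scoped to Kibana
docker exec -it elasticsearch /usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana
----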
[discrete]
=== Send requests to Elasticsearch
You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].
Kibana's developer console provides an easy way to experiment and test requests.
To access the console, go to **Management > Dev Tools**.
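You can also talk to the cluster directly from the command line. Because TLS is configured automatically, curl needs the generated HTTP CA certificate. A minimal sketch, assuming the `elasticsearch` container name used above and the generated password exported as `ELASTIC_PASSWORD` (an environment variable assumed here, not set by Elasticsearch itself):
[source,sh]
----
# Copy the auto-generated CA certificate out of the container
docker cp elasticsearch:/usr/share/elasticsearch/config/certs/http_ca.crt .

# export ELASTIC_PASSWORD="<password printed at first startup>"
# A successful response returns basic cluster and version info
curl --cacert http_ca.crt -u elastic:$ELASTIC_PASSWORD https://localhost:9200
----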
[discrete]
=== Add data
You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.
For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.
To add a single document to an index, submit an HTTP POST request that targets the index.
[source,console]
----
POST /customer/_doc/1
{
"firstname": "Jennifer",
"lastname": "Walters"
}
----
This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.
The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:
[source,console]
----
GET /customer/_doc/1
----
// TEST[continued]
To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.
[source,console]
----
PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica","lastname":"Rambeau"}
{ "create": { } }
{ "firstname": "Carol","lastname":"Danvers"}
{ "create": { } }
{ "firstname": "Wanda","lastname":"Maximoff"}
{ "create": { } }
{ "firstname": "Jennifer","lastname":"Takeda"}
----
// TEST[continued]
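The same bulk request works from the command line. A sketch with curl, reusing the `http_ca.crt` and `ELASTIC_PASSWORD` from the earlier steps and a hypothetical `customers.ndjson` file containing the NDJSON lines above (including the trailing newline):
[source,sh]
----
# --data-binary preserves the newlines that the _bulk API requires
curl --cacert http_ca.crt -u elastic:$ELASTIC_PASSWORD \
  -X PUT "https://localhost:9200/customer/_bulk" \
  -H "Content-Type: application/x-ndjson" \
  --data-binary @customers.ndjson
----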
[discrete]
=== Search
Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.
[source,console]
----
GET customer/_search
{
"query" : {
"match" : { "firstname": "Jennifer" }
}
}
----
// TEST[continued]
[discrete]
=== Explore
You can use Discover in Kibana to interactively search and filter your data.
From there, you can start creating visualizations and building and sharing dashboards.
To get started, create a _data view_ that connects to one or more Elasticsearch indices,
data streams, or index aliases.
. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.
To start exploring, go to **Analytics > Discover**.