Updates tutorial to add geopoint mapping for logs dataset.
Other fixes to dataset import commands.
This commit is contained in:
parent c8a61626c6
commit cfd4bcbd8e
2 changed files with 27 additions and 6 deletions
@@ -102,8 +102,6 @@ Move the cursor to the bottom right corner of the container until the cursor cha
cursor changes, click and drag the corner of the container to change the container's size. Release the mouse button to
confirm the new container size.
// enhancement request: a way to specify specific dimensions for a container in pixels, or at least display that info?
[float]
[[removing-containers]]
==== Removing Containers
@@ -30,7 +30,7 @@ The tutorials in this section rely on the following data sets:
* A set of fictitious accounts with randomly generated data. Download this data set by clicking here:
https://github.com/bly2k/files/blob/master/accounts.zip?raw=true[accounts.zip]
* A set of randomly generated log files. Download this data set by clicking here:
https://download.elastic.co/demos/kibana/gettingstarted/logs.jsonl.gz[logstash.jsonl.gz]
https://download.elastic.co/demos/kibana/gettingstarted/logs.jsonl.gz[logs.jsonl.gz]
Two of the data sets are compressed. Use the following commands to extract the files:
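For the two compressed archives named above, the extraction step amounts to something like the following sketch, assuming the standard `unzip` and `gunzip` utilities are available:

[source,shell]
unzip accounts.zip
gunzip logs.jsonl.gz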
@@ -105,13 +105,36 @@ there are multiple words in the field.
* The same applies to the _play_name_ field.
* The line_id and speech_number fields are integers.
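Taken together, those notes translate into an index mapping along the lines of the sketch below. This is only an illustration: it covers just the fields named in this hunk, the `_default_` mapping type is a placeholder, and the `not_analyzed` string setting is an assumption about what "the same applies" refers to, written in the same pre-5.x syntax as the `geo_point` command further down:

[source,shell]
# Sketch only: field list taken from the bullets above; not_analyzed and _default_ are assumptions.
curl -XPUT http://localhost:9200/shakespeare -d '
{
  "mappings" : {
    "_default_" : {
      "properties" : {
        "play_name" : { "type" : "string", "index" : "not_analyzed" },
        "line_id" : { "type" : "integer" },
        "speech_number" : { "type" : "integer" }
      }
    }
  }
}
';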
The accounts and logstash data sets don't require any mappings, so at this point we're ready to load the data sets into
Elasticsearch with the following commands:
The logs data set requires a mapping to label the latitude/longitude pairs in the logs as geographic locations by
applying the `geo_point` type to those fields.
Use the following command to establish a `geo_point` mapping for the logs:
[source,shell]
curl -XPUT http://localhost:9200/logstash-2015.05.18 -d '
{
  "mappings" : {
    "pin" : {
      "properties" : {
        "coordinates" : {
          "type" : "geo_point"
        }
      }
    }
  }
}
';
Because the logs data set is in three indices, one for each day in a three-day period, run the mapping again two more
times, changing the name of the index to logstash-2015.05.19 and logstash-2015.05.20.
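Spelled out, that repetition might look like the following sketch; the request body is identical to the one above, only the index name changes:

[source,shell]
curl -XPUT http://localhost:9200/logstash-2015.05.19 -d '{ "mappings" : { "pin" : { "properties" : { "coordinates" : { "type" : "geo_point" } } } } }'
curl -XPUT http://localhost:9200/logstash-2015.05.20 -d '{ "mappings" : { "pin" : { "properties" : { "coordinates" : { "type" : "geo_point" } } } } }'

A quick `curl -XGET 'localhost:9200/logstash-2015.05.19/_mapping?pretty'` afterwards shows whether the `coordinates` field picked up the `geo_point` type.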
The accounts data set doesn't require any mappings, so at this point we're ready to use the Elasticsearch
{ref}/docs-bulk.html[`bulk`] API to load the data sets with the following commands:
[source,shell]
curl -XPOST 'localhost:9200/bank/_bulk?pretty' --data-binary @accounts.json
curl -XPOST 'localhost:9200/shakespeare/_bulk?pretty' --data-binary @shakespeare.json
curl -XPOST 'localhost:9200/_bulk?pretty' --data-binary @logstash.json
curl -XPOST 'localhost:9200/_bulk?pretty' --data-binary @logs.jsonl
These commands may take some time to execute, depending on the computing resources available.
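Once they complete, one way to confirm the loads is to list the indices and their document counts with the standard `_cat/indices` endpoint; the index names below follow from the commands above:

[source,shell]
curl 'localhost:9200/_cat/indices?v'

You should see the bank, shakespeare, and three logstash-2015.05.* indices, each with a non-zero docs.count.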