[[connect-to-elasticsearch]]
== Add data

The best way to add data to the Elastic Stack is to use one of our many integrations,
which are pre-packaged assets that are available for a wide array of popular
services and platforms. With integrations, you can add monitoring for logs and
metrics, protect systems from security threats, and more.

All integrations are available in a single view on the **Integrations** page.

[role="screenshot"]
image::images/add-integration.png[Integrations page from which you can choose integrations to start collecting and analyzing data]

NOTE: When an integration is available for both
{fleet-guide}/beats-agent-comparison.html[Elastic Agent and Beats],
the *Integrations* view defaults to the
Elastic Agent integration, if it is generally available (GA).
To show a Beats integration, use the filter below the side navigation.

[float]
=== Add data with Elastic solutions

A good place to start is with one of our Elastic solutions, which
offer experiences for common use cases.

* *Elastic connectors and crawler.*
** Create searchable mirrors of your data in SharePoint Online, S3, Google Drive, and many other web services using our open code {ref}/es-connectors.html[Elastic connectors].
** Discover, extract, and index your web content into {es} using the
{enterprise-search-ref}/crawler.html[Elastic web crawler].

* *Elastic Observability.*
Get logs, metrics, traces, and uptime data into the Elastic Stack.
Integrations are available for popular services and platforms,
such as Nginx, AWS, and MongoDB,
and generic input types like log files.
Refer to {observability-guide}/observability-introduction.html[Elastic Observability]
for more information.

* *Endpoint Security.*
Protect your hosts and send logs, metrics, and endpoint security data
to Elastic Security.
Refer to {security-guide}/ingest-data.html[Ingest data to Elastic Security]
for more information.

[float]
=== Add data with programming languages

Add any data to the Elastic Stack using a programming language,
such as JavaScript, Java, Python, or Ruby.
Details for each programming language library that Elastic provides are in the
https://www.elastic.co/guide/en/elasticsearch/client/index.html[{es} Client documentation].

If you are running {kib} on our hosted {es} Service,
click *Connection details* on the *Integrations* view
to verify your {es} endpoint and Cloud ID, and create API keys for integration.
Alternatively, the *Connection details* are also accessible through the top bar help menu.

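As a rough sketch of what a client call looks like under the hood, the snippet below builds a document-indexing request against the REST API using only Python's standard library. The endpoint, API key, and index name are placeholders; in practice the official {es} client libraries linked above are the better choice.

```python
import json
from urllib import request

# Placeholder values -- copy your real endpoint and API key from
# *Connection details* in Kibana.
ES_ENDPOINT = "https://localhost:9200"
API_KEY = "base64-encoded-api-key"

def build_index_request(index: str, document: dict) -> request.Request:
    """Build a POST request that indexes one document into `index`."""
    return request.Request(
        url=f"{ES_ENDPOINT}/{index}/_doc",
        data=json.dumps(document).encode("utf-8"),
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"ApiKey {API_KEY}",
        },
    )

req = build_index_request("my-index", {"message": "hello", "status": "ok"})
# request.urlopen(req)  # uncomment to send the request to a live cluster
```

Any HTTP client can speak this protocol; the client libraries add connection pooling, retries, and typed helpers on top of the same requests.
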
[float]
=== Add sample data

Sample data sets come with sample visualizations, dashboards, and more to help you
explore {kib} before you add your own data.
In the *Integrations* view, search for *Sample Data*, and then add the type of
data you want.

[role="screenshot"]
image::images/add-sample-data.png[eCommerce, flights, and web logs sample data sets that you can explore in Kibana]

[discrete]
[[upload-data-kibana]]
=== Upload a data file

You can upload files, view their fields and metrics, and optionally import them to {es} with the Data Visualizer.
In the *Integrations* view, search for *Upload a file*, and then drop your file on the target.

You can upload different file formats for analysis with the Data Visualizer:

File formats supported up to 500 MB:

* CSV
* TSV
* NDJSON
* Log files

File formats supported up to 60 MB:

* PDF
* Microsoft Office files (Word, Excel, PowerPoint)
* Plain Text (TXT)
* Rich Text (RTF)
* Open Document Format (ODF)

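If you want to sanity-check a file against these limits before uploading, a small helper can map the file's extension to the applicable limit. This helper and its extension set are illustrative only, not part of Kibana, and log files in particular may use suffixes other than `.log`.

```python
from pathlib import Path

# Limits from the lists above, in MB.
STRUCTURED_LIMIT_MB = 500  # CSV, TSV, NDJSON, log files
OTHER_LIMIT_MB = 60        # PDF, Office, TXT, RTF, ODF

# Illustrative extension set -- adjust for your own file naming.
STRUCTURED_EXTENSIONS = {".csv", ".tsv", ".ndjson", ".log"}

def upload_limit_mb(filename: str) -> int:
    """Return the Data Visualizer upload limit that applies to `filename`."""
    suffix = Path(filename).suffix.lower()
    return STRUCTURED_LIMIT_MB if suffix in STRUCTURED_EXTENSIONS else OTHER_LIMIT_MB
```
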
NOTE: The upload feature is not intended for use as part of a repeated production
process, but rather for the initial exploration of your data.

[role="screenshot"]
image::images/add-data-fv.png[Uploading a file in {kib}]

The {stack-security-features} provide roles and privileges that control which
users can upload files. To upload a file in {kib} and import it into an {es}
index, you'll need:

* `manage_pipeline` or `manage_ingest_pipelines` cluster privilege
* `create`, `create_index`, `manage`, and `read` index privileges for the index
* `all` {kib} privileges for *Discover* and *Data Views Management*

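Collected into one place, the privileges above could be expressed as a role definition. The structure below is a sketch of a payload you might send to Kibana's roles API; the role name, index name, space, and Kibana feature IDs are assumptions to adapt to your own deployment and version.

```python
import json

# Hypothetical role granting the minimum privileges listed above for
# uploading a file and importing it into an index named "my-uploads".
file_upload_role = {
    "elasticsearch": {
        "cluster": ["manage_ingest_pipelines"],  # or "manage_pipeline"
        "indices": [
            {
                "names": ["my-uploads"],
                "privileges": ["create", "create_index", "manage", "read"],
            }
        ],
    },
    "kibana": [
        {
            # Feature IDs are assumptions -- verify them for your Kibana version.
            "feature": {"discover": ["all"], "indexPatterns": ["all"]},
            "spaces": ["default"],
        }
    ],
}

print(json.dumps(file_upload_role, indent=2))
```
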
You can manage your roles, privileges, and spaces in **{stack-manage-app}**.

[discrete]
=== What's next?

To take your investigation to a deeper level, use <<discover, **Discover**>> and quickly gain insight into your data:
search and filter your data, get information about the structure of the fields,
and analyze your findings in a visualization.