diff --git a/docs/index.asciidoc b/docs/index.asciidoc
index 921d13555..34dad2ee3 100644
--- a/docs/index.asciidoc
+++ b/docs/index.asciidoc
@@ -135,6 +135,10 @@ include::static/config-management.asciidoc[]
 
 include::static/management/configuring-centralized-pipelines.asciidoc[]
 
+// EA Integrations to Logstash
+// (Planting near module content for now. Will likely move it up in info architecture.)
+include::static/ea-integrations.asciidoc[]
+
 // Working with Logstash Modules
 include::static/modules.asciidoc[]
 
diff --git a/docs/static/ea-integrations.asciidoc b/docs/static/ea-integrations.asciidoc
new file mode 100644
index 000000000..4d4340c3a
--- /dev/null
+++ b/docs/static/ea-integrations.asciidoc
@@ -0,0 +1,79 @@
+[[ea-integrations]]
+== Using {ls} with Elastic {integrations} (Beta)
+
+You can take advantage of the extensive, built-in capabilities of Elastic {integrations}--such as managing data collection, transformation, and visualization--and then use {ls} for additional data processing and output options.
+{ls} can further expand capabilities for use cases that require additional processing, or that need data delivered to multiple destinations.
+
+[discrete]
+[[integrations-value]]
+=== Elastic {integrations}: ingesting to visualizing
+
+https://docs.elastic.co/integrations[Elastic {integrations}] provide quick, end-to-end solutions for:
+
+* ingesting data from a variety of data sources,
+* getting the data into the {stack}, and
+* visualizing it with purpose-built dashboards.
+
+{integrations} are available for https://docs.elastic.co/integrations/all_integrations[popular services and platforms], such as Nginx, AWS, and MongoDB, as well as many generic input types like log files.
+Each integration includes pre-packaged assets to help reduce the time between ingest and insights.
+
+To see available integrations, go to the {kib} home page, and click **Add {integrations}**.
+You can use the query bar to search for integrations you may want to use.
+When you find an integration for your data source, the UI walks you through adding and configuring it.
+
+[discrete]
+[[integrations-and-ls]]
+=== Extend {integrations} with {ls} (Beta)
+
+{ls} can run the ingest pipeline component of your integration when you use the `logstash-filter-elastic_integration` plugin.
+
+.How to
+****
+Create a {ls} pipeline that uses the <<plugins-inputs-elastic_agent,elastic_agent input>> plugin and the https://github.com/elastic/logstash-filter-elastic_integration[logstash-filter-elastic_integration] plugin as the _first_ filter in your {ls} pipeline.
+You can add more filters for additional processing, but they must come after the `logstash-filter-elastic_integration` plugin in your configuration.
+Add an output plugin to complete your pipeline.
+****
+
+**Sample pipeline configuration**
+
+[source,ruby]
+-----
+input {
+  elastic_agent {
+    port => 5044
+  }
+}
+
+filter {
+  elastic_integration { <1>
+    cloud_id => ""
+    cloud_auth => ""
+  }
+  translate { <2>
+    source => "[http][host]"
+    target => "[@metadata][tenant]"
+    dictionary_path => "/etc/conf.d/logstash/tenants.yml"
+  }
+}
+
+output { <3>
+  if [@metadata][tenant] == "tenant01" {
+    elasticsearch {
+      cloud_id => ""
+      api_key => ""
+    }
+  } else if [@metadata][tenant] == "tenant02" {
+    elasticsearch {
+      cloud_id => ""
+      api_key => ""
+    }
+  }
+}
+-----
+
+<1> Use `filter-elastic_integration` as the first filter in your pipeline.
+<2> You can use additional filters, such as `filter-translate`, as long as they follow `filter-elastic_integration`.
+<3> Sample config to output data to multiple destinations.
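+
+The `translate` filter in the sample reads its host-to-tenant mapping from a dictionary file.
+Here is a minimal sketch of what a file like `/etc/conf.d/logstash/tenants.yml` might contain; the hostnames and tenant names are hypothetical placeholders.
+
+[source,yaml]
+-----
+# Hypothetical dictionary for the translate filter:
+# maps values of [http][host] to the tenant value used for routing in the output block.
+"host01.example.com": "tenant01"
+"host02.example.com": "tenant02"
+-----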
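+
+To try the sample, save it to a file and point {ls} at it.
+A minimal invocation, assuming the configuration is saved as `elastic-integration-pipeline.conf` (a hypothetical file name):
+
+[source,sh]
+-----
+bin/logstash -f elastic-integration-pipeline.conf
+-----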