Doc: Add docs for extending integrations with filter-elastic_integrations (#15518) (#15674)

(cherry picked from commit 6c2e123a24)

Co-authored-by: Karen Metts <35154725+karenzone@users.noreply.github.com>

@@ -135,6 +135,10 @@ include::static/config-management.asciidoc[]
include::static/management/configuring-centralized-pipelines.asciidoc[]
// EA Integrations to Logstash
// (Planting near module content for now. Will likely move it up in info architecture.)
include::static/ea-integrations.asciidoc[]
// Working with Logstash Modules
include::static/modules.asciidoc[]

docs/static/ea-integrations.asciidoc (new file)

@@ -0,0 +1,79 @@
[[ea-integrations]]
== Using {ls} with Elastic {integrations} (Beta)

You can take advantage of the extensive, built-in capabilities of Elastic {integrations}--such as managing data collection, transformation, and visualization--and then use {ls} for additional data processing and output options.
{ls} can further expand capabilities for use cases where you need additional processing, or if you need your data delivered to multiple destinations.

[discrete]
[[integrations-value]]
=== Elastic {integrations}: ingesting to visualizing

https://docs.elastic.co/integrations[Elastic {integrations}] provide quick, end-to-end solutions for:

* ingesting data from a variety of data sources,
* getting the data into the {stack}, and
* visualizing it with purpose-built dashboards.

{integrations} are available for https://docs.elastic.co/integrations/all_integrations[popular services and platforms], such as Nginx, AWS, and MongoDB, as well as many generic input types like log files.
Each integration includes pre-packaged assets to help reduce the time between ingest and insights.

To see available integrations, go to the {kib} home page, and click **Add {integrations}**.
You can use the query bar to search for integrations you may want to use.
When you find an integration for your data source, the UI walks you through adding and configuring it.

[discrete]
[[integrations-and-ls]]
=== Extend {integrations} with {ls} (Beta)

{ls} can run the ingest pipeline component of your integration when you use the `filter-elastic_integration` plugin in your {ls} pipeline.

.How to
****
Create a {ls} pipeline that uses the <<plugins-inputs-elastic_agent,elastic_agent input>> plugin and the https://github.com/elastic/logstash-filter-elastic_integration[logstash-filter-elastic_integration] plugin as the _first_ filter in your {ls} pipeline.
You can add more filters for additional processing, but they must come after the `logstash-filter-elastic_integration` plugin in your configuration.
Add an output plugin to complete your pipeline.
****

**Sample pipeline configuration**

[source,ruby]
-----
input {
  elastic_agent {
    port => 5044
  }
}

filter {
  elastic_integration { <1>
    cloud_id   => "<cloud id>"
    cloud_auth => "<cloud auth>"
  }

  translate { <2>
    source          => "[http][host]"
    target          => "[@metadata][tenant]"
    dictionary_path => "/etc/conf.d/logstash/tenants.yml"
  }
}

output { <3>
  if [@metadata][tenant] == "tenant01" {
    elasticsearch {
      cloud_id => "<cloud id>"
      api_key  => "<api key>"
    }
  } else if [@metadata][tenant] == "tenant02" {
    elasticsearch {
      cloud_id => "<cloud id>"
      api_key  => "<api key>"
    }
  }
}
-----
<1> Use `filter-elastic_integration` as the first filter in your pipeline.
<2> You can use additional filters, as long as they come after `filter-elastic_integration`.
<3> Sample config to output data to multiple destinations.
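
The sample above authenticates to {ecloud} with `cloud_id` and `cloud_auth`.
As a minimal sketch, assuming your version of the plugin also supports the `hosts` and `api_key` settings, the same filter can point at a self-managed {es} cluster instead:

[source,ruby]
-----
filter {
  elastic_integration {
    # Hypothetical placeholder values; substitute your own cluster URL
    # and API key, and check the plugin documentation for the exact
    # settings your plugin version supports.
    hosts   => ["https://es01.example.com:9200"]
    api_key => "<api key>"
  }
}
-----

The `elastic_integration` filter needs a connection to {es} because it retrieves and runs the integration's ingest pipeline, so point it at the cluster where your {integrations} assets are installed.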