
---
mapped_pages:
  - https://www.elastic.co/guide/en/logstash/current/configuration.html
---

# Creating a Logstash Pipeline [configuration]

You can create a pipeline by stringing together plugins--inputs, outputs, filters, and sometimes codecs--in order to process data. To build a Logstash pipeline, create a config file to specify which plugins you want to use and the settings for each plugin.

A very basic pipeline might contain only an input and an output. Most pipelines include at least one filter plugin because that's where the "transform" part of the ETL (extract, transform, load) magic happens. You can reference event fields in a pipeline and use conditionals to process events when they meet certain criteria.
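As an illustrative sketch of field references and conditionals (the `grok` and `mutate` plugins and the `response` field below are examples for this sketch, not part of the pipeline built later in this page), a filter section might look like this:

```conf
filter {
  # Parse an Apache-style log line into structured fields
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Reference the parsed [response] field in a conditional
  if [response] == "404" {
    mutate { add_tag => ["not_found"] }
  }
}
```

Field references use the `[fieldname]` syntax, and conditionals (`if`, `else if`, `else`) can wrap any filter or output plugin.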

Let's step through creating a simple pipeline config on your local machine and then using it to run Logstash. Create a file named `logstash-simple.conf` and save it in the same directory as Logstash.

```conf
input { stdin { } }
output {
  elasticsearch { cloud_id => "<cloud id>" api_key => "<api key>" }
  stdout { codec => rubydebug }
}
```

Then, run {{ls}} and specify the configuration file with the `-f` flag.

```sh
bin/logstash -f logstash-simple.conf
```

Et voilà! Logstash reads the specified configuration file and outputs to both Elasticsearch and stdout. Before we move on to more complex examples, let's take a look at what's in a pipeline config file.
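At a high level, every pipeline config file follows the same three-section skeleton; the input and output sections are required, and the filter section is optional:

```conf
input {
  # input plugins go here
}
filter {
  # filter plugins (optional) go here
}
output {
  # output plugins go here
}
```

Each section can hold one or more plugins, and events flow through the sections in order: input, then filter, then output.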