[DOC] Logging settings can affect performance (#12246) (#12247)

Logging settings can affect performance
Backports #12246 to 7.x branch

Co-authored-by: Luca Belluccini <luca.belluccini@elastic.co>
Karen Metts 2020-09-15 16:51:29 -04:00 committed by GitHub
parent 1f345a2653
commit 2fb348e053


@@ -162,6 +162,24 @@ troubleshooting tips to share, please:
* create an issue at https://github.com/elastic/logstash/issues, or
* create a pull request with your proposed changes at https://github.com/elastic/logstash.
[float]
[[ts-pipeline-logging-level-performance]]
=== Logging level can affect performance

*Symptoms*

Simple filters, such as the `mutate` or `json` filter, can take several milliseconds per event to execute.
Inputs and outputs might be affected, too.

*Background*

The plugins running in Logstash can be quite verbose when the logging level is set to `debug` or `trace`.
Because the logging library used in Logstash is synchronous, heavy logging can affect performance.

*Solution*

Reset the logging level to `info`.
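As a minimal sketch, assuming the default `config/logstash.yml` settings file, the global level can be set back to `info` like this:

[source,yaml]
----
# config/logstash.yml -- global logging level for Logstash
log.level: info
----

If the level was only raised at runtime through the logging API, it can usually be restored without a restart by calling the `_node/logging/reset` endpoint on the API port (9600 by default).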
[float]
[[ts-kafka]]
== Common Kafka support issues and solutions