
[[releasenotes]]
== Release Notes
This section summarizes the changes in the following releases:

* <<logstash-5-3-3,Logstash 5.3.3>>
* <<logstash-5-3-2,Logstash 5.3.2>>
* <<logstash-5-3-1,Logstash 5.3.1>>
* <<logstash-5-3-0,Logstash 5.3.0>>

[[logstash-5-3-3]]
=== Logstash 5.3.3 Release Notes
* Fixed an issue on Windows where Logstash was unable to locate the default log4j2.properties file ({lsissue}6352[Issue 6352]).
[float]
==== Input Plugins
*`JDBC`*:
* This plugin now automatically reconnects on connection issues (https://github.com/logstash-plugins/logstash-input-jdbc/issues/45[Issue 45]).
*`HTTP Poller`*:
* Added an option to specify top-level `user`/`password` settings that apply to all URLs by default.
* Added eager auth functionality. This means the client will send any credentials in its first request rather than waiting for a 401 challenge.
*`Log4j`*:
* This input will now reject any non-log4j log objects sent as input.
*`RabbitMQ`*:
* Updated the underlying RabbitMQ Java library to v3.0.0.
*`S3`*:
* Fixed an issue where Logstash would crash when attempting to ingest a JSON file with a `message` attribute that is not a string (https://github.com/logstash-plugins/logstash-input-s3/issues/109[Issue 109]).
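
The new top-level credentials in the `http_poller` input might look like the following sketch (the endpoint, schedule, and credentials here are illustrative, not from the release notes):

[source,ruby]
----
input {
  http_poller {
    # Applied to every URL below unless overridden per URL
    user => "poller"
    password => "secret"
    urls => {
      health => "http://localhost:9200/_cluster/health"
    }
    schedule => { cron => "* * * * * UTC" }
  }
}
----
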
==== Filter Plugins
*`Date`*:
* This plugin now ignores canceled events, which were previously processed unnecessarily.
*`Fingerprint`*:
* Improved the error messages for errors that can occur during startup.
*`Grok`*:
* Fixed an issue where a subdirectory under the patterns directory could crash Logstash at startup (https://github.com/logstash-plugins/logstash-filter-grok/issues/110[Issue 110]).
==== Output Plugins
*`S3`*:
* Fixed the `restore_from_crash` option to use the same upload options as the normal uploader (https://github.com/logstash-plugins/logstash-output-s3/issues/140[Issue 140]).
* Updated the `canned_acl` option to allow `public-read`, `public-read-write`, and `authenticated-read` as possible values.
*`RabbitMQ`*:
* Updated the underlying RabbitMQ Java library to v3.0.0.
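
For instance, one of the expanded `canned_acl` values can be set on the S3 output like this (the bucket and region are illustrative):

[source,ruby]
----
output {
  s3 {
    bucket => "my-logs"
    region => "us-east-1"
    canned_acl => "public-read"
  }
}
----
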
[[logstash-5-3-2]]
=== Logstash 5.3.2 Release Notes
[float]
==== Input Plugins
*`Elasticsearch`*:
* Fixed scrolling to use JSON bodies in the requests.
*`HTTP`*:
* Improved error logging to include more details, including the stack trace.
*`Log4j`*:
* This input will now reject any non-log4j log objects sent as input.
==== Filter Plugins
*`URL Decode`*:
* Fixed an issue where Logstash would crash when processing unicode input with this filter.
==== Output Plugins
*`Elasticsearch`*:
* Added support for customizing `sniffing_path` without having to use `absolute_sniffing_path`.
*`Kafka`*:
* Fixed a bug where Logstash would fail to start when the `SASL_SSL` and `PLAIN` (no Kerberos) options were specified.
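
A minimal sketch of customizing the sniffing path on the Elasticsearch output (the host and path are illustrative; `/_nodes/http` is the endpoint sniffing normally queries):

[source,ruby]
----
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    # Appended to any path already present in `hosts`; previously
    # this required specifying an absolute path instead
    sniffing_path => "/_nodes/http"
  }
}
----
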
[[logstash-5-3-1]]
=== Logstash 5.3.1 Release Notes
* Fixed an issue where old data on disk was not purged when using the persistent queue feature ({lsissue}6829[Issue 6829]).
* Fixed a potential data deletion issue on the deployment instance when using the `pack` command of the
offline plugin workflow ({lsissue}6862[Issue 6862]).
[[logstash-5-3-0]]
=== Logstash 5.3.0 Release Notes
* Persistent queues:
** Changed the default queue location on disk to include the pipeline's ID in the path hierarchy.
By default, the queue is now created under `<path.data>/queue/main`. This breaking change was made to
accommodate an upcoming feature where multiple, isolated pipelines could be run on the same Logstash
instance.
** Added a recovery process that runs during Logstash startup to recover data that has been written to the
persistent queue, but not yet checkpointed. This is useful in situations where the input has written data to
the queue, but Logstash crashed before writing to the checkpoint file.
** Added exclusive access to the persistent queue on disk, as defined by the `path.queue` setting. Using a file
lock guards against corruption by ensuring that only a single Logstash instance has access to write to the
queue on the same path. ({lsissue}6604[Issue 6604]).
** You can now safely reload the pipeline config when using persistent queues. Previously, reloading the
config could result in data corruption. In 5.3, the reload sequence has been changed to reliably shut down the
first pipeline before a new one is started with the same settings.
** Fixed an issue where Logstash would stop accepting new events when queue capacity was reached, even though events
were successfully acknowledged ({lsissue}6626[Issue 6626]).
* Fixed a warning message that appeared when `--config.debug` was used with `--log.level=debug` ({lsissue}6256[Issue 6256]).
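
With the new default queue location, enabling the persistent queue in `logstash.yml` needs no explicit path; the queue directory is derived from `path.data` and the pipeline ID (the override path shown in the comment is illustrative):

[source,yaml]
----
queue.type: persisted
# Optional override; otherwise the queue lives under <path.data>/queue/main
# path.queue: /var/lib/logstash/queue
----
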
[float]
==== Input Plugins
*`S3`*:
* We now include the S3 key information in the metadata (https://github.com/logstash-plugins/logstash-input-s3/issues/105[Issue 105]).
*`Unix`*:
* The `host` and `path` fields are no longer overwritten if they are already provided by `add_field` config.
==== Filter Plugins
*`KV`*:
* Breaking: The `trim` and `trimkey` options are renamed to `trim_value` and `trim_key` respectively (https://github.com/logstash-plugins/logstash-filter-kv/issues/10[Issue 10]).
* `trim_value` only removes the specified leading and trailing characters from the value. Similarly, `trim_key`
only removes the specified leading and trailing characters from the key (https://github.com/logstash-plugins/logstash-filter-kv/issues/10[Issue 10]).
* Added new options `remove_char_value` and `remove_char_key` to remove the specified characters from keys
(or values) regardless of where these characters are found (https://github.com/logstash-plugins/logstash-filter-kv/issues/10[Issue 10]).
*`Grok`*:
* Added an option to define custom patterns using `pattern_definitions` configuration.
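
The renamed KV options and the new Grok `pattern_definitions` option can be sketched together in one filter block (the field names, characters, and pattern here are illustrative):

[source,ruby]
----
filter {
  kv {
    # Formerly `trimkey` and `trim`
    trim_key => "<>"
    trim_value => "\""
    # Remove these characters anywhere in the key or value
    remove_char_key => "."
    remove_char_value => ","
  }
  grok {
    # Inline custom pattern, no patterns directory needed
    pattern_definitions => { "SESSIONID" => "[A-F0-9]{8}" }
    match => { "message" => "session=%{SESSIONID:session_id}" }
  }
}
----
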
==== Output Plugins
*`S3`*:
* Fixed the plugin to use the correct `signature_version` for the SDK v2 library (https://github.com/logstash-plugins/logstash-output-s3/issues/129[Issue 129]).
* Fixed an issue which resulted in uploading empty files to S3 when using gzip compression (https://github.com/logstash-plugins/logstash-output-s3/issues/95[Issue 95]).
*`CSV`*:
* Updated to work with the 5.0 event API and threading contracts (https://github.com/logstash-plugins/logstash-output-csv/issues/10[Issue 10]).