Cherrypick asciidoctor formatting changes

Karen Metts 2019-05-06 18:46:58 -04:00 committed by GitHub
parent 322b485c0a
commit 304e73c173
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
8 changed files with 22 additions and 31 deletions


@@ -47,7 +47,6 @@ filebeat.prospectors:
output.logstash:
hosts: ["localhost:5044"]
--------------------------------------------------------------------------------
<1> Absolute path to the file or files that Filebeat processes.
Save your changes.
@@ -711,7 +710,6 @@ filebeat.prospectors:
output.logstash:
hosts: ["localhost:5044"]
--------------------------------------------------------------------------------
<1> Absolute path to the file or files that Filebeat processes.
<2> Adds a field called `type` with the value `syslog` to the event.
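Both Filebeat hunks above wire the same pieces together: a prospector that reads files and an `output.logstash` section pointing at Logstash on port 5044. A minimal sketch of the assembled file, assuming a hypothetical `/var/log/*.log` glob and prospector layout (only the `localhost:5044` host comes from this diff):

```shell
# Rough sketch of the filebeat.yml assembled from the hunks above.
# The /var/log/*.log glob and prospector fields are placeholder assumptions.
cat > filebeat.yml <<'EOF'
filebeat.prospectors:
- type: log
  paths:
    - /var/log/*.log
output.logstash:
  hosts: ["localhost:5044"]
EOF

# Sanity-check that the output section targets the local Logstash instance.
grep 'localhost:5044' filebeat.yml
```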


@@ -103,7 +103,6 @@ output {
}
}
--------------------------------------------------------------------------------
<1> The path to the top-level directory containing the dead letter queue. This
directory contains a separate folder for each pipeline that writes to the dead
letter queue. To find the path to this directory, look at the `logstash.yml`
@@ -212,7 +211,6 @@ output {
}
}
--------------------------------------------------------------------------------
<1> The <<plugins-inputs-dead_letter_queue,`dead_letter_queue` input>> reads from the dead letter queue.
<2> The `mutate` filter removes the problem field called `location`.
<3> The clean event is sent to Elasticsearch, where it can be indexed because
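The callouts above describe a three-stage reprocessing pipeline: a `dead_letter_queue` input, a `mutate` filter that drops the offending field, and an Elasticsearch output. A minimal sketch, assuming a hypothetical queue path and pipeline ID (neither value comes from this diff):

```shell
# Sketch of the reprocessing pipeline described by callouts <1>-<3>.
# The queue path and pipeline_id below are placeholder assumptions.
cat > dlq-reprocess.conf <<'EOF'
input {
  dead_letter_queue {
    path => "/path/to/data/dead_letter_queue"  # top-level DLQ directory
    pipeline_id => "main"                      # folder of the writing pipeline
  }
}
filter {
  mutate {
    remove_field => ["location"]               # drop the problem field
  }
}
output {
  elasticsearch {}                             # index the cleaned event
}
EOF

grep 'remove_field' dlq-reprocess.conf
```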


@@ -110,27 +110,27 @@ Please see following annotated example and see a concrete example in https://raw
[source,markdown]
----
-## 1.0.x // <1> <2>
-- change description // <3>
-- tag: change description // <3> <4>
-- tag1,tag2: change description // <3> <5>
-- tag: Multi-line description // <3> <6>
+## 1.0.x // <1>
+- change description // <2>
+- tag: change description // <3>
+- tag1,tag2: change description // <4>
+- tag: Multi-line description // <5>
must be indented and can use
additional markdown syntax
-// <7>
-## 1.0.0 // <8>
+// <6>
+## 1.0.0 // <7>
[...]
----
-<1> Latest version is the first line of CHANGELOG.md
-<2> Each version identifier should be a level-2 header using `##`
-<3> One change description is described as a list item using a dash `-` aligned under the version identifier
-<4> One change can be tagged by a word and suffixed by `:`. +
+<1> Latest version is the first line of CHANGELOG.md.
+Each version identifier should be a level-2 header using `##`
+<2> One change description is described as a list item using a dash `-` aligned under the version identifier
+<3> One change can be tagged by a word and suffixed by `:`. +
Common tags are `bugfix`, `feature`, `doc`, `test` or `internal`.
-<5> One change can have multiple tags separated by a comma and suffixed by `:`
-<6> A multi-line change description must be properly indented
-<7> Please take care to *separate versions with an empty line*
-<8> Previous version identifier
+<4> One change can have multiple tags separated by a comma and suffixed by `:`
+<5> A multi-line change description must be properly indented
+<6> Please take care to *separate versions with an empty line*
+<7> Previous version identifier
[float]
==== Continuous Integration
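The renumbered callouts in the hunk above describe the same CHANGELOG.md layout before and after the edit. A concrete file following those conventions might look like this (the version numbers and change descriptions are invented for illustration):

```shell
# Sketch: a CHANGELOG.md that follows the annotated conventions above.
# Versions and change descriptions are made up for this example.
cat > CHANGELOG.md <<'EOF'
## 1.0.1
  - bugfix: Handle empty config files without crashing
  - doc,test: Clarify the README and add a regression test
  - feature: Multi-line descriptions
    must be indented and can use
    additional markdown syntax

## 1.0.0
  - feature: Initial release
EOF

# The latest version is the first line of the file.
head -n 1 CHANGELOG.md   # -> ## 1.0.1
```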


@@ -134,7 +134,7 @@ curl -XGET 'localhost:9600/_node/pipelines/test?pretty'
Example response:
[source,json]
--------------------------------------------------
----------
{
"pipelines" : {
"test" : {
@@ -145,7 +145,7 @@ Example response:
"config_reload_interval" : 3
}
}
------------------------------------------------
----------
If you specify an invalid pipeline ID, the request returns a 404 Not Found error.


@@ -8,7 +8,7 @@ can start adding custom code to process data with Logstash.
**Example Usage**
[source,sh]
--------------------------------------------
-------------------------------------------
bin/logstash-plugin generate --type input --name xkcd --path ~/ws/elastic/plugins
-------------------------------------------


@@ -24,7 +24,7 @@ Distributions like Debian Jessie, Ubuntu 15.10+, and many of the SUSE derivative
`systemctl` command to start and stop services. Logstash places the systemd unit files in `/etc/systemd/system` for both deb and rpm. After installing the package, you can start up Logstash with:
[source,sh]
--------------------------------------------
-------------------------------------------
sudo systemctl start logstash.service
-------------------------------------------
@@ -34,7 +34,7 @@ sudo systemctl start logstash.service
For systems that use upstart, you can start Logstash with:
[source,sh]
--------------------------------------------
-------------------------------------------
sudo initctl start logstash
-------------------------------------------
@@ -46,7 +46,7 @@ The auto-generated configuration file for upstart systems is `/etc/init/logstash
For systems that use SysV, you can start Logstash with:
[source,sh]
--------------------------------------------
-------------------------------------------
sudo /etc/init.d/logstash start
-------------------------------------------
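The three hunks above cover systemd, upstart, and SysV respectively. Which start command applies depends on the init system running as PID 1; one rough way to check is the sketch below (this heuristic is an assumption of this example, not something the Logstash docs prescribe):

```shell
# Heuristic sketch: inspect PID 1 to pick the matching start command.
# This check is an assumption of this example, not part of the Logstash docs.
init_comm="$(ps -p 1 -o comm=)"
case "$init_comm" in
  systemd)      echo "sudo systemctl start logstash.service" ;;
  init|upstart) echo "sudo initctl start logstash   # or: sudo /etc/init.d/logstash start" ;;
  *)            echo "unrecognized init system: $init_comm" ;;
esac
```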


@@ -53,10 +53,8 @@ POST _xpack/security/role/logstash_writer
]
}
---------------------------------------------------------------
<1> If you use a custom Logstash index pattern, specify that pattern
instead of the default `logstash-*` pattern.
. Create a `logstash_internal` user and assign it the `logstash_writer` role.
You can create users from the **Management > Users** UI in {kib} or through
the `user` API:
@@ -123,7 +121,6 @@ POST _xpack/security/role/logstash_reader
]
}
---------------------------------------------------------------
<1> If you use a custom Logstash index pattern, specify that pattern
instead of the default `logstash-*` pattern.
@@ -142,7 +139,6 @@ POST _xpack/security/user/logstash_user
"full_name" : "Kibana User for Logstash"
}
---------------------------------------------------------------
<1> `logstash_admin` is a built-in role that provides access to `.logstash-*`
indices for managing configurations.
@@ -250,6 +246,5 @@ You configure the user and password in the `logstash.yml` configuration file:
xpack.management.elasticsearch.username: logstash_admin_user <1>
xpack.management.elasticsearch.password: t0p.s3cr3t
----------------------------------------------------------
<1> The user you specify here must have the built-in `logstash_admin` role as
well as the `logstash_writer` role that you created earlier.


@@ -511,7 +511,7 @@ filter {
# using add_field here to add & rename values to the event root
add_field => { server_name => "%{[server][0][description]}" }
add_field => { user_firstname => "%{[user][0][firstname]}" } <5>
-add_field => { user_lastname => "%{[user][0][lastname]}" } <5>
+add_field => { user_lastname => "%{[user][0][lastname]}" }
remove_field => ["server", "user"]
jdbc_user => "logstash"
jdbc_password => "example"