[role="xpack"]
[[xpack-grokdebugger]]
== Debug grok expressions

You can build and debug grok patterns in the {kib} *Grok Debugger*
before you use them in your data processing pipelines. Grok is a pattern
matching syntax that you can use to parse arbitrary text and
structure it. Grok is good for parsing syslog, Apache and other web server
logs, MySQL logs, and, in general, any log format that is written for human
consumption.

Grok patterns are supported in {es} {ref}/runtime.html[runtime fields], the {es}
{ref}/grok-processor.html[grok ingest processor], and the {ls}
{logstash-ref}/plugins-filters-grok.html[grok filter]. For syntax, see
{ref}/grok.html[Grokking grok].

The {stack} ships with more than 120 reusable grok patterns. For a complete
list of patterns, see
https://github.com/elastic/elasticsearch/tree/master/libs/grok/src/main/resources/patterns[{es}
grok patterns] and
https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns[{ls}
grok patterns].

Because {es} and {ls} share the same grok implementation and pattern
libraries, any grok pattern that you create in the *Grok Debugger* will work
in both {es} and {ls}.
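
For example, a grok pattern that you have verified in the *Grok Debugger* can be
reused verbatim in an ingest pipeline. The following is a minimal, untested sketch
of a grok processor definition; the pipeline name `weblogs` and the `message`
field are placeholders for your own names:

[source,js]
-------------------------------------------------------------------------------
PUT _ingest/pipeline/weblogs
{
  "description": "Parse web server access lines with a pattern verified in the Grok Debugger",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}"]
      }
    }
  ]
}
-------------------------------------------------------------------------------

The same pattern string works unchanged in the `match` option of the {ls} grok filter.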

[float]
[[grokdebugger-getting-started]]
=== Get started

This example walks you through using the *Grok Debugger*. This tool
is automatically enabled in {kib}.

NOTE: If you're using {stack-security-features}, you must have the `manage_pipeline`
permission to use the Grok Debugger.
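
If you manage roles through the {es} security APIs, a role that grants this
permission might look similar to the following minimal sketch (the role name
`grok_debugger_user` is a placeholder):

[source,js]
-------------------------------------------------------------------------------
POST _security/role/grok_debugger_user
{
  "cluster": [ "manage_pipeline" ]
}
-------------------------------------------------------------------------------
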
. Open the main menu, click *Dev Tools*, then click *Grok Debugger*.
. In *Sample Data*, enter a message that is representative of the data that you
want to parse. For example:
+
[source,ruby]
-------------------------------------------------------------------------------
55.3.244.1 GET /index.html 15824 0.043
-------------------------------------------------------------------------------
. In *Grok Pattern*, enter the grok pattern that you want to apply to the data.
+
To parse the log line in this example, use:
+
[source,ruby]
-------------------------------------------------------------------------------
%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}
-------------------------------------------------------------------------------
. Click **Simulate**.
+
You'll see the simulated event that results from applying the grok
pattern.
+
[role="screenshot"]
image::dev-tools/grokdebugger/images/grok-debugger-overview.png["Grok Debugger"]
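
For the sample data and grok pattern above, the structured event looks similar to
the following sketch. Grok captures values as strings unless you add type
conversions, so the numeric-looking fields are strings here:

[source,js]
-------------------------------------------------------------------------------
{
  "client": "55.3.244.1",
  "method": "GET",
  "request": "/index.html",
  "bytes": "15824",
  "duration": "0.043"
}
-------------------------------------------------------------------------------
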
//TODO: Update LS and ingest node docs with pointers to the new grok debugger. Replace references to the Heroku app.

[float]
[[grokdebugger-custom-patterns]]
=== Test custom patterns

If the default grok pattern dictionary doesn't contain the patterns you need,
you can define, test, and debug custom patterns using the *Grok Debugger*.

Custom patterns that you enter in the *Grok Debugger* are not saved. Custom patterns
are only available for the current debugging session and have no side effects.

Follow this example to define a custom pattern.

. In *Sample Data*, enter the following sample message:
+
[source,ruby]
-------------------------------------------------------------------------------
Jan 1 06:25:43 mailserver14 postfix/cleanup[21403]: BEF25A72965: message-id=<20130101142543.5828399CCAF@mailserver14.example.com>
-------------------------------------------------------------------------------
. Enter this grok pattern:
+
[source,ruby]
-------------------------------------------------------------------------------
%{SYSLOGBASE} %{POSTFIX_QUEUEID:queue_id}: %{MSG:syslog_message}
-------------------------------------------------------------------------------
+
Notice that the grok pattern references custom patterns called `POSTFIX_QUEUEID`
and `MSG`.
. Expand **Custom Patterns** and enter pattern definitions for the custom
patterns that you want to use in the grok expression. You must specify each pattern definition
on its own line.
+
For this example, you must specify pattern definitions
for `POSTFIX_QUEUEID` and `MSG`:
+
[source,ruby]
-------------------------------------------------------------------------------
POSTFIX_QUEUEID [0-9A-F]{10,11}
MSG message-id=<%{GREEDYDATA}>
-------------------------------------------------------------------------------
. Click **Simulate**.
+
You'll see the simulated output event that results from applying
the grok pattern that contains the custom patterns:
+
[role="screenshot"]
image::dev-tools/grokdebugger/images/grok-debugger-custom-pattern.png["Debugging a custom pattern"]
+
If an error occurs, you can continue iterating over
the custom pattern until the output matches the event
that you expect.
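
For this example, the simulated event contains fields similar to the following
sketch. The exact set of fields contributed by `SYSLOGBASE` depends on your
pattern library version, so treat this only as an illustration:

[source,js]
-------------------------------------------------------------------------------
{
  "timestamp": "Jan 1 06:25:43",
  "logsource": "mailserver14",
  "program": "postfix/cleanup",
  "pid": "21403",
  "queue_id": "BEF25A72965",
  "syslog_message": "message-id=<20130101142543.5828399CCAF@mailserver14.example.com>"
}
-------------------------------------------------------------------------------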