Cherrypick to 7.0: Convert instructions for Java plugins to asciidoc (#10551)

* Convert instructions for Java plugins to asciidoc

* Update java plugin docs for beta
Karen Metts 2019-03-13 12:12:59 -04:00 committed by GitHub
parent f0006a051e
commit ee5060a75d
8 changed files with 1543 additions and 1 deletion


@@ -247,12 +247,15 @@ include::static/maintainer-guide.asciidoc[]
// A space is necessary here ^^^
// Submitting a Plugin
:edit_url: https://github.com/elastic/logstash/edit/{branch}/docs/static/submitting-a-plugin.asciidoc
include::static/submitting-a-plugin.asciidoc[]
// Contributing to Logstash - JAVA EDITION
:edit_url: https://github.com/elastic/logstash/edit/{branch}/docs/static/contributing-java-plugin.asciidoc
include::static/contributing-java-plugin.asciidoc[]
// Glossary of Terms


@@ -0,0 +1,54 @@
[[contributing-java-plugin]]
== Contributing a Java Plugin
beta[]
Now you can write your own Java plugin for use with {ls}.
We have provided instructions and GitHub examples to give
you a head start.
Native support for Java plugins in {ls} consists of several components
including:
* Extensions to the Java execution engine to support running Java plugins in
Logstash pipelines
* APIs for developing Java plugins.
The APIs are in the `co.elastic.logstash.api` package.
A Java plugin might break if it references classes or specific concrete
implementations of API interfaces outside that package. The implementation of
classes outside of the API package may change at any time.
* Coming in a future release: Tooling to automate the packaging and deployment of
Java plugins in Logstash. (Currently, this process is manual.)
[float]
=== Process overview
Here are the steps:
. Choose the type of plugin you want to create: input, codec, filter, or output.
. Set up your environment.
. Code the plugin.
. Package and deploy the plugin.
. Run Logstash with your new plugin.
[float]
==== Let's get started
Here are the example repos:
* https://github.com/logstash-plugins/logstash-input-java_input_example[Input plugin example]
* https://github.com/logstash-plugins/logstash-codec-java_codec_example[Codec plugin example]
* https://github.com/logstash-plugins/logstash-filter-java_filter_example[Filter plugin example]
* https://github.com/logstash-plugins/logstash-output-java_output_example[Output plugin example]
Here are the instructions:
* <<java-input-plugin>>
* <<java-codec-plugin>>
* <<java-filter-plugin>>
* <<java-output-plugin>>
include::java-input.asciidoc[]
include::java-codec.asciidoc[]
include::java-filter.asciidoc[]
include::java-output.asciidoc[]


@@ -0,0 +1,134 @@
[float]
=== Package and deploy
Java plugins are packaged as Ruby gems for dependency management and
interoperability with Ruby plugins.
NOTE: One of the goals for Java plugin support is to eliminate the need for any
knowledge of Ruby or its toolchain for Java plugin development. Future phases of
the Java plugin project will automate the packaging of Java plugins as Ruby gems
so no direct knowledge of or interaction with Ruby will be required. In the
current phase, Java plugins must still be manually packaged as Ruby gems
and installed with the `logstash-plugin` utility.
[float]
==== Compile to JAR file
The Java plugin should be compiled and assembled into a fat jar with the
`vendor` task in the Gradle build file. This will package all Java dependencies
into a single jar and write it to the correct folder for later packaging into a
Ruby gem.
[float]
==== Manually package as Ruby gem
Several Ruby source files are required to package the jar file as a
Ruby gem. These Ruby files are used only at Logstash startup time to identify
the Java plugin and are not used during runtime event processing.
NOTE: These Ruby source files will be automatically generated in a future release.
**+logstash-{plugintype}-<{plugintype}-name>.gemspec+**
[source,txt]
[subs="attributes"]
-----
Gem::Specification.new do |s|
s.name = 'logstash-{plugintype}-java_{plugintype}_example'
s.version = PLUGIN_VERSION
s.licenses = ['Apache-2.0']
s.summary = "Example {plugintype} using Java plugin API"
s.description = ""
s.authors = ['Elasticsearch']
s.email = 'info@elastic.co'
s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
s.require_paths = ['lib', 'vendor/jar-dependencies']
# Files
s.files = Dir["lib/**/*","spec/**/*","*.gemspec","*.md","CONTRIBUTORS","Gemfile","LICENSE","NOTICE.TXT", "vendor/jar-dependencies/**/*.jar", "vendor/jar-dependencies/**/*.rb", "VERSION", "docs/**/*"]
# Special flag to let us know this is actually a logstash plugin
s.metadata = { 'logstash_plugin' => 'true', 'logstash_group' => '{plugintype}'}
# Gem dependencies
s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
s.add_runtime_dependency 'jar-dependencies'
s.add_development_dependency 'logstash-devutils'
end
-----
You can use this file with the following modifications:
* `s.name` must follow the +logstash-pass:attributes[{plugintype}]-<{plugintype}-name>+ pattern
* `s.version` must match the `project.version` specified in the `build.gradle` file.
Both versions should be set to be read from the `VERSION` file in this example.
**+lib/logstash/{plugintype}s/<{plugintype}-name>.rb+**
[source,ruby]
[subs="attributes"]
-----
# encoding: utf-8
require "logstash/{plugintype}s/base"
require "logstash/namespace"
require "logstash-{plugintype}-java_{plugintype}_example_jars"
require "java"
class LogStash::{pluginclass}::Java{plugintypecap}Example < LogStash::{pluginclass}::Base
config_name "java_{plugintype}_example"
def self.javaClass() org.logstash.javaapi.Java{plugintypecap}Example.java_class; end
end
-----
Modify these items in the file above:
* Change the name to correspond with the {plugintype} name.
* Change +require "logstash-{plugintype}-java_{plugintype}_example_jars"+ to reference the appropriate "jars" file
as described below.
* Change +class LogStash::{pluginclass}::Java{plugintypecap}Example < LogStash::{pluginclass}::Base+ to provide a unique and
descriptive Ruby class name.
* Change +config_name "java_{plugintype}_example"+ to match the name of the plugin as specified in the `name` property of
the `@LogstashPlugin` annotation.
* Change +def self.javaClass() org.logstash.javaapi.Java{plugintypecap}Example.java_class; end+ to return the
class of the Java {plugintype}.
**+lib/logstash-{plugintype}-<{plugintype}-name>_jars.rb+**
[source,txt]
[subs="attributes"]
-----
require 'jar_dependencies'
require_jar('org.logstash.javaapi', 'logstash-{plugintype}-java_{plugintype}_example', {sversion})
-----
In the file above:
* Rename the file to correspond to the {plugintype} name.
* Change the `require_jar` directive to correspond to the `group` specified in the
Gradle build file, the name of the {plugintype} JAR file, and the version as
specified in both the gemspec and Gradle build file.
After you have created the previous files and the plugin JAR file, build the gem using the
following command:
[source,shell]
[subs="attributes"]
-----
gem build logstash-{plugintype}-<{plugintype}-name>.gemspec
-----
[float]
==== Installing the Java plugin in Logstash
After you have packaged your Java plugin as a Ruby gem, you can install it in
Logstash with this command:
[source,shell]
-----
bin/logstash-plugin install --no-verify --local /path/to/javaPlugin.gem
-----
For Windows platforms: Substitute backslashes for forward slashes as appropriate in the command.


@@ -0,0 +1,52 @@
To develop a new Java {plugintype} for Logstash, you write a new Java class that
conforms to the Logstash Java {pluginclass} API, package it, and install it with the
logstash-plugin utility. We'll go through each of those steps.
[float]
=== Set up your environment
[float]
==== Copy the example repo
Start by copying the {pluginrepo}. The plugin API is currently part of the
Logstash codebase so you must have a local copy of that available. You can
obtain a copy of the Logstash codebase with the following `git` command:
[source,shell]
-----
git clone --branch <branch_name> --single-branch https://github.com/elastic/logstash.git <target_folder>
-----
The `branch_name` should correspond to the version of Logstash containing the
preferred revision of the Java plugin API.
NOTE: The beta version of the Java plugin API is available in the `6.7`
branch of the Logstash codebase.
Specify the `target_folder` for your local copy of the Logstash codebase. If you
do not specify `target_folder`, it defaults to a new folder called `logstash`
under your current folder.
[float]
==== Generate the .jar file
After you have obtained a copy of the appropriate revision of the Logstash
codebase, you need to compile it to generate the .jar file containing the Java
plugin API. From the root directory of your Logstash codebase ($LS_HOME), you
can compile it with `./gradlew assemble` (or `gradlew.bat assemble` if you're
running on Windows). This should produce the
`$LS_HOME/logstash-core/build/libs/logstash-core-x.y.z.jar` where `x`, `y`, and
`z` refer to the version of Logstash.
After you have successfully compiled Logstash, you need to tell your Java plugin
where to find the `logstash-core-x.y.z.jar` file. Create a new file named
`gradle.properties` in the root folder of your plugin project. That file should
have a single line:
[source,txt]
-----
LOGSTASH_CORE_PATH=<target_folder>/logstash-core
-----
where `target_folder` is the root folder of your local copy of the Logstash codebase.

docs/static/java-codec.asciidoc

@@ -0,0 +1,410 @@
:register_method: true
:encode_method: true
:decode_method: true
:plugintype: codec
:pluginclass: Codecs
:pluginname: example
:pluginnamecap: Example
:plugintypecap: Codec
:sversion: '0.2.0'
:pluginrepo: https://github.com/logstash-plugins/logstash-codec-java_codec_example[example codec plugin]
:blockinput: true
//:getstarted: Let's step through creating a {plugintype} plugin using the https://github.com/logstash-plugins/logstash-codec-example/[example {plugintype} plugin].
//:methodheader: Logstash codecs must implement the `register` method, and the `decode` method or the `encode` method (or both).
[[java-codec-plugin]]
=== How to write a Java codec plugin
beta[]
// Pulls in shared section: Setting Up Environment
include::include/javapluginsetup.asciidoc[]
[float]
=== Code the plugin
The example codec plugin decodes messages separated by a configurable delimiter
and encodes messages by writing their string representation separated by a
delimiter. For example, if the codec were configured with `/` as the delimiter,
the input text `event1/event2/` would be decoded into two separate events with
`message` fields of `event1` and `event2`, respectively. Note that this is only
an example codec and does not cover all the edge cases that a production-grade
codec should cover.
Let's look at the main class in the example codec:
[source,java]
-----
@LogstashPlugin(name="java_codec_example")
public class JavaCodecExample implements Codec {
public static final PluginConfigSpec<String> DELIMITER_CONFIG =
PluginConfigSpec.stringSetting("delimiter", ",");
private final String id;
private final String delimiter;
private final CharsetEncoder encoder;
private Event currentEncodedEvent;
private CharBuffer currentEncoding;
public JavaCodecExample(final Configuration config, final Context context) {
this(config.get(DELIMITER_CONFIG));
}
private JavaCodecExample(String delimiter) {
this.id = UUID.randomUUID().toString();
this.delimiter = delimiter;
this.encoder = Charset.defaultCharset().newEncoder();
}
@Override
public void decode(ByteBuffer byteBuffer, Consumer<Map<String, Object>> consumer) {
// a not-production-grade delimiter decoder
byte[] byteInput = new byte[byteBuffer.remaining()];
byteBuffer.get(byteInput);
if (byteInput.length > 0) {
String input = new String(byteInput);
String[] split = input.split(delimiter);
for (String s : split) {
Map<String, Object> map = new HashMap<>();
map.put("message", s);
consumer.accept(map);
}
}
}
@Override
public void flush(ByteBuffer byteBuffer, Consumer<Map<String, Object>> consumer) {
// if the codec maintains any internal state such as partially-decoded input, this
// method should flush that state along with any additional input supplied in
// the ByteBuffer
decode(byteBuffer, consumer); // this is a simplistic implementation
}
@Override
public boolean encode(Event event, ByteBuffer buffer) throws EncodeException {
try {
if (currentEncodedEvent != null && event != currentEncodedEvent) {
throw new EncodeException("New event supplied before encoding of previous event was completed");
} else if (currentEncodedEvent == null) {
currentEncoding = CharBuffer.wrap(event.toString() + delimiter);
}
CoderResult result = encoder.encode(currentEncoding, buffer, true);
buffer.flip();
if (result.isError()) {
result.throwException();
}
if (result.isOverflow()) {
currentEncodedEvent = event;
return false;
} else {
currentEncodedEvent = null;
return true;
}
} catch (IOException e) {
throw new IllegalStateException(e);
}
}
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
// should return a list of all configuration options for this plugin
return Collections.singletonList(DELIMITER_CONFIG);
}
@Override
public Codec cloneCodec() {
return new JavaCodecExample(this.delimiter);
}
@Override
public String getId() {
return this.id;
}
}
-----
Let's step through and examine each part of that class.
[float]
==== Class declaration
[source,java]
-----
@LogstashPlugin(name="java_codec_example")
public class JavaCodecExample implements Codec {
-----
Notes about the class declaration:
* All Java plugins must be annotated with the `@LogstashPlugin` annotation. Additionally:
** The `name` property of the annotation must be supplied and defines the name of the plugin as it will be used
in the Logstash pipeline definition. For example, this codec would be referenced in the codec section of
an appropriate input or output in the Logstash pipeline definition as `codec => java_codec_example { }`
** The value of the `name` property must match the name of the class excluding casing and underscores.
* The class must implement the `co.elastic.logstash.api.Codec` interface.
[float]
==== Plugin settings
The snippet below contains both the setting definition and the method referencing it:
[source,java]
-----
public static final PluginConfigSpec<String> DELIMITER_CONFIG =
PluginConfigSpec.stringSetting("delimiter", ",");
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Collections.singletonList(DELIMITER_CONFIG);
}
-----
The `PluginConfigSpec` class allows developers to specify the settings that a
plugin supports complete with setting name, data type, deprecation status,
required status, and default value. In this example, the `delimiter` setting
defines the delimiter on which the codec will split events. It is not a required
setting and if it is not explicitly set, its default value will be `,`.
The `configSchema` method must return a list of all settings that the plugin
supports. The Logstash execution engine will validate that all required
settings are present and that no unsupported settings are present.
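For example, a pipeline definition could override the default delimiter like this (the `/` value here is illustrative):
[source,java]
-----
input {
  java_stdin {
    codec => java_codec_example { delimiter => "/" }
  }
}
-----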
[float]
==== Constructor and initialization
[source,java]
-----
private final String id;
private final String delimiter;
private final CharsetEncoder encoder;
public JavaCodecExample(final Configuration config, final Context context) {
this(config.get(DELIMITER_CONFIG));
}
private JavaCodecExample(String delimiter) {
this.id = UUID.randomUUID().toString();
this.delimiter = delimiter;
this.encoder = Charset.defaultCharset().newEncoder();
}
-----
All Java codec plugins must have a constructor taking a `Configuration` and
`Context` argument. This is the constructor that will be used to instantiate
them at runtime. The retrieval and validation of all plugin settings should
occur in this constructor. In this example, the delimiter to be used for
delimiting events is retrieved from its setting and stored in a local variable
so that it can be used later in the `decode` and `encode` methods. The codec's
ID is initialized to a random UUID (as should be done for most codecs), and a
local `encoder` variable is initialized with the platform's default character set
for use in the `encode` method.
Any additional initialization may occur in the constructor as well. If there are
any unrecoverable errors encountered in the configuration or initialization of
the codec plugin, a descriptive exception should be thrown. The exception will
be logged and will prevent Logstash from starting.
[float]
==== Codec methods
[source,java]
-----
@Override
public void decode(ByteBuffer byteBuffer, Consumer<Map<String, Object>> consumer) {
// a not-production-grade delimiter decoder
byte[] byteInput = new byte[byteBuffer.remaining()];
byteBuffer.get(byteInput);
if (byteInput.length > 0) {
String input = new String(byteInput);
String[] split = input.split(delimiter);
for (String s : split) {
Map<String, Object> map = new HashMap<>();
map.put("message", s);
consumer.accept(map);
}
}
}
@Override
public void flush(ByteBuffer byteBuffer, Consumer<Map<String, Object>> consumer) {
decode(byteBuffer, consumer); // this is a simplistic implementation
}
@Override
public boolean encode(Event event, ByteBuffer buffer) throws EncodeException {
try {
if (currentEncodedEvent != null && event != currentEncodedEvent) {
throw new EncodeException("New event supplied before encoding of previous event was completed");
} else if (currentEncodedEvent == null) {
currentEncoding = CharBuffer.wrap(event.toString() + delimiter);
}
CoderResult result = encoder.encode(currentEncoding, buffer, true);
buffer.flip();
if (result.isError()) {
result.throwException();
}
if (result.isOverflow()) {
currentEncodedEvent = event;
return false;
} else {
currentEncodedEvent = null;
return true;
}
} catch (IOException e) {
throw new IllegalStateException(e);
}
}
-----
The `decode`, `flush`, and `encode` methods provide the core functionality of
the codec. Codecs may be used by inputs to decode a sequence or stream of bytes
into events or by outputs to encode events into a sequence of bytes.
The `decode` method decodes events from the specified `ByteBuffer` and passes
them to the provided `Consumer`. The input must provide a `ByteBuffer` that is
ready for reading with `byteBuffer.position()` indicating the next position to
read and `byteBuffer.limit()` indicating the first byte in the buffer that is
not safe to read. Codecs must ensure that `byteBuffer.position()` reflects the
last-read position before returning control to the input. The input is then
responsible for returning the buffer to write mode via either
`byteBuffer.clear()` or `byteBuffer.compact()` before resuming writes. In the
example above, the `decode` method simply splits the incoming byte stream on the
specified delimiter. A production-grade codec such as
https://github.com/elastic/logstash/blob/6.7/logstash-core/src/main/java/org/logstash/plugins/codecs/Line.java[`java-line`]
would not make the simplifying assumption that the end of the supplied byte
stream corresponded with the end of an event.
The `flush` method works in coordination with the `decode` method to decode all
remaining events from the specified `ByteBuffer` along with any internal state
that may remain after previous calls to the `decode` method. As an example of
internal state that a codec might maintain, consider an input stream of bytes
`event1/event2/event3` with a delimiter of `/`. Due to buffering or other
reasons, the input might supply a partial stream of bytes such as `event1/eve`
to the codec's `decode` method. In this case, the codec could save the beginning
three characters `eve` of the second event rather than assuming that the
supplied byte stream ends on an event boundary. If the next call to `decode`
supplied the `nt2/ev` bytes, the codec would prepend the saved `eve` bytes to
produce the full `event2` event and then save the remaining `ev` bytes for
decoding when the remainder of the bytes for that event were supplied. A call to
`flush` signals the codec that the supplied bytes represent the end of an event
stream and all remaining bytes should be decoded to events. The `flush` example
above is a simplistic implementation that does not maintain any state about
partially-supplied byte streams across calls to `decode`.
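A minimal sketch of that stateful approach is shown below. It is not the actual `java-line`
implementation; the `remainder` field is illustrative, and the delimiter is assumed to contain
no regular-expression metacharacters:
[source,java]
-----
private String remainder = "";  // partial event text carried over between decode calls (illustrative)

@Override
public void decode(ByteBuffer byteBuffer, Consumer<Map<String, Object>> consumer) {
    byte[] byteInput = new byte[byteBuffer.remaining()];
    byteBuffer.get(byteInput);
    String input = remainder + new String(byteInput);
    int lastDelimiter = input.lastIndexOf(delimiter);
    if (lastDelimiter == -1) {
        remainder = input;  // no complete event yet; save everything for the next call
        return;
    }
    remainder = input.substring(lastDelimiter + delimiter.length());  // save the trailing partial event
    for (String s : input.substring(0, lastDelimiter).split(delimiter)) {
        consumer.accept(Collections.singletonMap("message", s));
    }
}

@Override
public void flush(ByteBuffer byteBuffer, Consumer<Map<String, Object>> consumer) {
    decode(byteBuffer, consumer);
    if (!remainder.isEmpty()) {  // end of the event stream: emit whatever is left as a final event
        consumer.accept(Collections.singletonMap("message", remainder));
        remainder = "";
    }
}
-----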
The `encode` method encodes an event into a sequence of bytes and writes it into
the specified `ByteBuffer`. Under ideal circumstances, the entirety of the
event's encoding will fit into the supplied buffer. In cases where the buffer
has insufficient space to hold the event's encoding, the codec must fill the
buffer with as much of the event's encoding as possible, the `encode` method must
return `false`, and the output must call `encode` again with the same event
and a buffer that has more `buffer.remaining()` bytes. The output typically does
that by draining the partial encoding from the supplied buffer. This process
must be repeated until the event's entire encoding is written to the buffer at
which point the `encode` method will return `true`. Attempting to call this
method with a new event before the entirety of the previous event's encoding has
been written to a buffer must result in an `EncodeException`. As the conceptual
inverse of the `decode` method, the `encode` method must return the buffer in a
state from which it can be read, typically by calling `buffer.flip()` before
returning. In the example above, the `encode` method attempts to write the
event's encoding to the supplied buffer. If the buffer contains sufficient free
space, the entirety of the event is written and `true` is returned. Otherwise,
the method writes as much of the event's encoding to the buffer as possible,
returns `false`, and stores the remainder to be written to the buffer in the
next call to the `encode` method.
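From the output's side, that contract implies a drain loop roughly like the following sketch.
The `codec`, `event`, and `writeToSink` names are assumptions standing in for the output's own
codec reference, the event being written, and whatever destination the output writes to:
[source,java]
-----
// Repeatedly encode one event into a fixed-size buffer, draining the buffer
// between attempts, until the whole encoding has been written.
// EncodeException handling is omitted for brevity.
ByteBuffer buffer = ByteBuffer.allocate(1024);
boolean fullyEncoded = false;
while (!fullyEncoded) {
    fullyEncoded = codec.encode(event, buffer);  // false means the encoding did not fully fit
    writeToSink(buffer);                         // hypothetical: copy the readable buffer contents to the destination
    buffer.clear();                              // return the buffer to write mode before the next attempt
}
-----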
[float]
==== cloneCodec method
[source,java]
-----
@Override
public Codec cloneCodec() {
return new JavaCodecExample(this.delimiter);
}
-----
The `cloneCodec` method should return an identical instance of the codec with the exception of its ID. Because codecs
may be stateful, a separate instance of each codec must be supplied to each worker thread in a pipeline. For all
pipelines with more than one worker, the `cloneCodec` method is called by the Logstash execution engine to create all
codec instances beyond the first. In the example above, the codec is cloned with the same delimiter but a different ID.
[float]
==== getId method
[source,java]
-----
@Override
public String getId() {
return id;
}
-----
For codec plugins, the `getId` method should always return the id that was set at instantiation time. This is typically
a UUID.
[float]
==== Unit tests
Last, but certainly not least, unit tests are strongly encouraged.
The example codec plugin includes an
https://github.com/logstash-plugins/logstash-codec-java_codec_example/blob/master/src/test/java/org/logstash/javaapi/JavaCodecExampleTest.java[example unit
test] that you can use as a template for your own.
// Pulls in shared section about Packaging and Deploying
include::include/javapluginpkg.asciidoc[]
[float]
=== Run Logstash with the Java codec plugin
To test the plugin, start Logstash with:
[source,shell]
-----
echo "foo,bar" | bin/logstash --java-execution -e 'input { java_stdin { codec => java_codec_example } }'
-----
Note that the `--java-execution` flag to enable the Java execution engine is required as Java plugins are not supported
in the Ruby execution engine.
The expected Logstash output (excluding initialization) with the configuration above is:
[source,txt]
-----
{
"@version" => "1",
"message" => "foo",
"@timestamp" => yyyy-MM-ddThh:mm:ss.SSSZ,
"host" => "<yourHostName>"
}
{
"@version" => "1",
"message" => "bar\n",
"@timestamp" => yyyy-MM-ddThh:mm:ss.SSSZ,
"host" => "<yourHostName>"
}
-----
[float]
=== Feedback
If you have any feedback on Java plugin support in Logstash, please comment on our
https://github.com/elastic/logstash/issues/9215[main GitHub issue] or post in the
https://discuss.elastic.co/c/logstash[Logstash forum].
:pluginrepo!:
:sversion!:
:plugintypecap!:
:pluginnamecap!:
:pluginname!:
:pluginclass!:
:plugintype!:

docs/static/java-filter.asciidoc

@@ -0,0 +1,284 @@
:register_method: true
:filter_method: true
:plugintype: filter
:pluginclass: Filters
:pluginname: example
:pluginnamecap: Example
:plugintypecap: Filter
:sversion: '0.0.1'
:pluginrepo: https://github.com/logstash-plugins/logstash-filter-java_filter_example[example filter plugin]
:blockcodec: true
[[java-filter-plugin]]
=== How to write a Java filter plugin
beta[]
// Pulls in shared section: Setting Up Environment
include::include/javapluginsetup.asciidoc[]
[float]
=== Code the plugin
The example filter plugin lets you specify a field in each event whose value
will be reversed. For example, if the filter were configured to reverse the
`day_of_week` field, an event with `day_of_week: "Monday"` would be transformed
to `day_of_week: "yadnoM"`. Let's look at the main class in that example filter:
[source,java]
-----
@LogstashPlugin(name = "java_filter_example")
public class JavaFilterExample implements Filter {
public static final PluginConfigSpec<String> SOURCE_CONFIG =
PluginConfigSpec.stringSetting("source", "message");
private String id;
private String sourceField;
public JavaFilterExample(String id, Configuration config, Context context) {
this.id = id;
this.sourceField = config.get(SOURCE_CONFIG);
}
@Override
public Collection<Event> filter(Collection<Event> events, FilterMatchListener matchListener) {
for (Event e : events) {
Object f = e.getField(sourceField);
if (f instanceof String) {
e.setField(sourceField, StringUtils.reverse((String)f));
matchListener.filterMatched(e);
}
}
return events;
}
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Collections.singletonList(SOURCE_CONFIG);
}
@Override
public String getId() {
return this.id;
}
}
-----
Let's step through and examine each part of that class.
[float]
==== Class declaration
[source,java]
-----
@LogstashPlugin(name = "java_filter_example")
public class JavaFilterExample implements Filter {
-----
Notes about the class declaration:
* All Java plugins must be annotated with the `@LogstashPlugin` annotation. Additionally:
** The `name` property of the annotation must be supplied and defines the name of the plugin as it will be used
in the Logstash pipeline definition. For example, this filter would be referenced in the filter section of the
Logstash pipeline definition as `filter { java_filter_example { .... } }`
** The value of the `name` property must match the name of the class excluding casing and underscores.
* The class must implement the `co.elastic.logstash.api.Filter` interface.
[float]
==== Plugin settings
The snippet below contains both the setting definition and the method referencing it:
[source,java]
-----
public static final PluginConfigSpec<String> SOURCE_CONFIG =
PluginConfigSpec.stringSetting("source", "message");
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Collections.singletonList(SOURCE_CONFIG);
}
-----
The `PluginConfigSpec` class allows developers to specify the settings that a plugin supports complete with setting
name, data type, deprecation status, required status, and default value. In this example, the `source` setting defines
the name of the field in each event that will be reversed. It is not a required setting and if it is not explicitly
set, its default value will be `message`.
The `configSchema` method must return a list of all settings that the plugin supports. In a future phase of the
Java plugin project, the Logstash execution engine will validate that all required settings are present and that
no unsupported settings are present.
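For example, to reverse the `day_of_week` field from the earlier description instead of the default
`message` field, a pipeline definition might include:
[source,java]
-----
filter {
  java_filter_example {
    source => "day_of_week"
  }
}
-----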
[float]
==== Constructor and initialization
[source,java]
-----
private String id;
private String sourceField;
public JavaFilterExample(String id, Configuration config, Context context) {
this.id = id;
this.sourceField = config.get(SOURCE_CONFIG);
}
-----
All Java filter plugins must have a constructor taking a `String` id and a
`Configuration` and `Context` argument. This is the constructor that will be
used to instantiate them at runtime. The retrieval and validation of all plugin
settings should occur in this constructor. In this example, the name of the
field to be reversed in each event is retrieved from its setting and stored in
a local variable so that it can be used later in the `filter` method.
Any additional initialization may occur in the constructor as well. If there are
any unrecoverable errors encountered in the configuration or initialization of
the filter plugin, a descriptive exception should be thrown. The exception will
be logged and will prevent Logstash from starting.
[float]
==== Filter method
[source,java]
-----
@Override
public Collection<Event> filter(Collection<Event> events, FilterMatchListener matchListener) {
for (Event e : events) {
Object f = e.getField(sourceField);
if (f instanceof String) {
e.setField(sourceField, StringUtils.reverse((String)f));
matchListener.filterMatched(e);
}
}
return events;
}
-----
Finally, we come to the `filter` method that is invoked by the Logstash
execution engine on batches of events as they flow through the event processing
pipeline. The events to be filtered are supplied in the `events` argument and
the method should return a collection of filtered events. Filters may perform a
variety of actions on events as they flow through the pipeline including:
* Mutation -- Fields in events may be added, removed, or changed by a filter. This
is the most common scenario for filters that perform various kinds of
enrichment on events. In this scenario, the incoming `events` collection may be
returned unmodified since the events in the collection are mutated in place.
* Deletion -- Events may be removed from the event pipeline by a filter so that
subsequent filters and outputs do not receive them. In this scenario, the
events to be deleted must be removed from the collection of filtered events
before it is returned.
* Creation -- A filter may insert new events into the event pipeline that will be
seen only by subsequent filters and outputs. In this scenario, the new events
must be added to the collection of filtered events before it is returned.
* Observation -- Events may pass unchanged by a filter through the event pipeline.
This may be useful in scenarios where a filter performs external actions (e.g.,
updating an external cache) based on the events observed in the event pipeline.
In this scenario, the incoming `events` collection may be returned unmodified
since no changes were made.
In the example above, the value of the `source` field is retrieved from each
event and reversed if it is a string value. Because each event is mutated in
place, the incoming `events` collection can be returned.
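As a contrast with that mutation scenario, the deletion scenario from the list above might look like the
following sketch, which drops any event whose source field does not contain a string (the `kept` list is
illustrative and requires `java.util.ArrayList` and `java.util.List` imports):
[source,java]
-----
@Override
public Collection<Event> filter(Collection<Event> events, FilterMatchListener matchListener) {
    List<Event> kept = new ArrayList<>();
    for (Event e : events) {
        Object f = e.getField(sourceField);
        if (f instanceof String) {
            e.setField(sourceField, StringUtils.reverse((String) f));
            matchListener.filterMatched(e);
            kept.add(e);  // keep and mutate events that have a string value in the source field
        }
        // events without a string value are omitted from the returned collection and
        // are therefore never seen by later filters or outputs
    }
    return kept;
}
-----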
The `matchListener` is the mechanism by which filters indicate which events
"match". The common actions for filters such as `add_field` and `add_tag` are
applied only to events that are designated as "matching". Some filters such as
the https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html[grok
filter]
have a clear definition for what constitutes a matching event and will notify
the listener only for matching events. Other filters such as the
https://www.elastic.co/guide/en/logstash/current/plugins-filters-uuid.html[UUID
filter]
have no specific match criteria and should notify the listener for every event
filtered. In this example, the filter notifies the match listener for any event
that had a `String` value in its `source` field and was therefore able to be
reversed.
[float]
==== getId method
[source,java]
-----
@Override
public String getId() {
return id;
}
-----
For filter plugins, the `getId` method should always return the id that was provided to the plugin through its
constructor at instantiation time.
[float]
==== Unit tests
Last, but certainly not least, unit tests are strongly encouraged.
The example filter plugin includes an
https://github.com/logstash-plugins/logstash-filter-java_filter_example/blob/master/src/test/java/org/logstash/javaapi/JavaFilterExampleTest.java[example
unit test] that you can use as a template for your own.
// Pulls in shared section about Packaging and Deploying
include::include/javapluginpkg.asciidoc[]
[float]
=== Run Logstash with the Java filter plugin
The following is a minimal Logstash configuration that can be used to test that
the Java filter plugin is correctly installed and functioning.
[source,java]
-----
input {
generator { message => "Hello world!" count => 1 }
}
filter {
java_filter_example {}
}
output {
stdout { codec => rubydebug }
}
-----
Copy the above Logstash configuration to a file such as `java_filter.conf`.
Start Logstash with:
[source,shell]
-----
bin/logstash --java-execution -f /path/to/java_filter.conf
-----
Note that the `--java-execution` flag to enable the Java execution engine is
required as Java plugins are not supported in the Ruby execution engine.
The expected Logstash output (excluding initialization) with the configuration
above is:
[source,txt]
-----
{
"sequence" => 0,
"@version" => "1",
"message" => "!dlrow olleH",
"@timestamp" => yyyy-MM-ddThh:mm:ss.SSSZ,
"host" => "<yourHostName>"
}
-----
[float]
=== Feedback
If you have any feedback on Java plugin support in Logstash, please comment on our
https://github.com/elastic/logstash/issues/9215[main GitHub issue] or post in the
https://discuss.elastic.co/c/logstash[Logstash forum].
:pluginrepo!:
:sversion!:
:plugintypecap!:
:pluginnamecap!:
:pluginname!:
:pluginclass!:
:plugintype!:

docs/static/java-input.asciidoc

@@ -0,0 +1,317 @@
:register_method: true
:run_method: true
:plugintype: input
:pluginclass: Inputs
:pluginname: example
:pluginnamecap: Example
:plugintypecap: Input
:sversion: '0.0.1'
:pluginrepo: https://github.com/logstash-plugins/logstash-input-java_input_example[example input plugin]
[[java-input-plugin]]
=== How to write a Java input plugin
beta[]
// Pulls in shared section: Setting Up Environment
include::include/javapluginsetup.asciidoc[]
[float]
=== Code the plugin
The example input plugin generates a configurable number of simple events before
terminating. Let's look at the main class in the example input.
[source,java]
-----
@LogstashPlugin(name="java_input_example")
public class JavaInputExample implements Input {
public static final PluginConfigSpec<Long> EVENT_COUNT_CONFIG =
PluginConfigSpec.numSetting("count", 3);
public static final PluginConfigSpec<String> PREFIX_CONFIG =
PluginConfigSpec.stringSetting("prefix", "message");
private String id;
private long count;
private String prefix;
private final CountDownLatch done = new CountDownLatch(1);
private volatile boolean stopped;
public JavaInputExample(String id, Configuration config, Context context) {
this.id = id;
count = config.get(EVENT_COUNT_CONFIG);
prefix = config.get(PREFIX_CONFIG);
}
@Override
public void start(Consumer<Map<String, Object>> consumer) {
int eventCount = 0;
try {
while (!stopped && eventCount < count) {
eventCount++;
consumer.accept(Collections.singletonMap("message",
prefix + " " + StringUtils.center(eventCount + " of " + count, 20)));
}
} finally {
stopped = true;
done.countDown();
}
}
@Override
public void stop() {
stopped = true; // set flag to request cooperative stop of input
}
@Override
public void awaitStop() throws InterruptedException {
done.await(); // blocks until input has stopped
}
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Arrays.asList(EVENT_COUNT_CONFIG, PREFIX_CONFIG);
}
@Override
public String getId() {
return this.id;
}
}
-----
Let's step through and examine each part of that class.
[float]
==== Class declaration
[source,java]
-----
@LogstashPlugin(name="java_input_example")
public class JavaInputExample implements Input {
-----
Notes about the class declaration:
* All Java plugins must be annotated with the `@LogstashPlugin` annotation. Additionally:
** The `name` property of the annotation must be supplied and defines the name of the plugin as it will be used
in the Logstash pipeline definition. For example, this input would be referenced in the input section of the
Logstash pipeline definition as `input { java_input_example { .... } }`
** The value of the `name` property must match the name of the class excluding casing and underscores.
* The class must implement the `co.elastic.logstash.api.Input` interface.
[float]
==== Plugin settings
The snippet below contains both the setting definition and the method referencing it.
[source,java]
-----
public static final PluginConfigSpec<Long> EVENT_COUNT_CONFIG =
PluginConfigSpec.numSetting("count", 3);
public static final PluginConfigSpec<String> PREFIX_CONFIG =
PluginConfigSpec.stringSetting("prefix", "message");
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Arrays.asList(EVENT_COUNT_CONFIG, PREFIX_CONFIG);
}
-----
The `PluginConfigSpec` class allows developers to specify the settings that a
plugin supports complete with setting name, data type, deprecation status,
required status, and default value. In this example, the `count` setting defines
the number of events that will be generated and the `prefix` setting defines an
optional prefix to include in the event field. Neither setting is required; if
not explicitly set, they default to `3` and `message`, respectively.
The `configSchema` method must return a list of all settings that the plugin
supports. In a future phase of the Java plugin project, the Logstash execution
engine will validate that all required settings are present and that no
unsupported settings are present.
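For example, a pipeline definition might override both defaults (the values here are illustrative):
[source,java]
-----
input {
  java_input_example {
    count  => 5
    prefix => "howdy"
  }
}
-----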
[float]
==== Constructor and initialization
[source,java]
-----
private String id;
private long count;
private String prefix;
public JavaInputExample(String id, Configuration config, Context context) {
this.id = id;
count = config.get(EVENT_COUNT_CONFIG);
prefix = config.get(PREFIX_CONFIG);
}
-----
All Java input plugins must have a constructor taking a `String` id and
`Configuration` and `Context` argument. This is the constructor that will be
used to instantiate them at runtime. The retrieval and validation of all plugin
settings should occur in this constructor. In this example, the values of the
two plugin settings are retrieved and stored in local variables for later use in
the `start` method.
Any additional initialization may occur in the constructor as well. If there are
any unrecoverable errors encountered in the configuration or initialization of
the input plugin, a descriptive exception should be thrown. The exception will
be logged and will prevent Logstash from starting.
[float]
==== Start method
[source,java]
-----
@Override
public void start(Consumer<Map<String, Object>> consumer) {
int eventCount = 0;
try {
while (!stopped && eventCount < count) {
eventCount++;
consumer.accept(Collections.singletonMap("message",
prefix + " " + StringUtils.center(eventCount + " of " + count, 20)));
}
} finally {
stopped = true;
done.countDown();
}
}
-----
The `start` method begins the event-producing loop in an input. Inputs are flexible and may produce events through
many different mechanisms including:
* a pull mechanism, such as periodic queries of an external database
* a push mechanism, such as events sent from clients to a local network port
* a timed computation, such as a heartbeat
* any other mechanism that produces a useful stream of events. Event streams may be either finite or infinite.
If the input produces an infinite stream of events, this method should loop until a stop request is made through
the `stop` method. If the input produces a finite stream of events, this method should terminate when the last
event in the stream is produced or a stop request is made, whichever comes first.
Events should be constructed as instances of `Map<String, Object>` and pushed into the event pipeline via the
`Consumer<Map<String, Object>>.accept()` method.
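As an illustration of the timed-computation case from the list above, a heartbeat-style `start` method
might look like the following sketch, reusing the `stopped` flag and `done` latch from the example class
(the one-second interval and field value are illustrative):
[source,java]
-----
@Override
public void start(Consumer<Map<String, Object>> consumer) {
    try {
        while (!stopped) {
            consumer.accept(Collections.singletonMap("message", "heartbeat"));
            Thread.sleep(1000);  // wait one second between events
        }
    } catch (InterruptedException ex) {
        Thread.currentThread().interrupt();  // preserve interrupt status and fall through to cleanup
    } finally {
        stopped = true;
        done.countDown();  // unblock awaitStop()
    }
}
-----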
[float]
==== Stop and awaitStop methods
[source,java]
-----
private final CountDownLatch done = new CountDownLatch(1);
private volatile boolean stopped;
@Override
public void stop() {
stopped = true; // set flag to request cooperative stop of input
}
@Override
public void awaitStop() throws InterruptedException {
done.await(); // blocks until input has stopped
}
-----
The `stop` method notifies the input to stop producing events. The stop
mechanism may be implemented in any way that honors the API contract though a
`volatile boolean` flag works well for many use cases.
Inputs stop both asynchronously and cooperatively. Use the `awaitStop` method to
block until the input has completed the stop process. Note that this method
should **not** signal the input to stop as the `stop` method does. The
awaitStop mechanism may be implemented in any way that honors the API contract
though a `CountDownLatch` works well for many use cases.
[float]
==== getId method
[source,java]
-----
@Override
public String getId() {
return id;
}
-----
For input plugins, the `getId` method should always return the id that was provided to the plugin through its
constructor at instantiation time.
[float]
==== Unit tests
Last, but certainly not least, unit tests are strongly encouraged.
The example input plugin includes an
https://github.com/logstash-plugins/logstash-input-java_input_example/blob/master/src/test/java/org/logstash/javaapi/JavaInputExampleTest.java[example unit
test] that you can use as a template for your own.
// Pulls in shared section about Packaging and Deploying
include::include/javapluginpkg.asciidoc[]
[float]
=== Run Logstash with the Java input plugin
The following is a minimal Logstash configuration that can be used to test that
the Java input plugin is correctly installed and functioning.
[source,java]
-----
input {
java_input_example {}
}
output {
stdout { codec => rubydebug }
}
-----
Copy the above Logstash configuration to a file such as `java_input.conf`.
Start {ls} with:
[source,shell]
-----
bin/logstash --java-execution -f /path/to/java_input.conf
-----
Note that the `--java-execution` flag to enable the Java execution engine is
required as Java plugins are not supported in the Ruby execution engine.
The expected Logstash output (excluding initialization) with the configuration above is:
[source,txt]
-----
{
"@version" => "1",
"message" => "message 1 of 3 ",
"@timestamp" => yyyy-MM-ddThh:mm:ss.SSSZ
}
{
"@version" => "1",
"message" => "message 2 of 3 ",
"@timestamp" => yyyy-MM-ddThh:mm:ss.SSSZ
}
{
"@version" => "1",
"message" => "message 3 of 3 ",
"@timestamp" => yyyy-MM-ddThh:mm:ss.SSSZ
}
-----
[float]
=== Feedback
If you have any feedback on Java plugin support in Logstash, please comment on our
https://github.com/elastic/logstash/issues/9215[main GitHub issue] or post in the
https://discuss.elastic.co/c/logstash[Logstash forum].
:pluginrepo!:
:sversion!:
:plugintypecap!:
:pluginnamecap!:
:pluginname!:
:pluginclass!:
:plugintype!:

docs/static/java-output.asciidoc

@@ -0,0 +1,288 @@
:register_method: true
:multi_receive_method: true
:plugintype: output
:pluginclass: Outputs
:pluginname: example
:pluginnamecap: Example
:plugintypecap: Output
:sversion: '0.0.1'
:pluginrepo: https://github.com/logstash-plugins/logstash-output-java_output_example[example output plugin]
:blockfilter: true
[[java-output-plugin]]
=== How to write a Java output plugin
beta[]
// Pulls in shared section: Setting Up Environment
include::include/javapluginsetup.asciidoc[]
[float]
=== Code the plugin
The example output plugin prints events to the console using the event's
`toString` method. Let's look at the main class in the example output:
[source,java]
-----
@LogstashPlugin(name = "java_output_example")
public class JavaOutputExample implements Output {
public static final PluginConfigSpec<String> PREFIX_CONFIG =
PluginConfigSpec.stringSetting("prefix", "");
private final String id;
private String prefix;
private PrintStream printer;
private final CountDownLatch done = new CountDownLatch(1);
private volatile boolean stopped = false;
public JavaOutputExample(final String id, final Configuration configuration, final Context context) {
this(id, configuration, context, System.out);
}
JavaOutputExample(final String id, final Configuration config, final Context context, OutputStream targetStream) {
this.id = id;
prefix = config.get(PREFIX_CONFIG);
printer = new PrintStream(targetStream);
}
@Override
public void output(final Collection<Event> events) {
Iterator<Event> z = events.iterator();
while (z.hasNext() && !stopped) {
String s = prefix + z.next();
printer.println(s);
}
}
@Override
public void stop() {
stopped = true;
done.countDown();
}
@Override
public void awaitStop() throws InterruptedException {
done.await();
}
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Collections.singletonList(PREFIX_CONFIG);
}
@Override
public String getId() {
return id;
}
}
-----
Let's step through and examine each part of that class.
[float]
==== Class declaration
[source,java]
-----
@LogstashPlugin(name="java_output_example")
public class JavaOutputExample implements Output {
-----
Notes about the class declaration:
* All Java plugins must be annotated with the `@LogstashPlugin` annotation. Additionally:
** The `name` property of the annotation must be supplied and defines the name of the plugin as it will be used
in the Logstash pipeline definition. For example, this output would be referenced in the output section of the
Logstash pipeline definition as `output { java_output_example { .... } }`
** The value of the `name` property must match the name of the class excluding casing and underscores.
* The class must implement the `co.elastic.logstash.api.Output` interface.
[float]
==== Plugin settings
The snippet below contains both the setting definition and the method referencing it:
[source,java]
-----
public static final PluginConfigSpec<String> PREFIX_CONFIG =
PluginConfigSpec.stringSetting("prefix", "");
@Override
public Collection<PluginConfigSpec<?>> configSchema() {
return Collections.singletonList(PREFIX_CONFIG);
}
-----
The `PluginConfigSpec` class allows developers to specify the settings that a
plugin supports complete with setting name, data type, deprecation status,
required status, and default value. In this example, the `prefix` setting
defines an optional prefix to include in the output of the event. The setting is
not required and if it is not explicitly set, it defaults to the empty string.
The `configSchema` method must return a list of all settings that the plugin
supports. In a future phase of the Java plugin project, the Logstash execution
engine will validate that all required settings are present and that no
unsupported settings are present.
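For example, a pipeline definition might set a prefix (the value here is illustrative):
[source,java]
-----
output {
  java_output_example {
    prefix => "EVENT: "
  }
}
-----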
[float]
==== Constructor and initialization
[source,java]
-----
private final String id;
private String prefix;
private PrintStream printer;
public JavaOutputExample(final String id, final Configuration configuration, final Context context) {
this(id, configuration, context, System.out);
}
JavaOutputExample(final String id, final Configuration config, final Context context, OutputStream targetStream) {
this.id = id;
prefix = config.get(PREFIX_CONFIG);
printer = new PrintStream(targetStream);
}
-----
All Java output plugins must have a constructor taking a `String` id and a
`Configuration` and `Context` argument. This is the constructor that will be
used to instantiate them at runtime. The retrieval and validation of all plugin
settings should occur in this constructor. In this example, the value of the
`prefix` setting is retrieved and stored in a local variable for later use in
the `output` method. A second, package-private constructor is also defined; it
is useful for unit testing with an output stream other than `System.out`.
Any additional initialization may occur in the constructor as well. If there are
any unrecoverable errors encountered in the configuration or initialization of
the output plugin, a descriptive exception should be thrown. The exception will
be logged and will prevent Logstash from starting.
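As a hedged illustration of how that second constructor helps with testing, a unit test in the same
package might capture the plugin's output in memory. The `ConfigurationImpl` helper and the concrete
`org.logstash.Event` class below are assumptions drawn from the example repo's unit tests rather than
part of the `co.elastic.logstash.api` package:
[source,java]
-----
// Inside a test class in the org.logstash.javaapi package, so the
// package-private constructor is accessible.
ByteArrayOutputStream sink = new ByteArrayOutputStream();
Configuration config = new ConfigurationImpl(Collections.singletonMap("prefix", "TEST: "));  // assumed test helper
JavaOutputExample output = new JavaOutputExample("test-id", config, null, sink);

Event e = new org.logstash.Event();  // concrete Event implementation used for test purposes
e.setField("message", "Hello world!");
output.output(Collections.singletonList(e));

String printed = sink.toString();    // assert that the captured output starts with "TEST: "
-----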
[float]
==== Output method
[source,java]
-----
@Override
public void output(final Collection<Event> events) {
Iterator<Event> z = events.iterator();
while (z.hasNext() && !stopped) {
String s = prefix + z.next();
printer.println(s);
}
}
-----
Outputs may send events to local sinks such as the console or a file or to remote systems such as Elasticsearch
or other external systems. In this example, the events are printed to the local console.
[float]
==== Stop and awaitStop methods
[source,java]
-----
private final CountDownLatch done = new CountDownLatch(1);
private volatile boolean stopped;
@Override
public void stop() {
stopped = true;
done.countDown();
}
@Override
public void awaitStop() throws InterruptedException {
done.await();
}
-----
The `stop` method notifies the output to stop sending events. The stop mechanism
may be implemented in any way that honors the API contract though a `volatile
boolean` flag works well for many use cases. In this example, the `output` method
checks the stop flag on each pass through its loop and stops writing events once it is set.
Outputs stop both asynchronously and cooperatively. Use the `awaitStop` method
to block until the output has completed the stop process. Note that this method
should **not** signal the output to stop as the `stop` method does. The
awaitStop mechanism may be implemented in any way that honors the API contract
though a `CountDownLatch` works well for many use cases.
[float]
==== getId method
[source,java]
-----
@Override
public String getId() {
return id;
}
-----
For output plugins, the `getId` method should always return the id that was provided to the plugin through its
constructor at instantiation time.
[float]
==== Unit tests
Last, but certainly not least, unit tests are strongly encouraged.
The example output plugin includes an
https://github.com/logstash-plugins/logstash-output-java_output_example/blob/master/src/test/java/org/logstash/javaapi/JavaOutputExampleTest.java[example unit
test] that you can use as a template for your own.
// Pulls in shared section about Packaging and Deploying
include::include/javapluginpkg.asciidoc[]
[float]
=== Run Logstash with the Java output plugin
The following is a minimal Logstash configuration that can be used to test that
the Java output plugin is correctly installed and functioning.
[source,java]
-----
input {
generator { message => "Hello world!" count => 1 }
}
output {
java_output_example {}
}
-----
Copy the above Logstash configuration to a file such as `java_output.conf`.
Start Logstash with:
[source,shell]
-----
bin/logstash --java-execution -f /path/to/java_output.conf
-----
Note that the `--java-execution` flag to enable the Java execution engine is
required as Java plugins are not supported in the Ruby execution engine.
The expected Logstash output (excluding initialization) with the configuration
above is:
[source,txt]
-----
{"@timestamp":"yyyy-MM-ddTHH:mm:ss.SSSZ","message":"Hello world!","@version":"1","host":"<yourHostname>","sequence":0}
-----
[float]
=== Feedback
If you have any feedback on Java plugin support in Logstash, please comment on our
https://github.com/elastic/logstash/issues/9215[main GitHub issue] or post in the
https://discuss.elastic.co/c/logstash[Logstash forum].
:pluginrepo!:
:sversion!:
:plugintypecap!:
:pluginnamecap!:
:pluginname!:
:pluginclass!:
:plugintype!: