remove ingest-converter (#16453)

Removed the Ingest Converter tool.
kaisecheng 2024-09-16 15:57:51 +01:00 committed by GitHub
parent 1ec37b7c41
commit 4e82655cd5
106 changed files with 1 addition and 3454 deletions


@@ -21,10 +21,6 @@ analyze:
type: gradle
target: 'dependencies-report:'
path: .
- name: ingest-converter
type: gradle
target: 'ingest-converter:'
path: .
- name: logstash-core
type: gradle
target: 'logstash-core:'


@@ -1,10 +0,0 @@
@echo off
setlocal enabledelayedexpansion
cd /d "%~dp0\.."
for /f %%i in ('cd') do set RESULT=%%i
"%JAVACMD%" -cp "!RESULT!\tools\ingest-converter\build\libs\ingest-converter.jar;*" ^
org.logstash.ingest.Pipeline %*
endlocal


@@ -1,4 +0,0 @@
#!/usr/bin/env bash
java -cp "$(cd `dirname $0`/..; pwd)"'/tools/ingest-converter/build/libs/ingest-converter.jar:*' \
org.logstash.ingest.Pipeline "$@"


@@ -116,8 +116,6 @@ include::static/managing-multiline-events.asciidoc[]
include::static/glob-support.asciidoc[]
include::static/ingest-convert.asciidoc[]
include::static/field-reference.asciidoc[]
//The `field-reference.asciidoc` file (included above) contains a


@@ -134,10 +134,6 @@ For a full example, see <<use-filebeat-modules-kafka>>.
//* <<parsing-nginx>>
//* <<parsing-system>>
//
//TIP: {ls} provides an <<ingest-converter,ingest pipeline conversion tool>>
//to help you migrate ingest pipeline definitions to {ls} configs. The tool does
//not currently support all the processors that are available for ingest node, but
//it's a good starting point.
//
//[[parsing-apache2]]
//==== Apache 2 Logs


@@ -1,90 +0,0 @@
[[ingest-converter]]
=== Converting Ingest Node Pipelines

After implementing {ref}/ingest.html[ingest] pipelines to parse your data, you
might decide that you want to take advantage of the richer transformation
capabilities in Logstash. For example, you may need to use Logstash instead of
ingest pipelines if you want to:

* Ingest from more inputs. Logstash can natively ingest data from many other
sources like TCP, UDP, syslog, and relational databases.
* Use multiple outputs. Ingest node was designed to only support Elasticsearch
as an output, but you may want to use more than one output. For example, you may
want to archive your incoming data to S3 as well as indexing it in
Elasticsearch.
* Take advantage of the richer transformation capabilities in Logstash, such as
external lookups.
* Use the persistent queue feature to handle spikes when ingesting data (from
Beats and other sources).

To make it easier for you to migrate your configurations, Logstash provides an
ingest pipeline conversion tool. The conversion tool takes the ingest pipeline
definition as input and, when possible, creates the equivalent Logstash
configuration as output.

See <<ingest-converter-limitations>> for a full list of tool limitations.

[[ingest-converter-run]]
==== Running the tool

You'll find the conversion tool in the `bin` directory of your Logstash
installation. See <<dir-layout>> to find the location of `bin` on your system.

To run the conversion tool, use the following command:

[source,shell]
-----
bin/ingest-convert.sh --input INPUT_FILE_URI --output OUTPUT_FILE_URI [--append-stdio]
-----

Where:

* `INPUT_FILE_URI` is a file URI that specifies the full path to the JSON file
that defines the ingest node pipeline.
* `OUTPUT_FILE_URI` is the file URI of the Logstash DSL file that will be
generated by the tool.
* `--append-stdio` is an optional flag that adds stdin and stdout sections to
the config instead of adding the default Elasticsearch output.

This command expects a file URI, so make sure you use forward slashes and
specify the full path to the file.

For example:

[source,text]
-----
bin/ingest-convert.sh --input file:///tmp/ingest/apache.json --output file:///tmp/ingest/apache.conf
-----
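
For instance, given a minimal pipeline definition like the following (a
hypothetical example, not shipped with the tool):

[source,json]
-----
{
  "description": "Parse Apache access logs",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{COMBINEDAPACHELOG}"]
      }
    }
  ]
}
-----

the tool emits roughly the following Logstash config:

[source,text]
-----
filter {
   grok {
      match => {
         "message" => "%{COMBINEDAPACHELOG}"
      }
   }
}
output {
   elasticsearch {
      hosts => "localhost"
   }
}
-----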
[[ingest-converter-limitations]]
==== Limitations

* Painless script conversion is not supported.
* Only a subset of available processors is
<<ingest-converter-supported-processors,supported>> for conversion. For
processors that are not supported, the tool produces a warning and continues
with a best-effort conversion.

[[ingest-converter-supported-processors]]
==== Supported Processors

The following ingest node processors are currently supported for conversion by
the tool:

* Append
* Convert
* Date
* GeoIP
* Grok
* Gsub
* Json
* Lowercase
* Rename
* Set


@@ -50,7 +50,6 @@ namespace "artifact" do
"logstash-core-plugin-api/*.gemspec",
"patterns/**/*",
"tools/ingest-converter/build/libs/ingest-converter.jar",
"vendor/??*/**/*",
# To include ruby-maven's hidden ".mvn" directory, we need to
# do add the line below. This directory contains a file called


@@ -1,10 +1,9 @@
rootProject.name = "logstash"
include ':logstash-core', 'logstash-core-benchmarks', 'ingest-converter', 'benchmark-cli', 'jvm-options-parser', 'logstash-integration-tests', 'dependencies-report'
include ':logstash-core', 'logstash-core-benchmarks', 'benchmark-cli', 'jvm-options-parser', 'logstash-integration-tests', 'dependencies-report'
project(':logstash-core').projectDir = new File('./logstash-core')
project(':logstash-core-benchmarks').projectDir = new File('./logstash-core/benchmarks')
project(':logstash-integration-tests').projectDir = new File('./qa/integration')
project(':ingest-converter').projectDir = new File('./tools/ingest-converter')
project(':benchmark-cli').projectDir = new File('./tools/benchmark-cli')
project(':dependencies-report').projectDir = new File('./tools/dependencies-report')
project(':jvm-options-parser').projectDir = new File('./tools/jvm-options-parser')


@@ -1,63 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import org.yaml.snakeyaml.Yaml
// fetch version from Logstash's main versions.yml file
def versionMap = (Map) (new Yaml()).load(new File("$projectDir/../../versions.yml").text)
description = """Ingest JSON to Logstash Grok Config Converter"""
version = versionMap['logstash-core']
String jacksonDatabindVersion = versionMap['jackson-databind']
repositories {
mavenCentral()
}
buildscript {
repositories {
mavenCentral()
gradlePluginPortal()
}
dependencies {
classpath "org.yaml:snakeyaml:${snakeYamlVersion}"
classpath "com.github.johnrengelman:shadow:${shadowGradlePluginVersion}"
}
}
dependencies {
implementation 'net.sf.jopt-simple:jopt-simple:4.6'
implementation "com.fasterxml.jackson.core:jackson-databind:${jacksonDatabindVersion}"
testImplementation "junit:junit:4.13.2"
testImplementation 'commons-io:commons-io:2.16.1'
}
javadoc {
enabled = true
}
apply plugin: 'com.github.johnrengelman.shadow'
shadowJar {
archiveBaseName = 'ingest-converter'
archiveClassifier = null
archiveVersion = ''
}
assemble.dependsOn shadowJar


@@ -1,2 +0,0 @@
isDistributedArtifact=false


@@ -1,37 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest Append DSL to Logstash mutate Transpiler.
*/
public final class Append {
private Append() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_append_to_logstash");
}
}


@@ -1,37 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest Convert DSL to Logstash mutate Transpiler.
*/
public final class Convert {
private Convert() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_convert_to_logstash");
}
}


@@ -1,37 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest Date DSL to Logstash Date Transpiler.
*/
public final class Date {
private Date() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_to_logstash_date");
}
}


@@ -1,34 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
public final class GeoIp {
private GeoIp() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_to_logstash_geoip");
}
}


@@ -1,37 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest JSON DSL to Logstash Grok Transpiler.
*/
public final class Grok {
private Grok() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_to_logstash_grok");
}
}


@@ -1,34 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
public final class Gsub {
private Gsub() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_to_logstash_gsub");
}
}


@@ -1,72 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestAppend {
/**
* Converts Ingest Append JSON to LS mutate filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestAppend::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("mutate", appendHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String appendHash(Map<String, Map> processor) {
Map append_json = processor.get("append");
Object value = append_json.get("value");
Object value_contents;
if (value instanceof List) {
value_contents = IngestConverter.createArray((List) value);
} else {
value_contents = IngestConverter.quoteString((String) value);
}
Object mutate_contents = IngestConverter.createField(
IngestConverter.quoteString(IngestConverter.dotsToSquareBrackets((String) append_json.get("field"))),
(String) value_contents);
return IngestConverter.createField("add_field", IngestConverter.wrapInCurly((String) mutate_contents));
}
public static boolean has_append(Map<String, Object> processor) {
return processor.containsKey("append");
}
}
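
For illustration, a hypothetical append processor definition such as
{"processors": [{"append": {"field": "tags", "value": ["a", "b"]}}]}
maps through appendHash above to roughly this mutate filter (the default
Elasticsearch output block is appended separately):

filter {
   mutate {
      add_field => {
         "tags" => [
            "a",
            "b"
         ]
      }
   }
}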


@@ -1,66 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestConvert {
/**
* Converts Ingest Convert JSON to LS mutate filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestConvert::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("mutate", convertHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String convertHash(Map<String, Map> processor) {
Map convert_json = processor.get("convert");
Object mutate_contents = IngestConverter.createField(
IngestConverter.quoteString(IngestConverter.dotsToSquareBrackets((String) convert_json.get("field"))),
IngestConverter.quoteString((String) convert_json.get("type")));
return IngestConverter.createField("convert", IngestConverter.wrapInCurly((String) mutate_contents));
}
public static boolean has_convert(Map<String, Object> processor) {
return processor.containsKey("convert");
}
}


@@ -1,231 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
public class IngestConverter {
/**
* Translates the JSON naming pattern (`name.qualifier.sub`) into the LS pattern
* [name][qualifier][sub] for all applicable tokens in the given string.
* This function correctly identifies and omits renaming of string literals.
* @param content to replace naming pattern in
* @returns {string} with Json naming translated into grok naming
*/
public static String dotsToSquareBrackets(String content) {
final Pattern pattern = Pattern.compile("\\(\\?:%\\{.*\\|-\\)");
final Matcher matcher = pattern.matcher(content);
List<String> tokens = new ArrayList<>();
String right = content;
while (matcher.find()) {
final int start = matcher.start();
final int end = matcher.end();
final String matchContent = content.substring(start, end);
right = content.substring(end);
tokens.add(tokenDotsToSquareBrackets(content.substring(0, start)));
tokens.add(matchContent);
}
tokens.add(tokenDotsToSquareBrackets(right));
return String.join("", tokens);
}
private static String tokenDotsToSquareBrackets(String content) {
//Break out if this is not a naming pattern we convert
final String adjusted;
if (Pattern.compile("([\\w_]+\\.)+[\\w_]+").matcher(content).find()) {
adjusted = content.replaceAll("(\\w*)\\.(\\w*)", "$1][$2")
.replaceAll("\\[(\\w+)(}|$)", "[$1]$2")
.replaceAll("\\{(\\w+):(\\w+)]", "{$1:[$2]")
.replaceAll("^(\\w+)]\\[", "[$1][");
} else {
adjusted = content;
}
return adjusted;
}
public static String quoteString(String content) {
return "\"" + content.replace("\"", "\\\"") + "\"";
}
public static String wrapInCurly(String content) {
return "{\n" + content + "\n}";
}
public static String createField(String fieldName, String content) {
return fieldName + " => " + content;
}
public static String createHash(String fieldName, String content) {
return fieldName + " " + wrapInCurly(content);
}
/**
* All hash fields in LS start on a new line.
* @param fields Array of Strings of Serialized Hash Fields
* @returns {string} Joined Serialization of Hash Fields
*/
public static String joinHashFields(String... fields) {
return String.join("\n", fields);
}
/**
* Fixes indentation in LS string.
* @param content LS string to fix indentation in, that has no indentation intentionally with
* all lines starting on a token without preceding spaces.
* @return LS string indented by 3 spaces per level
*/
public static String fixIndent(String content) {
final String[] lines = content.split("\n");
int count = 0;
for (int i = 0; i < lines.length; i++) {
if (Pattern.compile("(\\{|\\[)$").matcher(lines[i]).find()) {
lines[i] = indent(lines[i], count);
++count;
} else if (Pattern.compile("(\\}|\\])$").matcher(lines[i]).find()) {
--count;
lines[i] = indent(lines[i], count);
// Only indent line if previous line ended on relevant control char.
} else if (i > 0 && Pattern.compile("(=>\\s+\".+\"|,|\\{|\\}|\\[|\\])$").matcher(lines[i - 1]).find()) {
lines[i] = indent(lines[i], count);
}
}
return String.join("\n", lines);
}
private static String indent(String content, int shifts) {
StringBuilder spacing = new StringBuilder();
for (int i = 0; i < shifts * 3; i++) {
spacing.append(" ");
}
return spacing.append(content).toString();
}
/**
* Converts Ingest/JSON style pattern array to LS pattern array, performing necessary variable
* name and quote escaping adjustments.
* @param patterns Pattern Array in JSON formatting
* @return Pattern array in LS formatting
*/
public static String createPatternArray(String... patterns) {
final String body = Arrays.stream(patterns)
.map(IngestConverter::dotsToSquareBrackets)
.map(IngestConverter::quoteString)
.collect(Collectors.joining(",\n"));
return "[\n" + body + "\n]";
}
public static String createArray(List<String> ingestArray) {
final String body = ingestArray.stream()
.map(IngestConverter::quoteString)
.collect(Collectors.joining(",\n"));
return "[\n" + body + "\n]";
}
/**
* Converts Ingest/JSON style pattern array to LS pattern array or string if the given array
* contains a single element only, performing necessary variable name and quote escaping
* adjustments.
* @param patterns Pattern Array in JSON formatting
* @return Pattern array or string in LS formatting
*/
public static String createPatternArrayOrField(String... patterns) {
return patterns.length == 1
? quoteString(dotsToSquareBrackets(patterns[0]))
: createPatternArray(patterns);
}
public static String filterHash(String contents) {
return fixIndent(createHash("filter", contents));
}
public static String filtersToFile(String... filters) {
return String.join("\n\n", filters) + "\n";
}
/**
* Does it have an on_failure field?
* @param processor Json
* @param name Name of the processor
* @return true if has on failure
*/
@SuppressWarnings("rawtypes")
public static boolean hasOnFailure(Map<String, Map> processor, String name) {
final List onFailure = (List) processor.get(name).get("on_failure");
return onFailure != null && !onFailure.isEmpty();
}
@SuppressWarnings({"rawtypes", "unchecked"})
public static List<Map<String, Map>> getOnFailure(Map<String, Map> processor, String name) {
return (List<Map<String, Map>>) processor.get(name).get("on_failure");
}
/**
* Creates an if clause with the tag name
* @param tag String tag name to find in [tags] field
* @param onFailurePipeline The on failure pipeline converted to LS to tack on in the conditional
* @return a string representing a conditional logic
*/
public static String createTagConditional(String tag, String onFailurePipeline) {
return "if " + quoteString(tag) + " in [tags] {\n" +
onFailurePipeline + "\n" +
"}";
}
public static String getElasticsearchOutput() {
return fixIndent("output {\n" +
"elasticsearch {\n" +
"hosts => \"localhost\"\n" +
"}\n" +
"}");
}
public static String getStdinInput() {
return fixIndent("input {\n" +
"stdin {\n" +
"}\n" +
"}");
}
public static String getStdoutOutput() {
return fixIndent("output {\n" +
"stdout {\n" +
"codec => \"rubydebug\"\n" +
"}\n" +
"}");
}
public static String appendIoPlugins(List<String> filtersPipeline, boolean appendStdio) {
// TODO create unique list to join all
String filtersPipelineStr = String.join("\n", filtersPipeline);
if (appendStdio) {
return String.join("\n", IngestConverter.getStdinInput(), filtersPipelineStr, IngestConverter.getStdoutOutput());
} else {
return String.join("\n", filtersPipelineStr, IngestConverter.getElasticsearchOutput());
}
}
}
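
A minimal, self-contained sketch of the dotted-field rewrite performed by
tokenDotsToSquareBrackets above (the class name and sample input are invented
for illustration; only the JDK is assumed):

public class DotsToBracketsDemo {
    public static void main(String[] args) {
        String content = "source.geo.location";
        // Same replacement chain as tokenDotsToSquareBrackets:
        String adjusted = content
            .replaceAll("(\\w*)\\.(\\w*)", "$1][$2")     // source.geo.location -> source][geo][location
            .replaceAll("\\[(\\w+)(}|$)", "[$1]$2")      // close the trailing token
            .replaceAll("\\{(\\w+):(\\w+)]", "{$1:[$2]") // repair grok captures such as %{IP:client.ip}
            .replaceAll("^(\\w+)]\\[", "[$1][");         // open the leading token
        System.out.println(adjusted);                    // prints [source][geo][location]
    }
}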


@@ -1,97 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestDate {
/**
* Converts Ingest Date JSON to LS Date filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestDate::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("date", dateHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String dateHash(Map<String, Map> processor) {
Map date_json = processor.get("date");
List<String> formats = (List<String>) date_json.get("formats");
final String firstElem = IngestConverter.dotsToSquareBrackets((String) date_json.get("field"));
List<String> match_contents = new ArrayList<>();
match_contents.add(firstElem);
for (String f : formats) {
match_contents.add(f);
}
String date_contents = IngestConverter.createField(
"match",
IngestConverter.createPatternArray(match_contents.toArray(new String[0])));
if (JsUtil.isNotEmpty((String) date_json.get("target_field"))) {
String target = IngestConverter.createField(
"target",
IngestConverter.quoteString(
IngestConverter.dotsToSquareBrackets((String) date_json.get("target_field"))
)
);
date_contents = IngestConverter.joinHashFields(date_contents, target);
}
if (JsUtil.isNotEmpty((String) date_json.get("timezone"))) {
String timezone = IngestConverter.createField(
"timezone",
IngestConverter.quoteString((String) date_json.get("timezone"))
);
date_contents = IngestConverter.joinHashFields(date_contents, timezone);
}
if (JsUtil.isNotEmpty((String) date_json.get("locale"))) {
String locale = IngestConverter.createField(
"locale",
IngestConverter.quoteString((String) date_json.get("locale"))
);
date_contents = IngestConverter.joinHashFields(date_contents, locale);
}
return date_contents;
}
public static boolean has_date(Map<String, Object> processor) {
return processor.containsKey("date");
}
}


@@ -1,85 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestGeoIp {
/**
* Converts Ingest GeoIp JSON to LS geoip filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestGeoIp::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("geoip", geoIpHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String geoIpHash(Map<String, Map> processor) {
Map geoip_data = processor.get("geoip");
final String sourceField = IngestConverter.createField(
"source",
IngestConverter.quoteString(
IngestConverter.dotsToSquareBrackets((String) geoip_data.get("field"))
)
);
final String targetField = IngestConverter.createField(
"target",
IngestConverter.quoteString(
IngestConverter.dotsToSquareBrackets((String) geoip_data.get("target_field"))
)
);
if (geoip_data.containsKey("properties")) {
String fields = IngestConverter.createField(
"fields",
IngestConverter.createPatternArray(((List<String>) geoip_data.get("properties")).toArray(new String[0])
));
return IngestConverter.joinHashFields(sourceField, targetField, fields);
} else {
return IngestConverter.joinHashFields(sourceField, targetField);
}
}
public static boolean has_geoip(Map<String, Object> processor) {
return processor.containsKey("geoip");
}
}


@@ -1,99 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestGrok {
/**
* Converts Ingest JSON to LS Grok.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestGrok::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("grok", grokHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String grokHash(Map<String, Map> processor) {
Map grok_data = processor.get("grok");
String grok_contents = createHashField("match",
IngestConverter.createField(
IngestConverter.quoteString((String) grok_data.get("field")),
IngestConverter.createPatternArrayOrField(((List<String>) grok_data.get("patterns")).toArray(new String[0]))
));
if (grok_data.containsKey("pattern_definitions")) {
grok_contents = IngestConverter.joinHashFields(
grok_contents,
createPatternDefinitionHash((Map<String, String>) grok_data.get("pattern_definitions"))
);
}
return grok_contents;
}
private static String createHashField(String name, String content) {
return IngestConverter.createField(name, IngestConverter.wrapInCurly(content));
}
private static String createPatternDefinitionHash(Map<String, String> definitions) {
List<String> content = new ArrayList<>();
for(Map.Entry<String, String> entry : definitions.entrySet()) {
content.add(IngestConverter.createField(
IngestConverter.quoteString(entry.getKey()),
IngestConverter.quoteString(entry.getValue())));
}
final String patternDefs = content.stream().map(IngestConverter::dotsToSquareBrackets)
.collect(Collectors.joining("\n"));
return createHashField(
"pattern_definitions",
patternDefs
);
}
public static boolean has_grok(Map<String, Object> processor) {
return processor.containsKey(get_name());
}
public static String get_name() {
return "grok";
}
}


@@ -1,67 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestGsub {
/**
* Converts Ingest Gsub JSON to LS mutate filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestGsub::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("mutate", gsubHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String gsubHash(Map<String, Map> processor) {
Map gsub_data = processor.get("gsub");
final String body = String.join(", ",
IngestConverter.quoteString(IngestConverter.dotsToSquareBrackets((String) gsub_data.get("field"))),
IngestConverter.quoteString((String) gsub_data.get("pattern")),
IngestConverter.quoteString((String) gsub_data.get("replacement")));
return IngestConverter.createField("gsub", "[\n" + body + "\n]");
}
public static boolean has_gsub(Map<String, Object> processor) {
return processor.containsKey("gsub");
}
}


@@ -1,79 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestJson {
/**
* Converts Ingest json processor to LS json filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestJson::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("json", jsonHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String jsonHash(Map<String, Map> processor) {
Map json_data = processor.get("json");
List<String> parts = new ArrayList();
parts.add(IngestConverter.createField("source",
IngestConverter.quoteString(
IngestConverter.dotsToSquareBrackets((String) json_data.get("field"))
)
));
if (json_data.containsKey("target_field")) {
parts.add(IngestConverter.createField(
"target",
IngestConverter.quoteString(
IngestConverter.dotsToSquareBrackets((String) json_data.get("target_field"))
)
));
}
return IngestConverter.joinHashFields(parts.toArray(new String[0]));
}
public static boolean has_json(Map<String, Object> processor) {
return processor.containsKey("json");
}
}


@@ -1,67 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestLowercase {
/**
* Converts Ingest Lowercase JSON to LS mutate filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestLowercase::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("mutate", lowercaseHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String lowercaseHash(Map<String, Map> processor) {
Map lowercase_data = processor.get("lowercase");
return IngestConverter.createField(
"lowercase",
IngestConverter.quoteString(
IngestConverter.dotsToSquareBrackets((String) lowercase_data.get("field"))
)
);
}
public static boolean has_lowercase(Map<String, Object> processor) {
return processor.containsKey("lowercase");
}
}


@@ -1,136 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestPipeline {
/**
* Converts Ingest JSON to LS.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestPipeline::mapProcessor).collect(Collectors.toList());
String logstash_pipeline = IngestConverter.filterHash(
IngestConverter.joinHashFields(filters_pipeline.toArray(new String[0])));
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(Collections.singletonList(logstash_pipeline), appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
List<String> filter_blocks = new ArrayList<>();
if (IngestGrok.has_grok(processor)) {
filter_blocks.add(IngestConverter.createHash(IngestGrok.get_name(), IngestGrok.grokHash(processor)));
if (IngestConverter.hasOnFailure(processor, IngestGrok.get_name())) {
filter_blocks.add(
handle_on_failure_pipeline(
IngestConverter.getOnFailure(processor, IngestGrok.get_name()),
"_grokparsefailure"
)
);
}
}
boolean processed = false;
if (IngestDate.has_date(processor)) {
filter_blocks.add(
IngestConverter.createHash("date", IngestDate.dateHash(processor))
);
processed = true;
}
if (IngestGeoIp.has_geoip(processor)) {
filter_blocks.add(
IngestConverter.createHash("geoip", IngestGeoIp.geoIpHash(processor))
);
processed = true;
}
if (IngestConvert.has_convert(processor)) {
filter_blocks.add(
IngestConverter.createHash("mutate", IngestConvert.convertHash(processor))
);
processed = true;
}
if (IngestGsub.has_gsub(processor)) {
filter_blocks.add(
IngestConverter.createHash("mutate", IngestGsub.gsubHash(processor))
);
processed = true;
}
if (IngestAppend.has_append(processor)) {
filter_blocks.add(
IngestConverter.createHash("mutate", IngestAppend.appendHash(processor))
);
processed = true;
}
if (IngestJson.has_json(processor)) {
filter_blocks.add(
IngestConverter.createHash("json", IngestJson.jsonHash(processor))
);
processed = true;
}
if (IngestRename.has_rename(processor)) {
filter_blocks.add(
IngestConverter.createHash("mutate", IngestRename.renameHash(processor))
);
processed = true;
}
if (IngestLowercase.has_lowercase(processor)) {
filter_blocks.add(
IngestConverter.createHash("mutate", IngestLowercase.lowercaseHash(processor))
);
processed = true;
}
if (IngestSet.has_set(processor)) {
filter_blocks.add(
IngestConverter.createHash("mutate", IngestSet.setHash(processor))
);
processed = true;
}
if (!processed) {
System.out.println("WARN Found unrecognized processor named: " + processor.keySet().iterator().next());
}
return IngestConverter.joinHashFields(filter_blocks.toArray(new String[0]));
}
@SuppressWarnings({"rawtypes", "unchecked"})
public static String handle_on_failure_pipeline(List<Map> on_failure_json, String tag_name) {
final List<String> mapped = on_failure_json.stream().map(IngestPipeline::mapProcessor).collect(Collectors.toList());
return IngestConverter.createTagConditional(tag_name,
IngestConverter.joinHashFields(mapped.toArray(new String[0]))
);
}
}
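
A hypothetical driver for this entry point (it assumes the shaded
ingest-converter.jar is on the classpath; the single-processor pipeline JSON is
an invented example):

public class PipelineConvertDemo {
    public static void main(String[] args) throws Exception {
        // Invented minimal pipeline: a single lowercase processor.
        String json = "{\"processors\":[{\"lowercase\":{\"field\":\"message\"}}]}";
        // 'true' mirrors the --append-stdio flag: emit stdin/stdout plugins
        // instead of the default Elasticsearch output.
        System.out.println(org.logstash.ingest.IngestPipeline.toLogstash(json, true));
    }
}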


@@ -1,66 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestRename {
/**
* Converts Ingest Rename JSON to LS mutate filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestRename::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("mutate", renameHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String renameHash(Map<String, Map> processor) {
Map rename_json = processor.get("rename");
final String mutateContents = IngestConverter.createField(
IngestConverter.quoteString(IngestConverter.dotsToSquareBrackets((String) rename_json.get("field"))),
IngestConverter.quoteString(IngestConverter.dotsToSquareBrackets((String) rename_json.get("target_field")))
);
return IngestConverter.createField("rename", IngestConverter.wrapInCurly(mutateContents));
}
public static boolean has_rename(Map<String, Object> processor) {
return processor.containsKey("rename");
}
}


@@ -1,80 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
public class IngestSet {
/**
* Converts Ingest Set JSON to LS mutate filter.
*/
@SuppressWarnings({"rawtypes", "unchecked"})
public static String toLogstash(String json, boolean appendStdio) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
TypeReference<HashMap<String, Object>> typeRef = new TypeReference<HashMap<String, Object>>() {};
final HashMap<String, Object> jsonDefinition = mapper.readValue(json, typeRef);
final List<Map> processors = (List<Map>) jsonDefinition.get("processors");
List<String> filters_pipeline = processors.stream().map(IngestSet::mapProcessor).collect(Collectors.toList());
return IngestConverter.filtersToFile(
IngestConverter.appendIoPlugins(filters_pipeline, appendStdio));
}
@SuppressWarnings({"rawtypes", "unchecked"})
private static String mapProcessor(Map processor) {
return IngestConverter.filterHash(IngestConverter.createHash("mutate", setHash(processor)));
}
@SuppressWarnings({"rawtypes", "unchecked"})
static String setHash(Map<String, Map> processor) {
Map set_json = processor.get("set");
final Object value = set_json.get("value");
final Object value_contents;
if (value instanceof String) {
value_contents = IngestConverter.quoteString((String) value);
} else {
value_contents = value;
}
if (set_json.containsKey("if") && set_json.get("if") != null) {
String painless_condition = (String) set_json.get("if");
if (!painless_condition.isEmpty()) {
System.out.println("WARN Found in 'set' processor an 'if' painless condition not translated: " + painless_condition);
}
}
String mutate_contents = IngestConverter.createField(
IngestConverter.quoteString(IngestConverter.dotsToSquareBrackets((String) set_json.get("field"))),
value_contents.toString());
return IngestConverter.createField("add_field", IngestConverter.wrapInCurly(mutate_contents));
}
public static boolean has_set(Map<String, Object> processor) {
return processor.containsKey("set");
}
}


@@ -1,210 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URI;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.script.Invocable;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;
import joptsimple.OptionException;
import joptsimple.OptionParser;
import joptsimple.OptionSet;
import joptsimple.OptionSpec;
final class JsUtil {
/**
* Script names used by the converter in correct load order.
*/
private static final String[] SCRIPTS = {
"shared", "date", "grok", "geoip", "gsub", "pipeline", "convert", "append", "json",
"rename", "lowercase", "set"
};
private JsUtil() {
// Utility Class
}
/**
* Sets up a {@link ScriptEngine} with all Ingest to LS DSL Converter JS scripts loaded.
* @return {@link ScriptEngine} for Ingest to LS DSL Converter
*/
public static ScriptEngine engine() {
final ScriptEngine engine =
new ScriptEngineManager().getEngineByName("nashorn");
try {
for (final String file : SCRIPTS) {
add(engine, String.format("/ingest-%s.js", file));
}
} catch (final IOException | ScriptException ex) {
throw new IllegalStateException(ex);
}
return engine;
}
/**
* Converts the given input file from an ingest pipeline definition to a LS config using the specified conversion function
* @param args CLI Arguments
* @param jsFunc JS function to call
* @throws ScriptException
* @throws NoSuchMethodException
*/
public static void convert(final String[] args, final String jsFunc)
throws ScriptException, NoSuchMethodException {
final OptionParser parser = new OptionParser();
final OptionSpec<URI> input = parser.accepts(
"input",
"Input JSON file location URI. Only supports 'file://' as URI schema."
).withRequiredArg().ofType(URI.class).required().forHelp();
final OptionSpec<URI> output = parser.accepts(
"output",
"Output Logstash DSL file location URI. Only supports 'file://' as URI schema."
).withRequiredArg().ofType(URI.class).required().forHelp();
final OptionSpec<Void> appendStdio = parser.accepts(
"append-stdio",
"Flag to append stdin and stdout as outputs instead of the default ES output."
).forHelp();
try {
final OptionSet options;
try {
options = parser.parse(args);
} catch (final OptionException ex) {
parser.printHelpOn(System.out);
throw ex;
}
switch (jsFunc) {
case "ingest_append_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestAppend.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_convert_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestConvert.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_to_logstash_date":
Files.write(
Paths.get(options.valueOf(output)),
IngestDate.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_to_logstash_geoip":
Files.write(
Paths.get(options.valueOf(output)),
IngestGeoIp.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_to_logstash_grok":
Files.write(
Paths.get(options.valueOf(output)),
IngestGrok.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_to_logstash_gsub":
Files.write(
Paths.get(options.valueOf(output)),
IngestGsub.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_json_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestJson.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_lowercase_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestLowercase.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_rename_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestRename.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_set_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestSet.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
case "ingest_pipeline_to_logstash":
Files.write(
Paths.get(options.valueOf(output)),
IngestPipeline.toLogstash(input(options.valueOf(input)), options.has(appendStdio)).getBytes(StandardCharsets.UTF_8)
);
break;
default: {
throw new IllegalArgumentException("Can't recognize " + jsFunc + " processor");
}
}
} catch (final IOException ex) {
throw new IllegalStateException(ex);
}
}
/**
* Retrieves the input Ingest JSON from a given {@link URI}.
* @param uri {@link URI} of Ingest JSON
* @return Json String
* @throws IOException On failure to load Ingest JSON
*/
private static String input(final URI uri) throws IOException {
if ("file".equals(uri.getScheme())) {
return new String(
Files.readAllBytes(Paths.get(uri)), StandardCharsets.UTF_8
);
}
throw new IllegalArgumentException("--input must be of schema file://");
}
private static void add(final ScriptEngine engine, final String file)
throws IOException, ScriptException {
try (final Reader reader =
new InputStreamReader(JsUtil.class.getResourceAsStream(file))) {
engine.eval(reader);
}
}
/**
* Null-safe check that a string is not empty.
* @param s string to check
* @return true iff s is not null and not empty
*/
static boolean isNotEmpty(String s) {
return s != null && !s.isEmpty();
}
}

@@ -1,36 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest JSON processor DSL to Logstash json Transpiler.
*/
public class Json {
private Json() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_json_to_logstash");
}
}

@@ -1,37 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest Lowercase DSL to Logstash mutate Transpiler.
*/
public final class Lowercase {
private Lowercase() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_lowercase_to_logstash");
}
}

@@ -1,37 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest Full DSL to Logstash DSL Transpiler.
*/
public final class Pipeline {
private Pipeline() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_pipeline_to_logstash");
}
}
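A hedged driver sketch for the full-pipeline entry point above; the wrapper class name and the /tmp paths are invented placeholders. Note that JsUtil accepts only file:// URIs for --input and --output, and --append-stdio is optional.

// Hypothetical one-shot conversion of a complete ingest pipeline definition.
public final class ConvertOnce {
    public static void main(final String... args) throws Exception {
        Pipeline.main(
            "--input=file:///tmp/pipeline.json",  // ingest pipeline JSON, file:// scheme only
            "--output=file:///tmp/pipeline.conf"  // generated Logstash DSL
            // add "--append-stdio" here to emit stdin/stdout plugins instead of the elasticsearch output
        );
    }
}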

@@ -1,33 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
public class Rename {
private Rename() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_rename_to_logstash");
}
}

@@ -1,36 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import javax.script.ScriptException;
/**
* Ingest Set DSL to Logstash mutate Transpiler.
*/
public class Set {
private Set() {
// Utility Wrapper for JS Script.
}
public static void main(final String... args) throws ScriptException, NoSuchMethodException {
JsUtil.convert(args, "ingest_set_to_logstash");
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class AppendTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Append", "DotsInAppendField", "AppendScalar");
}
@Test
public void convertsAppendProcessorCorrectly() throws Exception {
assertCorrectConversion(Append.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class ConvertTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Convert", "DotsInConvertField", "ConvertBoolean", "ConvertString");
}
@Test
public void convertsConvertProcessorCorrectly() throws Exception {
assertCorrectConversion(Convert.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class DateTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Date", "DateExtraFields", "DotsInDateField");
}
@Test
public void convertsDateFieldCorrectly() throws Exception {
assertCorrectConversion(Date.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class GeoIpTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("GeoIpSimple", "DotsInGeoIpField");
}
@Test
public void convertsGeoIpFieldCorrectly() throws Exception {
assertCorrectConversion(GeoIp.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class GrokTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Grok", "GrokPatternDefinition", "GrokMultiplePatternDefinitions");
}
@Test
public void convertsGrokFieldCorrectly() throws Exception {
assertCorrectConversion(Grok.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Collections;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class GsubTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Collections.singletonList("GsubSimple");
}
@Test
public void convertsGsubCorrectly() throws Exception {
assertCorrectConversion(Gsub.class);
}
}

@@ -1,102 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.regex.Pattern;
import org.apache.commons.io.IOUtils;
import org.junit.Rule;
import org.junit.rules.TemporaryFolder;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.junit.runners.Parameterized.Parameter;
/**
* Base class for ingest migration tests
*/
@RunWith(Parameterized.class)
public abstract class IngestTest {
/**
* Used to normalize line endings since static reference result files have Unix line endings.
*/
private static final Pattern CR_LF =
Pattern.compile("\\r\\n");
/**
* Used to normalize line endings since static reference result files have Unix line endings.
*/
private static final Pattern CARRIAGE_RETURN = Pattern.compile("\\r");
@Rule
public TemporaryFolder temp = new TemporaryFolder();
@Parameter
public String testCase;
protected final void assertCorrectConversion(final Class<?> clazz) throws Exception {
final URL append = getResultPath(temp);
clazz.getMethod("main", String[].class).invoke(
null,
(Object) new String[]{
String.format("--input=%s", resourcePath(String.format("ingest%s.json", testCase))),
String.format("--output=%s", append)
}
);
assertThat(
utf8File(append), is(utf8File(resourcePath(String.format("logstash%s.conf", testCase))))
);
}
/**
* Reads a file, normalizes line endings to Unix line endings and returns the whole content
* as a String.
* @param path Url to read
* @return String content of the URL
* @throws IOException On failure to read from given URL
*/
private static String utf8File(final URL path) throws IOException {
final ByteArrayOutputStream baos = new ByteArrayOutputStream();
try (final InputStream input = path.openStream()) {
IOUtils.copy(input, baos);
}
return CARRIAGE_RETURN.matcher(
CR_LF.matcher(
baos.toString(StandardCharsets.UTF_8.name())
).replaceAll("\n")
).replaceAll("\n");
}
private static URL resourcePath(final String name) {
return IngestTest.class.getResource(name);
}
private static URL getResultPath(TemporaryFolder temp) throws IOException {
return temp.newFolder().toPath().resolve("converted").toUri().toURL();
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class JsonTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Json", "DotsInJsonField", "JsonExtraFields");
}
@Test
public void convertsJsonProcessorCorrectly() throws Exception {
assertCorrectConversion(Json.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class LowercaseTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("LowercaseSimple", "LowercaseDots");
}
@Test
public void convertsLowercaseProcessorCorrectly() throws Exception {
assertCorrectConversion(Lowercase.class);
}
}

@@ -1,55 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.ArrayList;
import java.util.Collection;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class PipelineTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
final Collection<String> cases = new ArrayList<>();
cases.add("ComplexCase1");
cases.add("ComplexCase2");
cases.add("ComplexCase3");
cases.add("ComplexCase4");
GeoIpTest.data().forEach(cases::add);
DateTest.data().forEach(cases::add);
GrokTest.data().forEach(cases::add);
ConvertTest.data().forEach(cases::add);
GsubTest.data().forEach(cases::add);
AppendTest.data().forEach(cases::add);
JsonTest.data().forEach(cases::add);
RenameTest.data().forEach(cases::add);
LowercaseTest.data().forEach(cases::add);
SetTest.data().forEach(cases::add);
return cases;
}
@Test
public void convertsComplexCaseCorrectly() throws Exception {
assertCorrectConversion(Pipeline.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class RenameTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Rename", "DotsInRenameField");
}
@Test
public void convertsRenameProcessorCorrectly() throws Exception {
assertCorrectConversion(Rename.class);
}
}

@@ -1,39 +0,0 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.logstash.ingest;
import java.util.Arrays;
import org.junit.Test;
import static org.junit.runners.Parameterized.Parameters;
public final class SetTest extends IngestTest {
@Parameters
public static Iterable<String> data() {
return Arrays.asList("Set", "DotsInSetField", "SetNumber");
}
@Test
public void convertsSetProcessorCorrectly() throws Exception {
assertCorrectConversion(Set.class);
}
}

@@ -1,11 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"append": {
"field" : "client",
"value": ["host1", "host2"]
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"append": {
"field" : "foo",
"value": "bar"
}
}
]
}

@@ -1,52 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"grok": {
"field": "message",
"patterns": [
"%{COMBINEDAPACHELOG}"
]
}
},
{
"date": {
"field": "timestamp",
"target_field": "@timestamp",
"formats": [
"dd/MMM/YYYY:HH:mm:ss Z"
],
"locale": "en"
}
},
{
"geoip": {
"field": "client.ip",
"target_field": "geo"
}
},
{
"convert": {
"field" : "bytes",
"type": "integer"
}
},
{
"append": {
"field" : "response_code",
"value": ["200", "400", "503"]
}
},
{
"json": {
"field": "string_source"
}
},
{
"rename": {
"field": "foo",
"target_field": "foobar"
}
}
]
}

@@ -1,41 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"grok": {
"field": "message",
"patterns": [
"%{COMBINEDAPACHELOG}"
]
}
},
{
"date": {
"field": "timestamp",
"target_field": "@timestamp",
"formats": [
"dd/MMM/YYYY:HH:mm:ss Z"
],
"locale": "en"
}
},
{
"geoip": {
"field": "client.ip",
"target_field": "client.geo"
}
},
{
"geoip": {
"field": "source.ip",
"target_field": "source.geo"
}
},
{
"convert": {
"field" : "[client][bytes]",
"type": "integer"
}
}
]
}

@@ -1,35 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"grok": {
"field": "message",
"patterns": ["%{COMBINEDAPACHELOG}"],
"on_failure" : [
{
"set" : {
"field" : "error",
"value" : "field does not exist"
}
}
]
}
},
{
"date": {
"field": "timestamp",
"target_field": "@timestamp",
"formats": [
"dd/MMM/YYYY:HH:mm:ss Z"
],
"locale": "en"
}
},
{
"geoip": {
"field": "clientip",
"target_field": "geo"
}
}
]
}

@@ -1,41 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"grok": {
"field": "message",
"patterns": ["%{COMBINEDAPACHELOG}"],
"on_failure" : [
{
"set" : {
"field" : "error",
"value" : "field does not exist"
}
},
{
"convert": {
"field" : "client.ip",
"type": "integer"
}
}
]
}
},
{
"date": {
"field": "timestamp",
"target_field": "@timestamp",
"formats": [
"dd/MMM/YYYY:HH:mm:ss Z"
],
"locale": "en"
}
},
{
"geoip": {
"field": "clientip",
"target_field": "geo"
}
}
]
}

@@ -1,10 +0,0 @@
{
"processors": [
{
"convert": {
"field" : "bytes",
"type": "integer"
}
}
]
}

@@ -1,10 +0,0 @@
{
"processors": [
{
"convert": {
"field" : "delete",
"type": "boolean"
}
}
]
}

@@ -1,10 +0,0 @@
{
"processors": [
{
"convert": {
"field" : "blah",
"type": "string"
}
}
]
}

@@ -1,12 +0,0 @@
{
"description" : "...",
"processors" : [
{
"date" : {
"field" : "initial_date",
"target_field" : "timestamp",
"formats" : ["dd/MM/yyyy hh:mm:ss", "dd/MM/yyyy"]
}
}
]
}

@@ -1,14 +0,0 @@
{
"description" : "...",
"processors" : [
{
"date" : {
"field" : "initial_date",
"target_field" : "timestamp",
"formats" : ["dd/MM/yyyy hh:mm:ss", "dd/MM/yyyy"],
"timezone" : "Europe/Amsterdam",
"locale": "en"
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"append": {
"field" : "client.ip",
"value": ["127.0.0.1", "127.0.0.2"]
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "Pipeline to parse Apache logs",
"processors": [
{
"convert": {
"field" : "client.bytes",
"type": "float"
}
}
]
}

@@ -1,14 +0,0 @@
{
"description" : "...",
"processors" : [
{
"date" : {
"field" : "initial_date",
"target_field" : "apache.timestamp",
"formats" : ["dd/MM/yyyy hh:mm:ss", "dd/MM/yyyy"],
"timezone" : "Europe/Amsterdam",
"locale": "en"
}
}
]
}

@@ -1,13 +0,0 @@
{
"description" : "Add geoip info",
"processors" : [
{
"geoip" : {
"field" : "ip",
"target_field" : "apache.geo",
"database_file" : "GeoLite2-Country.mmdb.gz",
"properties": ["continent_name", "country_iso_code"]
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "ExampleJson",
"processors": [
{
"json": {
"field": "[foo][string_source]",
"target_field": "[bar][json_target]"
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "ExampleRename",
"processors": [
{
"rename": {
"field": "foo.bar",
"target_field": "foo.baz"
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "SetExample",
"processors": [
{
"set": {
"field": "foo.bar",
"value": "baz"
}
}
]
}

@@ -1,16 +0,0 @@
{
"description": "Add geoip info",
"processors": [
{
"geoip": {
"field": "ip",
"target_field": "geo",
"database_file": "GeoLite2-Country.mmdb.gz",
"properties": [
"continent_name",
"country_iso_code"
]
}
}
]
}

@@ -1,15 +0,0 @@
{
"description": "Pipeline for parsing apache error logs",
"processors": [
{
"grok": {
"field": "message",
"patterns": [
"%{IPORHOST:apache2.access.remote_ip} - %{DATA:apache2.access.user_name} \\[%{HTTPDATE:apache2.access.time}\\] \"%{WORD:apache2.access.method} %{DATA:apache2.access.url} HTTP/%{NUMBER:apache2.access.http_version}\" %{NUMBER:apache2.access.response_code} (?:%{NUMBER:apache2.access.body_sent.bytes}|-)( \"%{DATA:apache2.access.referrer}\")?( \"%{DATA:apache2.access.agent}\")?",
"%{IPORHOST:apache2.access.remote_ip} - %{DATA:apache2.access.user_name} \\[%{HTTPDATE:apache2.access.time}\\] \"-\" %{NUMBER:apache2.access.response_code} -"
],
"ignore_missing": true
}
}
]
}

@@ -1,19 +0,0 @@
{
"description":"Syslog",
"processors":[
{
"grok":{
"field":"message",
"patterns":[
"%{SYSLOGTIMESTAMP:system.syslog.timestamp} %{SYSLOGHOST:system.syslog.hostname} %{DATA:system.syslog.program}(?:\\[%{POSINT:system.syslog.pid}\\])?: %{GREEDYMULTILINE:system.syslog.message}",
"%{SYSLOGTIMESTAMP:system.syslog.timestamp} %{GREEDYMULTILINE:system.syslog.message}"
],
"pattern_definitions":{
"GREEDYMULTILINE":"(.|\\n)*",
"AUDIT_TYPE": "^type=%{NOTSPACE:auditd.log.record_type}"
},
"ignore_missing":true
}
}
]
}

@@ -1,18 +0,0 @@
{
"description":"Syslog",
"processors":[
{
"grok":{
"field":"message",
"patterns":[
"%{SYSLOGTIMESTAMP:system.syslog.timestamp} %{SYSLOGHOST:system.syslog.hostname} %{DATA:system.syslog.program}(?:\\[%{POSINT:system.syslog.pid}\\])?: %{GREEDYMULTILINE:system.syslog.message}",
"%{SYSLOGTIMESTAMP:system.syslog.timestamp} %{GREEDYMULTILINE:system.syslog.message}"
],
"pattern_definitions":{
"GREEDYMULTILINE":"(.|\\n)*"
},
"ignore_missing":true
}
}
]
}

@@ -1,12 +0,0 @@
{
"description": "ExampleGsub",
"processors": [
{
"gsub": {
"field": "field1",
"pattern": "\\.",
"replacement": "_"
}
}
]
}

@@ -1,10 +0,0 @@
{
"description": "ExampleJson",
"processors": [
{
"json": {
"field": "string_source"
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "ExampleJson",
"processors": [
{
"json": {
"field": "string_source",
"target_field": "json_target"
}
}
]
}

@@ -1,10 +0,0 @@
{
"description": "ExampleLowercase",
"processors": [
{
"lowercase": {
"field": "foo.bar"
}
}
]
}

@@ -1,10 +0,0 @@
{
"description": "ExampleLowercase",
"processors": [
{
"lowercase": {
"field": "foo"
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "ExampleRename",
"processors": [
{
"rename": {
"field": "foo",
"target_field": "foobar"
}
}
]
}

@@ -1,12 +0,0 @@
{
"description": "SetExample",
"processors": [
{
"set": {
"field": "field1",
"value": "bar"
}
}
]
}

@@ -1,11 +0,0 @@
{
"description": "SetExample",
"processors": [
{
"set": {
"field": "field1",
"value": 5344.4
}
}
]
}

@@ -1,15 +0,0 @@
filter {
mutate {
add_field => {
"client" => [
"host1",
"host2"
]
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
add_field => {
"foo" => "bar"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,46 +0,0 @@
filter {
grok {
match => {
"message" => "%{COMBINEDAPACHELOG}"
}
}
date {
match => [
"timestamp",
"dd/MMM/YYYY:HH:mm:ss Z"
]
target => "@timestamp"
locale => "en"
}
geoip {
source => "[client][ip]"
target => "geo"
}
mutate {
convert => {
"bytes" => "integer"
}
}
mutate {
add_field => {
"response_code" => [
"200",
"400",
"503"
]
}
}
json {
source => "string_source"
}
mutate {
rename => {
"foo" => "foobar"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,33 +0,0 @@
filter {
grok {
match => {
"message" => "%{COMBINEDAPACHELOG}"
}
}
date {
match => [
"timestamp",
"dd/MMM/YYYY:HH:mm:ss Z"
]
target => "@timestamp"
locale => "en"
}
geoip {
source => "[client][ip]"
target => "[client][geo]"
}
geoip {
source => "[source][ip]"
target => "[source][geo]"
}
mutate {
convert => {
"[client][bytes]" => "integer"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,31 +0,0 @@
filter {
grok {
match => {
"message" => "%{COMBINEDAPACHELOG}"
}
}
if "_grokparsefailure" in [tags] {
mutate {
add_field => {
"error" => "field does not exist"
}
}
}
date {
match => [
"timestamp",
"dd/MMM/YYYY:HH:mm:ss Z"
]
target => "@timestamp"
locale => "en"
}
geoip {
source => "clientip"
target => "geo"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,36 +0,0 @@
filter {
grok {
match => {
"message" => "%{COMBINEDAPACHELOG}"
}
}
if "_grokparsefailure" in [tags] {
mutate {
add_field => {
"error" => "field does not exist"
}
}
mutate {
convert => {
"[client][ip]" => "integer"
}
}
}
date {
match => [
"timestamp",
"dd/MMM/YYYY:HH:mm:ss Z"
]
target => "@timestamp"
locale => "en"
}
geoip {
source => "clientip"
target => "geo"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
convert => {
"bytes" => "integer"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
convert => {
"delete" => "boolean"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
convert => {
"blah" => "string"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,15 +0,0 @@
filter {
date {
match => [
"initial_date",
"dd/MM/yyyy hh:mm:ss",
"dd/MM/yyyy"
]
target => "timestamp"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,17 +0,0 @@
filter {
date {
match => [
"initial_date",
"dd/MM/yyyy hh:mm:ss",
"dd/MM/yyyy"
]
target => "timestamp"
timezone => "Europe/Amsterdam"
locale => "en"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,15 +0,0 @@
filter {
mutate {
add_field => {
"[client][ip]" => [
"127.0.0.1",
"127.0.0.2"
]
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
convert => {
"[client][bytes]" => "float"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,17 +0,0 @@
filter {
date {
match => [
"initial_date",
"dd/MM/yyyy hh:mm:ss",
"dd/MM/yyyy"
]
target => "[apache][timestamp]"
timezone => "Europe/Amsterdam"
locale => "en"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,15 +0,0 @@
filter {
geoip {
source => "ip"
target => "[apache][geo]"
fields => [
"continent_name",
"country_iso_code"
]
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,11 +0,0 @@
filter {
json {
source => "[foo][string_source]"
target => "[bar][json_target]"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
rename => {
"[foo][bar]" => "[foo][baz]"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,12 +0,0 @@
filter {
mutate {
add_field => {
"[foo][bar]" => "baz"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,15 +0,0 @@
filter {
geoip {
source => "ip"
target => "geo"
fields => [
"continent_name",
"country_iso_code"
]
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,15 +0,0 @@
filter {
grok {
match => {
"message" => [
"%{IPORHOST:[apache2][access][remote_ip]} - %{DATA:[apache2][access][user_name]} \[%{HTTPDATE:[apache2][access][time]}\] \"%{WORD:[apache2][access][method]} %{DATA:[apache2][access][url]} HTTP/%{NUMBER:[apache2][access][http_version]}\" %{NUMBER:[apache2][access][response_code]} (?:%{NUMBER:apache2.access.body_sent.bytes}|-)( \"%{DATA:[apache2][access][referrer]}\")?( \"%{DATA:[apache2][access][agent]}\")?",
"%{IPORHOST:[apache2][access][remote_ip]} - %{DATA:[apache2][access][user_name]} \[%{HTTPDATE:[apache2][access][time]}\] \"-\" %{NUMBER:[apache2][access][response_code]} -"
]
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,19 +0,0 @@
filter {
grok {
match => {
"message" => [
"%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}",
"%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{GREEDYMULTILINE:[system][syslog][message]}"
]
}
pattern_definitions => {
"GREEDYMULTILINE" => "(.|\n)*"
"AUDIT_TYPE" => "^type=%{NOTSPACE:[auditd][log][record_type]}"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,18 +0,0 @@
filter {
grok {
match => {
"message" => [
"%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}",
"%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{GREEDYMULTILINE:[system][syslog][message]}"
]
}
pattern_definitions => {
"GREEDYMULTILINE" => "(.|\n)*"
}
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@ -1,12 +0,0 @@
filter {
mutate {
gsub => [
"field1", "\.", "_"
]
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

@@ -1,10 +0,0 @@
filter {
json {
source => "string_source"
}
}
output {
elasticsearch {
hosts => "localhost"
}
}

Some files were not shown because too many files have changed in this diff.