logstash-core & logstash-core-event extraction to support logstash-core-event-java impl, relates to #4191

fixed timezone issue

extracted logstash-core and reorganized specs

extracted logstash-core-event

extract java Event into logstash-core-event-java in a proper gem

remove obsolete jruby_event bootstrapping

fix require path

add java code bootstrap

use logstash-core-event/logstash-core-event.rb

remove obsolete files

basic instructions

LogStash::Json needs to be initialized from event

update jruby and gradle versions

update compile:logstash-core-event-java rake task

WIP tasks refactor

fix gem.files

skip test if class is not defined

fix gem related tasks for new structure

add gem spec dirs in core tests

bootstrap java implementation when requiring timestamp

new Cloner class and Event clone impl

fix array fields assignments, see #4140

don't rely on json implementation ordering

fix skipped last interpolation char

remove implementation specific unnecessary check

also require ruby classes

define error class in ruby

raise exception on invalid format

remove implementation specific tests and extract and put logger related test in pending

missing bits for having all core timestamp specs pass

run all core specs

remove leftover

comment regex

missing encoding header

revert to logstash-core-event by default

finished proper gemification

useless require

dynamically pick specs depending on logstash-core-event-* implementation

logstash root package version

missing file for proper gemification

do not build java event by default

always check for root logstash lib dir

fix concurrent-ruby version conflict

fix rebase conflict

re-enable specs

use vars instead of constants

move non-core code into bootstrap

document version files

move version file

remove useless code

use version in logstash-core

fix gem files list

put back concurrent-ruby version constraint as in master

add dependency on logstash-core-event

remove dependency on logstash-core to avoid circular dependency

fix rebase conflict

remove circular dependency

fix specs

update README
This commit is contained in:
Colin Surprenant 2015-10-21 17:34:10 -04:00
parent e28f188e12
commit d74d41cb30
153 changed files with 1041 additions and 508 deletions


@ -2,7 +2,9 @@
# If you modify this file manually all comments and formatting will be lost. # If you modify this file manually all comments and formatting will be lost.
source "https://rubygems.org" source "https://rubygems.org"
gem "logstash-core", "3.0.0.dev", :path => "." gem "logstash-core", "3.0.0.dev", :path => "./logstash-core"
gem "logstash-core-event", "3.0.0.dev", :path => "./logstash-core-event"
# gem "logstash-core-event-java", "3.0.0.dev", :path => "./logstash-core-event-java"
gem "file-dependencies", "0.1.6" gem "file-dependencies", "0.1.6"
gem "ci_reporter_rspec", "1.0.0", :group => :development gem "ci_reporter_rspec", "1.0.0", :group => :development
gem "simplecov", :group => :development gem "simplecov", :group => :development


@ -1,14 +1,14 @@
PATH PATH
remote: . remote: ./logstash-core
specs: specs:
logstash-core (3.0.0.dev-java) logstash-core (3.0.0.dev-java)
cabin (~> 0.7.0) cabin (~> 0.7.0)
clamp (~> 0.6.5) clamp (~> 0.6.5)
concurrent-ruby (~> 0.9.1) concurrent-ruby (= 0.9.1)
filesize (= 0.0.4) filesize (= 0.0.4)
gems (~> 0.8.3) gems (~> 0.8.3)
i18n (= 0.6.9) i18n (= 0.6.9)
jrjackson (~> 0.3.5) jrjackson (~> 0.3.6)
jruby-openssl (>= 0.9.11) jruby-openssl (>= 0.9.11)
minitar (~> 0.5.4) minitar (~> 0.5.4)
pry (~> 0.10.1) pry (~> 0.10.1)
@ -16,17 +16,23 @@ PATH
thread_safe (~> 0.3.5) thread_safe (~> 0.3.5)
treetop (< 1.5.0) treetop (< 1.5.0)
PATH
remote: ./logstash-core-event
specs:
logstash-core-event (3.0.0.dev-java)
logstash-core (>= 2.0.0.beta2, < 3.0.0)
GEM GEM
remote: https://rubygems.org/ remote: https://rubygems.org/
specs: specs:
addressable (2.3.8) addressable (2.3.8)
arr-pm (0.0.10) arr-pm (0.0.10)
cabin (> 0) cabin (> 0)
backports (3.6.6) backports (3.6.7)
benchmark-ips (2.3.0) benchmark-ips (2.3.0)
builder (3.2.2) builder (3.2.2)
cabin (0.7.1) cabin (0.7.2)
childprocess (0.5.6) childprocess (0.5.7)
ffi (~> 1.0, >= 1.0.11) ffi (~> 1.0, >= 1.0.11)
ci_reporter (2.0.0) ci_reporter (2.0.0)
builder (>= 2.1.2) builder (>= 2.1.2)
@ -67,7 +73,7 @@ GEM
domain_name (~> 0.5) domain_name (~> 0.5)
i18n (0.6.9) i18n (0.6.9)
insist (1.0.0) insist (1.0.0)
jrjackson (0.3.5) jrjackson (0.3.6)
jruby-openssl (0.9.12-java) jruby-openssl (0.9.12-java)
json (1.8.3-java) json (1.8.3-java)
kramdown (1.9.0) kramdown (1.9.0)
@ -84,11 +90,11 @@ GEM
mime-types (2.6.2) mime-types (2.6.2)
minitar (0.5.4) minitar (0.5.4)
multipart-post (2.0.0) multipart-post (2.0.0)
netrc (0.10.3) netrc (0.11.0)
octokit (3.8.0) octokit (3.8.0)
sawyer (~> 0.6.0, >= 0.5.3) sawyer (~> 0.6.0, >= 0.5.3)
polyglot (0.3.5) polyglot (0.3.5)
pry (0.10.2-java) pry (0.10.3-java)
coderay (~> 1.1.0) coderay (~> 1.1.0)
method_source (~> 0.8.1) method_source (~> 0.8.1)
slop (~> 3.4) slop (~> 3.4)
@ -147,6 +153,7 @@ DEPENDENCIES
fpm (~> 1.3.3) fpm (~> 1.3.3)
gems (~> 0.8.3) gems (~> 0.8.3)
logstash-core (= 3.0.0.dev)! logstash-core (= 3.0.0.dev)!
logstash-core-event (= 3.0.0.dev)!
logstash-devutils (~> 0.0.15) logstash-devutils (~> 0.0.15)
octokit (= 3.8.0) octokit (= 3.8.0)
rspec (~> 3.1.0) rspec (~> 3.1.0)


@ -1,2 +0,0 @@
rootProject.name = 'logstash'
include 'logstash-event'


@ -35,6 +35,15 @@ module LogStash
def logstash_gem_home def logstash_gem_home
::File.join(BUNDLE_DIR, ruby_engine, gem_ruby_version) ::File.join(BUNDLE_DIR, ruby_engine, gem_ruby_version)
end end
def vendor_path(path)
return ::File.join(LOGSTASH_HOME, "vendor", path)
end
def pattern_path(path)
return ::File.join(LOGSTASH_HOME, "patterns", path)
end
end end
end end


@ -1,45 +0,0 @@
begin
require 'ant'
rescue
puts("error: unable to load Ant, make sure Ant is installed, in your PATH and $ANT_HOME is defined properly")
puts("\nerror details:\n#{$!}")
exit(1)
end
BASEDIR = File.expand_path("../../../", __FILE__)
TARGET = File.join(BASEDIR, "out/production/main/")
SRC = File.join(BASEDIR, "src/main/java")
JAR = File.join(BASEDIR, "lib/jruby_event/jruby_event.jar")
task :setup do
ant.mkdir 'dir' => TARGET
ant.path 'id' => 'classpath' do
fileset 'dir' => TARGET
end
end
desc "compile Java classes"
task :build => [:setup] do |t, args|
require 'jruby/jrubyc'
ant.javac(
:srcdir => SRC,
:destdir => TARGET,
:classpathref => "classpath",
:debug => true,
:includeantruntime => "no",
:verbose => false,
:listfiles => true,
:source => "1.7",
:target => "1.7",
) {}
end
task :setup_jar do
ant.delete 'file' => JAR
end
desc "build the jar"
task :jar => [:setup_jar, :build] do
ant.jar :basedir => TARGET, :destfile => JAR, :includes => "**/*.class"
end


@ -1,10 +0,0 @@
Gem::Specification.new do |s|
s.name = "jruby_event"
s.version = "0.0.1"
s.summary = 'A helper Gem for using the Docker API'
s.description = 'This gem is intended to aid in using Docker images and containers, specifically with regards to integration testing in RSpec.'
s.authors = ['Aaron Mildenstein', 'Tal Levy']
s.email = 'aaron@mildensteins.com'
s.homepage = 'http://github.com/untergeek/longshoreman'
s.licenses = ['Apache License (2.0)']
s.require_paths = ['lib']

Binary file not shown.


@ -1,7 +0,0 @@
# encoding: utf-8
require "java"
require "cabin"
require_relative "../../java/logstash-event/build/libs/logstash-event-all.jar"
require "jruby_event_ext"
require "jruby_timestamp_ext"


@ -1,2 +0,0 @@
# encoding: utf-8
require "logstash/event"


@ -1,36 +0,0 @@
# encoding: utf-8
# Bundler monkey patches
module ::Bundler
# Patch bundler to write a .lock file specific to the version of ruby.
# This keeps MRI/JRuby/RBX from conflicting over the Gemfile.lock updates
module SharedHelpers
def default_lockfile
ruby = "#{LogStash::Environment.ruby_engine}-#{LogStash::Environment.ruby_abi_version}"
Pathname.new("#{default_gemfile}.#{ruby}.lock")
end
end
# Patch to prevent Bundler to save a .bundle/config file in the root
# of the application
class Settings
def set_key(key, value, hash, file)
key = key_for(key)
unless hash[key] == value
hash[key] = value
hash.delete(key) if value.nil?
end
value
end
end
# Add the Bundler.reset! method which has been added in master but is not in 1.7.9.
class << self
unless self.method_defined?("reset!")
def reset!
@definition = nil
end
end
end
end


@ -1,97 +0,0 @@
# encoding: utf-8
require "logstash/environment"
require "logstash/json"
require "forwardable"
require "date"
require "time"
# module LogStash
# class TimestampParserError < StandardError; end
#
# class Timestamp
# extend Forwardable
# include Comparable
#
# def_delegators :@time, :tv_usec, :usec, :year, :iso8601, :to_i, :tv_sec, :to_f, :to_edn, :<=>, :+
#
# attr_reader :time
#
# ISO8601_STRFTIME = "%04d-%02d-%02dT%02d:%02d:%02d.%06d%+03d:00".freeze
# ISO8601_PRECISION = 3
#
# def initialize(time = Time.new)
# @time = time.utc
# end
#
# def self.at(*args)
# Timestamp.new(::Time.at(*args))
# end
#
# def self.parse(*args)
# Timestamp.new(::Time.parse(*args))
# end
#
# def self.now
# Timestamp.new(::Time.now)
# end
#
# # coerce tries different strategies based on the time object class to convert into a Timestamp.
# # @param [String, Time, Timestamp] time the time object to try coerce
# # @return [Timestamp, nil] Timestamp will be returned if successful otherwise nil
# # @raise [TimestampParserError] on String with invalid format
# def self.coerce(time)
# case time
# when String
# LogStash::Timestamp.parse_iso8601(time)
# when LogStash::Timestamp
# time
# when Time
# LogStash::Timestamp.new(time)
# else
# nil
# end
# end
#
# if LogStash::Environment.jruby?
# JODA_ISO8601_PARSER = org.joda.time.format.ISODateTimeFormat.dateTimeParser
# UTC = org.joda.time.DateTimeZone.forID("UTC")
#
# def self.parse_iso8601(t)
# millis = JODA_ISO8601_PARSER.parseMillis(t)
# LogStash::Timestamp.at(millis / 1000, (millis % 1000) * 1000)
# rescue => e
# raise(TimestampParserError, "invalid timestamp string #{t.inspect}, error=#{e.inspect}")
# end
#
# else
#
# def self.parse_iso8601(t)
# # warning, ruby's Time.parse is *really* terrible and slow.
# LogStash::Timestamp.new(::Time.parse(t))
# rescue => e
# raise(TimestampParserError, "invalid timestamp string #{t.inspect}, error=#{e.inspect}")
# end
# end
#
# def utc
# @time.utc # modifies the receiver
# self
# end
# alias_method :gmtime, :utc
#
# def to_json(*args)
# # ignore arguments to respect accepted to_json method signature
# "\"" + to_iso8601 + "\""
# end
# alias_method :inspect, :to_json
#
# def to_iso8601
# @iso8601 ||= @time.iso8601(ISO8601_PRECISION)
# end
# alias_method :to_s, :to_iso8601
#
# def -(value)
# @time - (value.is_a?(Timestamp) ? value.time : value)
# end
# end
# end


@ -1,124 +0,0 @@
# encoding: utf-8
require "logstash/namespace"
require "logstash/util"
require "thread_safe"
# module LogStash::Util
#
# # PathCache is a singleton which globally caches the relation between a field reference and its
# # decomposition into a [key, path array] tuple. For example the field reference [foo][bar][baz]
# # is decomposed into ["baz", ["foo", "bar"]].
# module PathCache
# extend self
#
# # requiring libraries and defining constants is thread safe in JRuby so
# # PathCache::CACHE will be corretly initialized, once, when accessors.rb
# # will be first required
# CACHE = ThreadSafe::Cache.new
#
# def get(field_reference)
# # the "get_or_default(x, nil) || put(x, parse(x))" is ~2x faster than "get || put" because the get call is
# # proxied through the JRuby JavaProxy op_aref method. the correct idiom here would be to use
# # "compute_if_absent(x){parse(x)}" but because of the closure creation, it is ~1.5x slower than
# # "get_or_default || put".
# # this "get_or_default || put" is obviously non-atomic which is not really important here
# # since all threads will set the same value and this cache will stabilize very quickly after the first
# # few events.
# CACHE.get_or_default(field_reference, nil) || CACHE.put(field_reference, parse(field_reference))
# end
#
# def parse(field_reference)
# path = field_reference.split(/[\[\]]/).select{|s| !s.empty?}
# [path.pop, path]
# end
# end
#
# # Accessors uses a lookup table to speedup access of a field reference of the form
# # "[hello][world]" to the underlying store hash into {"hello" => {"world" => "foo"}}
# class Accessors
#
# # @param store [Hash] the backing data store field refereces point to
# def initialize(store)
# @store = store
#
# # @lut is a lookup table between a field reference and a [target, key] tuple
# # where target is the containing Hash or Array for key in @store.
# # this allows us to directly access the containing object for key instead of
# # walking the field reference path into the inner @store objects
# @lut = {}
# end
#
# # @param field_reference [String] the field reference
# # @return [Object] the value in @store for this field reference
# def get(field_reference)
# target, key = lookup(field_reference)
# return nil unless target
# target.is_a?(Array) ? target[key.to_i] : target[key]
# end
#
# # @param field_reference [String] the field reference
# # @param value [Object] the value to set in @store for this field reference
# # @return [Object] the value set
# def set(field_reference, value)
# target, key = lookup_or_create(field_reference)
# target[target.is_a?(Array) ? key.to_i : key] = value
# end
#
# # @param field_reference [String] the field reference to remove
# # @return [Object] the removed value in @store for this field reference
# def del(field_reference)
# target, key = lookup(field_reference)
# return nil unless target
# target.is_a?(Array) ? target.delete_at(key.to_i) : target.delete(key)
# end
#
# # @param field_reference [String] the field reference to test for inclusion in the store
# # @return [Boolean] true if the store contains a value for this field reference
# def include?(field_reference)
# target, key = lookup(field_reference)
# return false unless target
#
# target.is_a?(Array) ? !target[key.to_i].nil? : target.include?(key)
# end
#
# private
#
# # retrieve the [target, key] tuple associated with this field reference
# # @param field_reference [String] the field referece
# # @return [[Object, String]] the [target, key] tuple associated with this field reference
# def lookup(field_reference)
# @lut[field_reference] ||= find_target(field_reference)
# end
#
# # retrieve the [target, key] tuple associated with this field reference and create inner
# # container objects if they do not exists
# # @param field_reference [String] the field referece
# # @return [[Object, String]] the [target, key] tuple associated with this field reference
# def lookup_or_create(field_reference)
# @lut[field_reference] ||= find_or_create_target(field_reference)
# end
#
# # find the target container object in store for this field reference
# # @param field_reference [String] the field referece
# # @return [Object] the target container object in store associated with this field reference
# def find_target(field_reference)
# key, path = PathCache.get(field_reference)
# target = path.inject(@store) do |r, k|
# return nil unless r
# r[r.is_a?(Array) ? k.to_i : k]
# end
# target ? [target, key] : nil
# end
#
# # find the target container object in store for this field reference and create inner
# # container objects if they do not exists
# # @param field_reference [String] the field referece
# # @return [Object] the target container object in store associated with this field reference
# def find_or_create_target(accessor)
# key, path = PathCache.get(accessor)
# target = path.inject(@store) {|r, k| r[r.is_a?(Array) ? k.to_i : k] ||= {}}
# [target, key]
# end
# end # class Accessors
# end # module LogStash::Util


@ -0,0 +1,63 @@
# logstash-core-event-java
## dev install
1- build code with
```
$ cd logstash-core-event-java
$ gradle build
```
A number of warnings are expected; the build should end with:
```
BUILD SUCCESSFUL
```
2- update root logstash `Gemfile` to use this gem with:
```
# gem "logstash-core-event", "x.y.z", :path => "./logstash-core-event"
gem "logstash-core-event-java", "x.y.z", :path => "./logstash-core-event-java"
```
3- update `logstash-core/logstash-core.gemspec` with:
```
# gem.add_runtime_dependency "logstash-core-event", "x.y.z"
gem.add_runtime_dependency "logstash-core-event-java", "x.y.z"
```
4- install:
```
$ bin/bundle
```
5- install core plugins for tests:
```
$ rake test:install-core
```
## specs
```
$ bin/rspec spec
$ bin/rspec logstash-core/spec
$ bin/rspec logstash-core-event/spec
$ bin/rspec logstash-core-event-java/spec
```
or
```
$ rake test:core
```
also
```
$ rake test:plugins
```


@ -10,7 +10,8 @@ buildscript {
} }
} }
allprojects { //allprojects {
repositories { repositories {
mavenLocal() mavenLocal()
mavenCentral() mavenCentral()
@ -22,9 +23,10 @@ allprojects {
} }
} }
} //}
//subprojects { project ->
subprojects { project ->
apply plugin: 'java' apply plugin: 'java'
apply plugin: 'idea' apply plugin: 'idea'
apply plugin: 'com.github.johnrengelman.shadow' apply plugin: 'com.github.johnrengelman.shadow'
@ -87,15 +89,16 @@ subprojects { project ->
compile 'joda-time:joda-time:2.8.2' compile 'joda-time:joda-time:2.8.2'
compile 'com.google.guava:guava:18.0' compile 'com.google.guava:guava:18.0'
compile 'org.slf4j:slf4j-api:1.7.12' compile 'org.slf4j:slf4j-api:1.7.12'
provided 'org.jruby:jruby-core:1.7.21' provided 'org.jruby:jruby-core:1.7.22'
testCompile 'org.testng:testng:6.9.6' testCompile 'org.testng:testng:6.9.6'
testCompile 'org.mockito:mockito-core:1.10.19' testCompile 'org.mockito:mockito-core:1.10.19'
} }
}
//}
// See http://www.gradle.org/docs/current/userguide/gradle_wrapper.html // See http://www.gradle.org/docs/current/userguide/gradle_wrapper.html
task wrapper(type: Wrapper) { task wrapper(type: Wrapper) {
description = 'Install Gradle wrapper' description = 'Install Gradle wrapper'
gradleVersion = '2.3' gradleVersion = '2.7'
} }


@ -0,0 +1 @@
require "logstash-core-event-java/logstash-core-event-java"


@ -0,0 +1,31 @@
# encoding: utf-8
require "java"
module LogStash
end
# TODO: (colin) integrate jar loading with gradle and verify dev vs prod environment setups
# insert all jars in this directory into CLASSPATH
Dir.glob(File.join(File.expand_path("..", __FILE__), "*.jar")).each do |jar|
$CLASSPATH << jar unless $CLASSPATH.include?(jar)
end
# TODO: (colin) correctly handle dev env build/ dir and local jar
# local dev setup
classes_dir = File.expand_path("../../../build/classes/main", __FILE__)
if File.directory?(classes_dir)
# if in local dev setup, add target to classpath
$CLASSPATH << classes_dir unless $CLASSPATH.include?(classes_dir)
else
# otherwise use included jar
raise("TODO build dir not found and no jar file")
end
require "jruby_event_ext"
require "jruby_timestamp_ext"
require "logstash/event"
require "logstash/timestamp"


@ -0,0 +1,8 @@
# encoding: utf-8
# The version of logstash core event java gem.
#
# Note to authors: this should not include dashes because 'gem' barfs if
# you include a dash in the version string.
LOGSTASH_CORE_EVENT_JAVA_VERSION = "3.0.0.dev"


@ -0,0 +1 @@
require "logstash-core-event-java/logstash-core-event-java"


@ -1,14 +1,7 @@
# encoding: utf-8 # encoding: utf-8
require "jruby_event/jruby_event"
# require "time"
# require "date"
# require "cabin"
require "logstash/namespace" require "logstash/namespace"
# require "logstash/util/accessors"
# require "logstash/timestamp"
require "logstash/json" require "logstash/json"
require "logstash/string_interpolation"
# transcient pipeline events for normal in-flow signaling as opposed to # transcient pipeline events for normal in-flow signaling as opposed to
# flow altering exceptions. for now having base classes is adequate and # flow altering exceptions. for now having base classes is adequate and
@ -24,3 +17,8 @@ module LogStash
# LogStash::SHUTDOWN is used by plugins # LogStash::SHUTDOWN is used by plugins
SHUTDOWN = LogStash::ShutdownEvent.new SHUTDOWN = LogStash::ShutdownEvent.new
end end
# for backward compatibility, require "logstash/event" is used a lots of places so let's bootstrap the
# Java code loading from here.
# TODO: (colin) I think we should mass replace require "logstash/event" with require "logstash-core-event"
require "logstash-core-event"


@ -0,0 +1,28 @@
# encoding: utf-8
require "logstash/namespace"
require "logstash-core-event"
module LogStash
class TimestampParserError < StandardError; end
class Timestamp
include Comparable
# TODO (colin) implement in Java
def <=>(other)
self.time <=> other.time
end
# TODO (colin) implement in Java
def +(other)
self.time + other
end
# TODO (colin) implement in Java
def -(value)
self.time - (value.is_a?(Timestamp) ? value.time : value)
end
end
end
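These three operators delegate to the wrapped `time` value, unwrapping another timestamp only for subtraction. The pattern can be sketched with a minimal stand-in class (`MiniTimestamp` is hypothetical, for illustration only; it is not part of this commit):

```ruby
# Minimal stand-in mirroring the delegation pattern above: each operator
# forwards to the underlying Time, unwrapping another timestamp when
# subtracting so (b - a) yields a number of seconds.
MiniTimestamp = Struct.new(:time) do
  include Comparable
  def <=>(other); time <=> other.time; end
  def +(other);   time + other;        end
  def -(value);   time - (value.is_a?(MiniTimestamp) ? value.time : value); end
end

a = MiniTimestamp.new(Time.at(100))
b = MiniTimestamp.new(Time.at(160))
b - a   # 60.0 seconds, since both operands are timestamps
a + 10  # a plain Time, 10 seconds later
a < b   # comparison comes for free via Comparable and <=>
```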


@ -0,0 +1,23 @@
# -*- encoding: utf-8 -*-
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'logstash-core-event-java/version'
Gem::Specification.new do |gem|
gem.authors = ["Jordan Sissel", "Pete Fritchman", "Elasticsearch"]
gem.email = ["jls@semicomplete.com", "petef@databits.net", "info@elasticsearch.com"]
gem.description = %q{The core event component of logstash, the scalable log and event management tool}
gem.summary = %q{logstash-core-event-java - The core event component of logstash}
gem.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
gem.license = "Apache License (2.0)"
gem.files = Dir.glob(["logstash-core-event-java.gemspec", "lib/**/*.rb", "spec/**/*.rb"])
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "logstash-core-event-java"
gem.require_paths = ["lib"]
gem.version = LOGSTASH_CORE_EVENT_JAVA_VERSION
if RUBY_PLATFORM == 'java'
gem.platform = RUBY_PLATFORM
end
end


@ -0,0 +1,2 @@
rootProject.name = 'logstash-core-event-java'


@ -1,26 +1,27 @@
$LOAD_PATH << File.expand_path("../../../lib", __FILE__) # encoding: utf-8
require "jruby_event/jruby_event" require "spec_helper"
require "logstash/util" require "logstash/util"
require "logstash/event" require "logstash/event"
require "json"
TIMESTAMP = "@timestamp" TIMESTAMP = "@timestamp"
describe LogStash::Event do describe LogStash::Event do
context "to_json" do context "to_json" do
it "should serialize snmple values" do it "should serialize simple values" do
e = LogStash::Event.new({"foo" => "bar", "bar" => 1, "baz" => 1.0, TIMESTAMP => "2015-05-28T23:02:05.350Z"}) e = LogStash::Event.new({"foo" => "bar", "bar" => 1, "baz" => 1.0, TIMESTAMP => "2015-05-28T23:02:05.350Z"})
expect(e.to_json).to eq("{\"foo\":\"bar\",\"bar\":1,\"baz\":1.0,\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\"}") expect(JSON.parse(e.to_json)).to eq(JSON.parse("{\"foo\":\"bar\",\"bar\":1,\"baz\":1.0,\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\"}"))
end end
it "should serialize deep hash values" do it "should serialize deep hash values" do
e = LogStash::Event.new({"foo" => {"bar" => 1, "baz" => 1.0, "biz" => "boz"}, TIMESTAMP => "2015-05-28T23:02:05.350Z"}) e = LogStash::Event.new({"foo" => {"bar" => 1, "baz" => 1.0, "biz" => "boz"}, TIMESTAMP => "2015-05-28T23:02:05.350Z"})
expect(e.to_json).to eq("{\"foo\":{\"bar\":1,\"baz\":1.0,\"biz\":\"boz\"},\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\"}") expect(JSON.parse(e.to_json)).to eq(JSON.parse("{\"foo\":{\"bar\":1,\"baz\":1.0,\"biz\":\"boz\"},\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\"}"))
end end
it "should serialize deep array values" do it "should serialize deep array values" do
e = LogStash::Event.new({"foo" => ["bar", 1, 1.0], TIMESTAMP => "2015-05-28T23:02:05.350Z"}) e = LogStash::Event.new({"foo" => ["bar", 1, 1.0], TIMESTAMP => "2015-05-28T23:02:05.350Z"})
expect(e.to_json).to eq("{\"foo\":[\"bar\",1,1.0],\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\"}") expect(JSON.parse(e.to_json)).to eq(JSON.parse("{\"foo\":[\"bar\",1,1.0],\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\"}"))
end end
it "should serialize deep hash from field reference assignments" do it "should serialize deep hash from field reference assignments" do
@ -29,7 +30,7 @@ describe LogStash::Event do
e["bar"] = 1 e["bar"] = 1
e["baz"] = 1.0 e["baz"] = 1.0
e["[fancy][pants][socks]"] = "shoes" e["[fancy][pants][socks]"] = "shoes"
expect(e.to_json).to eq("{\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\",\"foo\":\"bar\",\"bar\":1,\"baz\":1.0,\"fancy\":{\"pants\":{\"socks\":\"shoes\"}}}") expect(JSON.parse(e.to_json)).to eq(JSON.parse("{\"@timestamp\":\"2015-05-28T23:02:05.350Z\",\"@version\":\"1\",\"foo\":\"bar\",\"bar\":1,\"baz\":1.0,\"fancy\":{\"pants\":{\"socks\":\"shoes\"}}}"))
end end
end end
@ -119,7 +120,7 @@ describe LogStash::Event do
end end
context "append" do context "append" do
it "show append" do it "should append" do
event = LogStash::Event.new("message" => "hello world") event = LogStash::Event.new("message" => "hello world")
event.append(LogStash::Event.new("message" => "another thing")) event.append(LogStash::Event.new("message" => "another thing"))
expect(event["message"]).to eq(["hello world", "another thing"]) expect(event["message"]).to eq(["hello world", "another thing"])
@ -129,10 +130,9 @@ describe LogStash::Event do
context "tags" do context "tags" do
it "should tag" do it "should tag" do
event = LogStash::Event.new("message" => "hello world") event = LogStash::Event.new("message" => "hello world")
tag = "foo" expect(event["tags"]).to be_nil
event["tags"] = [] event["tags"] = ["foo"]
event["tags"] << tag unless event["tags"].include?(tag) expect(event["tags"]).to eq(["foo"])
end end
end end
end end
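The spec changes above compare parsed structures instead of raw strings because a JSON serializer is free to emit object keys in any order; only the parsed hashes are reliably equal. A quick illustration:

```ruby
require "json"

# Same object, two key orders: unequal as strings, equal once parsed,
# because Hash equality ignores insertion order.
a = '{"foo":"bar","bar":1}'
b = '{"bar":1,"foo":"bar"}'
a == b                          # string comparison fails
JSON.parse(a) == JSON.parse(b)  # structural comparison succeeds
```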


@ -1,6 +1,7 @@
$LOAD_PATH << File.expand_path("../../../lib", __FILE__) # encoding: utf-8
require "jruby_event/jruby_event" require "spec_helper"
require "logstash/timestamp"
describe LogStash::Timestamp do describe LogStash::Timestamp do
context "constructors" do context "constructors" do
@ -19,7 +20,10 @@ describe LogStash::Timestamp do
expect(t.time.to_i).to eq(now.to_i) expect(t.time.to_i).to eq(now.to_i)
end end
it "should raise exception on invalid format" do
expect{LogStash::Timestamp.new("foobar")}.to raise_error
end
end end
end end


@ -0,0 +1,56 @@
package com.logstash;
import java.util.*;
public final class Cloner {
private Cloner(){}
public static <T> T deep(final T input) {
if (input instanceof Map<?, ?>) {
return (T) deepMap((Map<?, ?>) input);
} else if (input instanceof List<?>) {
return (T) deepList((List<?>) input);
} else if (input instanceof Collection<?>) {
throw new ClassCastException("unexpected Collection type " + input.getClass());
}
return input;
}
private static <E> List<E> deepList(final List<E> list) {
List<E> clone;
if (list instanceof LinkedList<?>) {
clone = new LinkedList<E>();
} else if (list instanceof ArrayList<?>) {
clone = new ArrayList<E>();
} else {
throw new ClassCastException("unexpected List type " + list.getClass());
}
for (E item : list) {
clone.add(deep(item));
}
return clone;
}
private static <K, V> Map<K, V> deepMap(final Map<K, V> map) {
Map<K, V> clone;
if (map instanceof LinkedHashMap<?, ?>) {
clone = new LinkedHashMap<K, V>();
} else if (map instanceof TreeMap<?, ?>) {
clone = new TreeMap<K, V>();
} else if (map instanceof HashMap<?, ?>) {
clone = new HashMap<K, V>();
} else {
throw new ClassCastException("unexpected Map type " + map.getClass());
}
for (Map.Entry<K, V> entry : map.entrySet()) {
clone.put(entry.getKey(), deep(entry.getValue()));
}
return clone;
}
}
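The deep-copy semantics above, where containers are duplicated while scalar leaves pass through, can be illustrated with a plain-Ruby sketch; `deep_clone` here is a hypothetical analogue of `Cloner.deep`, not part of the commit:

```ruby
# Hypothetical Ruby analogue of Cloner.deep: recursively copy Hash/Array
# containers, return scalar leaves unchanged.
def deep_clone(input)
  case input
  when Hash  then input.each_with_object({}) { |(k, v), h| h[k] = deep_clone(v) }
  when Array then input.map { |item| deep_clone(item) }
  else input
  end
end

original = { "tags" => ["a", "b"], "meta" => { "n" => 1 } }
copy = deep_clone(original)
copy["tags"] << "c"
copy["meta"]["n"] = 2
original["tags"]      # still ["a", "b"]: mutating the clone leaves it intact
original["meta"]["n"] # still 1
```

This is why the new `Event#clone` below can drop the `Accessors` reset: the clone gets its own fresh data tree rather than sharing nested containers with the original.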


@ -188,8 +188,10 @@ public class Event implements Cloneable, Serializable {
public Event clone() public Event clone()
throws CloneNotSupportedException throws CloneNotSupportedException
{ {
Event clone = (Event)super.clone(); // Event clone = (Event)super.clone();
clone.setAccessors(new Accessors(clone.getData())); // clone.setAccessors(new Accessors(clone.getData()));
Event clone = new Event(Cloner.deep(getData()));
return clone; return clone;
} }


@ -63,7 +63,7 @@ public class StringInterpolation {
pos = matcher.end(); pos = matcher.end();
} }
if(pos < template.length() - 1) { if(pos <= template.length() - 1) {
compiledTemplate.add(new StaticNode(template.substring(pos))); compiledTemplate.add(new StaticNode(template.substring(pos)));
} }
} }
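The guard change fixes a trailing-character bug: after the last `%{...}` match, `pos` equals `matcher.end()`, so a single remaining static character sits at index `length - 1` and the old strict inequality dropped it. A small sketch of the boundary arithmetic (plain Ruby, values chosen for illustration):

```ruby
# Template "%{name}!": the "%{name}" match ends at index 7, leaving the
# single character "!" at index 7 of an 8-character string.
template = "%{name}!"
pos = 7                                 # matcher.end() after the last match
old_guard = pos <  template.length - 1  # false: the trailing "!" was skipped
new_guard = pos <= template.length - 1  # true: the trailing "!" is appended
```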


@ -5,6 +5,7 @@ import org.codehaus.jackson.map.annotate.JsonSerialize;
import org.jruby.*; import org.jruby.*;
import org.jruby.anno.JRubyClass; import org.jruby.anno.JRubyClass;
import org.jruby.anno.JRubyMethod; import org.jruby.anno.JRubyMethod;
import org.jruby.exceptions.RaiseException;
import org.jruby.javasupport.JavaUtil; import org.jruby.javasupport.JavaUtil;
import org.jruby.runtime.Arity; import org.jruby.runtime.Arity;
import org.jruby.runtime.ObjectAllocator; import org.jruby.runtime.ObjectAllocator;
@ -79,6 +80,18 @@ public class JrubyTimestampExtLibrary implements Library {
this.timestamp = new Timestamp(); this.timestamp = new Timestamp();
} else if (time instanceof RubyTime) { } else if (time instanceof RubyTime) {
this.timestamp = new Timestamp(((RubyTime)time).getDateTime()); this.timestamp = new Timestamp(((RubyTime)time).getDateTime());
} else if (time instanceof RubyString) {
try {
this.timestamp = new Timestamp(((RubyString) time).toString());
} catch (IllegalArgumentException e) {
throw new RaiseException(
getRuntime(),
getRuntime().getModule("LogStash").getClass("TimestampParserError"),
"invalid timestamp string format " + time,
true
);
}
} else { } else {
throw context.runtime.newTypeError("wrong argument type " + time.getMetaClass() + " (expected Time)"); throw context.runtime.newTypeError("wrong argument type " + time.getMetaClass() + " (expected Time)");
} }
@ -121,6 +134,59 @@ public class JrubyTimestampExtLibrary implements Library {
return RubyString.newString(context.runtime, "\"" + this.timestamp.toIso8601() + "\""); return RubyString.newString(context.runtime, "\"" + this.timestamp.toIso8601() + "\"");
} }
public static Timestamp newTimetsamp(IRubyObject time)
{
if (time.isNil()) {
return new Timestamp();
} else if (time instanceof RubyTime) {
return new Timestamp(((RubyTime)time).getDateTime());
} else if (time instanceof RubyString) {
return new Timestamp(((RubyString) time).toString());
} else if (time instanceof RubyTimestamp) {
return new Timestamp(((RubyTimestamp) time).timestamp);
} else {
return null;
}
}
@JRubyMethod(name = "coerce", required = 1, meta = true)
public static IRubyObject ruby_coerce(ThreadContext context, IRubyObject recv, IRubyObject time)
{
try {
Timestamp ts = newTimetsamp(time);
return (ts == null) ? context.runtime.getNil() : RubyTimestamp.newRubyTimestamp(context.runtime, ts);
} catch (IllegalArgumentException e) {
throw new RaiseException(
context.runtime,
context.runtime.getModule("LogStash").getClass("TimestampParserError"),
"invalid timestamp format " + e.getMessage(),
true
);
}
}
@JRubyMethod(name = "parse_iso8601", required = 1, meta = true)
public static IRubyObject ruby_parse_iso8601(ThreadContext context, IRubyObject recv, IRubyObject time)
{
if (time instanceof RubyString) {
try {
return RubyTimestamp.newRubyTimestamp(context.runtime, newTimetsamp(time));
} catch (IllegalArgumentException e) {
throw new RaiseException(
context.runtime,
context.runtime.getModule("LogStash").getClass("TimestampParserError"),
"invalid timestamp format " + e.getMessage(),
true
);
}
} else {
throw context.runtime.newTypeError("wrong argument type " + time.getMetaClass() + " (expected String)");
}
}
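`ruby_coerce` implements the dispatch previously documented in the Ruby `Timestamp.coerce`: a String parses, a Time wraps, a Timestamp passes through, anything else yields nil, and an unparseable string raises `LogStash::TimestampParserError`. A hedged plain-Ruby mirror of that dispatch (`coerce_sketch` is hypothetical, and raising on bad input is omitted here):

```ruby
require "time"

# Hypothetical mirror of the coerce dispatch above: strings parse as
# ISO8601, Time values pass through, anything else coerces to nil.
def coerce_sketch(value)
  case value
  when String then Time.iso8601(value)
  when Time   then value
  else nil
  end
end

coerce_sketch("2015-10-21T17:34:10Z").to_i  # 1445448850
coerce_sketch(Time.at(0))                   # the Time itself
coerce_sketch(42)                           # nil
```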
@JRubyMethod(name = "at", required = 1, optional = 1, meta = true)
public static IRubyObject ruby_at(ThreadContext context, IRubyObject recv, IRubyObject[] args)
{


@@ -2,6 +2,7 @@ package com.logstash;
import org.joda.time.DateTime;
+import org.joda.time.DateTimeZone;
import org.junit.Test;
import java.io.IOException;
@@ -97,7 +98,7 @@ public class StringInterpolationTest {
Event event = getTestEvent();
String path = "%{+%s}";
StringInterpolation si = StringInterpolation.getInstance();
-assertEquals("1443682800", si.evaluate(event, path));
+assertEquals("1443657600", si.evaluate(event, path));
}

@Test
@@ -132,7 +133,7 @@ public class StringInterpolationTest {
data.put("bar", "foo");
data.put("awesome", "logstash");
data.put("j", inner);
-data.put("@timestamp", new DateTime(2015, 10, 1, 0, 0, 0));
+data.put("@timestamp", new DateTime(2015, 10, 1, 0, 0, 0, DateTimeZone.UTC));
Event event = new Event(data);
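The corrected expectation (1443682800 → 1443657600) is exactly the shift from a zone-less local-time interpretation to UTC. A minimal Ruby sketch of the arithmetic, where the `-07:00` offset is an illustrative choice, not something the test specifies:

```ruby
require "time"

# 2015-10-01T00:00:00 pinned to UTC, as the fixed test now forces with
# DateTimeZone.UTC, yields the corrected epoch value.
utc_epoch = Time.utc(2015, 10, 1).to_i # => 1443657600

# A zone-less DateTime falls back to the default zone; the same
# wall-clock time at UTC-7 lands 7 hours later in epoch terms.
offset_epoch = Time.new(2015, 10, 1, 0, 0, 0, "-07:00").to_i # => 1443682800
```

The old expected value 1443682800 is 1443657600 + 7 * 3600, i.e. midnight in a UTC-7 zone, which is why the unfixed test only passed on machines in that zone.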


@@ -0,0 +1 @@
require "logstash-core-event/logstash-core-event"


@@ -0,0 +1,5 @@
# encoding: utf-8
module LogStash
end
require "logstash/event"


@@ -0,0 +1,8 @@
# encoding: utf-8
# The version of logstash core event gem.
#
# Note to authors: this should not include dashes because 'gem' barfs if
# you include a dash in the version string.
LOGSTASH_CORE_EVENT_VERSION = "3.0.0.dev"


@@ -0,0 +1,275 @@
# encoding: utf-8
require "time"
require "date"
require "cabin"
require "logstash/namespace"
require "logstash/util/accessors"
require "logstash/timestamp"
require "logstash/json"
require "logstash/string_interpolation"
# transient pipeline events for normal in-flow signaling as opposed to
# flow altering exceptions. for now having base classes is adequate and
# in the future it might be necessary to refactor using something like a
# BaseEvent class to have a common interface for all pipeline events to
# support eventual queueing persistence for example, TBD.
class LogStash::ShutdownEvent; end
class LogStash::FlushEvent; end
module LogStash
FLUSH = LogStash::FlushEvent.new
# LogStash::SHUTDOWN is used by plugins
SHUTDOWN = LogStash::ShutdownEvent.new
end
# the logstash event object.
#
# An event is simply a tuple of (timestamp, data).
# The 'timestamp' is an ISO8601 timestamp. Data is anything - any message,
# context, references, etc that are relevant to this event.
#
# Internally, this is represented as a hash with only two guaranteed fields.
#
# * "@timestamp" - an ISO8601 timestamp representing the time the event
# occurred at.
# * "@version" - the version of the schema. Currently "1"
#
# They are prefixed with an "@" symbol to avoid clashing with your
# own custom fields.
#
# When serialized, this is represented in JSON. For example:
#
# {
# "@timestamp": "2013-02-09T20:39:26.234Z",
# "@version": "1",
# message: "hello world"
# }
class LogStash::Event
class DeprecatedMethod < StandardError; end
CHAR_PLUS = "+"
TIMESTAMP = "@timestamp"
VERSION = "@version"
VERSION_ONE = "1"
TIMESTAMP_FAILURE_TAG = "_timestampparsefailure"
TIMESTAMP_FAILURE_FIELD = "_@timestamp"
METADATA = "@metadata".freeze
METADATA_BRACKETS = "[#{METADATA}]".freeze
# Floats outside of these upper and lower bounds are forcibly converted
# to scientific notation by Float#to_s
MIN_FLOAT_BEFORE_SCI_NOT = 0.0001
MAX_FLOAT_BEFORE_SCI_NOT = 1000000000000000.0
LOGGER = Cabin::Channel.get(LogStash)
public
def initialize(data = {})
@cancelled = false
@data = data
@accessors = LogStash::Util::Accessors.new(data)
@data[VERSION] ||= VERSION_ONE
ts = @data[TIMESTAMP]
@data[TIMESTAMP] = ts ? init_timestamp(ts) : LogStash::Timestamp.now
@metadata = @data.delete(METADATA) || {}
@metadata_accessors = LogStash::Util::Accessors.new(@metadata)
end # def initialize
public
def cancel
@cancelled = true
end # def cancel
public
def uncancel
@cancelled = false
end # def uncancel
public
def cancelled?
return @cancelled
end # def cancelled?
# Create a deep-ish copy of this event.
public
def clone
copy = {}
@data.each do |k,v|
# TODO(sissel): Recurse if this is a hash/array?
copy[k] = begin v.clone rescue v end
end
return self.class.new(copy)
end # def clone
public
def to_s
self.sprintf("#{timestamp.to_iso8601} %{host} %{message}")
end # def to_s
public
def timestamp; return @data[TIMESTAMP]; end # def timestamp
def timestamp=(val); return @data[TIMESTAMP] = val; end # def timestamp=
def unix_timestamp
raise DeprecatedMethod
end # def unix_timestamp
def ruby_timestamp
raise DeprecatedMethod
end # def ruby_timestamp
public
def [](fieldref)
if fieldref.start_with?(METADATA_BRACKETS)
@metadata_accessors.get(fieldref[METADATA_BRACKETS.length .. -1])
elsif fieldref == METADATA
@metadata
else
@accessors.get(fieldref)
end
end # def []
public
def []=(fieldref, value)
if fieldref == TIMESTAMP && !value.is_a?(LogStash::Timestamp)
raise TypeError, "The field '@timestamp' must be a LogStash::Timestamp, not a #{value.class} (#{value})"
end
if fieldref.start_with?(METADATA_BRACKETS)
@metadata_accessors.set(fieldref[METADATA_BRACKETS.length .. -1], value)
elsif fieldref == METADATA
@metadata = value
@metadata_accessors = LogStash::Util::Accessors.new(@metadata)
else
@accessors.set(fieldref, value)
end
end # def []=
public
def fields
raise DeprecatedMethod
end
public
def to_json(*args)
# ignore arguments to respect accepted to_json method signature
LogStash::Json.dump(@data)
end # def to_json
public
def to_hash
@data
end # def to_hash
public
def overwrite(event)
# pickup new event @data and also pickup @accessors
# otherwise it will be pointing on previous data
@data = event.instance_variable_get(:@data)
@accessors = event.instance_variable_get(:@accessors)
#convert timestamp if it is a String
if @data[TIMESTAMP].is_a?(String)
@data[TIMESTAMP] = LogStash::Timestamp.parse_iso8601(@data[TIMESTAMP])
end
end
public
def include?(fieldref)
if fieldref.start_with?(METADATA_BRACKETS)
@metadata_accessors.include?(fieldref[METADATA_BRACKETS.length .. -1])
elsif fieldref == METADATA
true
else
@accessors.include?(fieldref)
end
end # def include?
# Append an event to this one.
public
def append(event)
# non-destructively merge that event with ourselves.
# no need to reset @accessors here because merging will not disrupt any existing field paths
# and if new ones are created they will be picked up.
LogStash::Util.hash_merge(@data, event.to_hash)
end # append
# Remove a field or field reference. Returns the value of that field when
# deleted
public
def remove(fieldref)
@accessors.del(fieldref)
end # def remove
# sprintf. This could use a better method name.
# The idea is to take an event and convert it to a string based on
# any format values, delimited by %{foo} where 'foo' is a field or
# metadata member.
#
# For example, if the event has type == "foo" and host == "bar"
# then this string:
# "type is %{type} and source is %{host}"
# will return
# "type is foo and source is bar"
#
# If a %{name} value is an array, then we will join by ','
# If a %{name} value does not exist, then no substitution occurs.
public
def sprintf(format)
LogStash::StringInterpolation.evaluate(self, format)
end
def tag(value)
# Generalize this method for more usability
self["tags"] ||= []
self["tags"] << value unless self["tags"].include?(value)
end
private
def init_timestamp(o)
begin
timestamp = LogStash::Timestamp.coerce(o)
return timestamp if timestamp
LOGGER.warn("Unrecognized #{TIMESTAMP} value, setting current time to #{TIMESTAMP}, original in #{TIMESTAMP_FAILURE_FIELD} field", :value => o.inspect)
rescue LogStash::TimestampParserError => e
LOGGER.warn("Error parsing #{TIMESTAMP} string, setting current time to #{TIMESTAMP}, original in #{TIMESTAMP_FAILURE_FIELD} field", :value => o.inspect, :exception => e.message)
end
@data["tags"] ||= []
@data["tags"] << TIMESTAMP_FAILURE_TAG unless @data["tags"].include?(TIMESTAMP_FAILURE_TAG)
@data[TIMESTAMP_FAILURE_FIELD] = o
LogStash::Timestamp.now
end
public
def to_hash_with_metadata
@metadata.empty? ? to_hash : to_hash.merge(METADATA => @metadata)
end
public
def to_json_with_metadata(*args)
# ignore arguments to respect accepted to_json method signature
LogStash::Json.dump(to_hash_with_metadata)
end # def to_json
def self.validate_value(value)
case value
when String
raise("expected UTF-8 encoding for value=#{value}, encoding=#{value.encoding.inspect}") unless value.encoding == Encoding::UTF_8
raise("invalid UTF-8 encoding for value=#{value}, encoding=#{value.encoding.inspect}") unless value.valid_encoding?
value
when Array
value.each{|v| validate_value(v)} # don't map, return original object
value
else
value
end
end
end # class LogStash::Event
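The `%{name}` rules documented for `Event#sprintf` (array values joined by ',', unknown references left unsubstituted) can be sketched standalone. This is a hypothetical `interpolate` helper over a plain Hash, covering only plain field references and not the `%{+...}` date patterns; it is not the actual StringInterpolation implementation:

```ruby
# Minimal sketch of the %{field} substitution semantics described in the
# Event#sprintf documentation: arrays join with ",", and a reference to a
# missing field is left untouched. A plain Hash stands in for the event.
def interpolate(event, format)
  format.gsub(/%\{([^}]+)\}/) do |token|
    name = Regexp.last_match(1)
    if event.key?(name)
      value = event[name]
      value.is_a?(Array) ? value.join(",") : value.to_s
    else
      token # no substitution for unknown fields
    end
  end
end

event = { "type" => "foo", "host" => "bar", "tags" => ["a", "b"] }
interpolate(event, "type is %{type} and source is %{host}")
# => "type is foo and source is bar"
```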


@@ -0,0 +1,97 @@
# encoding: utf-8
require "logstash/environment"
require "logstash/json"
require "forwardable"
require "date"
require "time"
module LogStash
class TimestampParserError < StandardError; end
class Timestamp
extend Forwardable
include Comparable
def_delegators :@time, :tv_usec, :usec, :year, :iso8601, :to_i, :tv_sec, :to_f, :to_edn, :<=>, :+
attr_reader :time
ISO8601_STRFTIME = "%04d-%02d-%02dT%02d:%02d:%02d.%06d%+03d:00".freeze
ISO8601_PRECISION = 3
def initialize(time = Time.new)
@time = time.utc
end
def self.at(*args)
Timestamp.new(::Time.at(*args))
end
def self.parse(*args)
Timestamp.new(::Time.parse(*args))
end
def self.now
Timestamp.new(::Time.now)
end
# coerce tries different strategies based on the time object class to convert into a Timestamp.
# @param [String, Time, Timestamp] time the time object to try coerce
# @return [Timestamp, nil] Timestamp will be returned if successful otherwise nil
# @raise [TimestampParserError] on String with invalid format
def self.coerce(time)
case time
when String
LogStash::Timestamp.parse_iso8601(time)
when LogStash::Timestamp
time
when Time
LogStash::Timestamp.new(time)
else
nil
end
end
if LogStash::Environment.jruby?
JODA_ISO8601_PARSER = org.joda.time.format.ISODateTimeFormat.dateTimeParser
UTC = org.joda.time.DateTimeZone.forID("UTC")
def self.parse_iso8601(t)
millis = JODA_ISO8601_PARSER.parseMillis(t)
LogStash::Timestamp.at(millis / 1000, (millis % 1000) * 1000)
rescue => e
raise(TimestampParserError, "invalid timestamp string #{t.inspect}, error=#{e.inspect}")
end
else
def self.parse_iso8601(t)
# warning, ruby's Time.parse is *really* terrible and slow.
LogStash::Timestamp.new(::Time.parse(t))
rescue => e
raise(TimestampParserError, "invalid timestamp string #{t.inspect}, error=#{e.inspect}")
end
end
def utc
@time.utc # modifies the receiver
self
end
alias_method :gmtime, :utc
def to_json(*args)
# ignore arguments to respect accepted to_json method signature
"\"" + to_iso8601 + "\""
end
alias_method :inspect, :to_json
def to_iso8601
@iso8601 ||= @time.iso8601(ISO8601_PRECISION)
end
alias_method :to_s, :to_iso8601
def -(value)
@time - (value.is_a?(Timestamp) ? value.time : value)
end
end
end
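The coercion strategy above (String parsed as ISO8601, Timestamp and Time passed through, anything else returning nil, invalid strings raising) can be sketched with only stdlib `Time`. `coerce_time` is a hypothetical stand-in, not `LogStash::Timestamp.coerce` itself, and omits the JRuby/Joda fast path:

```ruby
require "time"

class TimestampParserError < StandardError; end

# Sketch of the Timestamp.coerce dispatch: ISO8601 strings are parsed,
# Time objects pass through, anything else yields nil, and an
# unparseable string raises TimestampParserError.
def coerce_time(obj)
  case obj
  when String
    begin
      Time.iso8601(obj).utc
    rescue ArgumentError => e
      raise TimestampParserError, "invalid timestamp string #{obj.inspect}, error=#{e.inspect}"
    end
  when Time
    obj.utc
  else
    nil
  end
end
```

This mirrors why `Event#init_timestamp` can treat a nil return as "unrecognized value" while rescuing the parser error separately for bad strings.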


@@ -0,0 +1,123 @@
# encoding: utf-8
require "logstash/namespace"
require "logstash/util"
require "thread_safe"
module LogStash::Util
# PathCache is a singleton which globally caches the relation between a field reference and its
# decomposition into a [key, path array] tuple. For example the field reference [foo][bar][baz]
# is decomposed into ["baz", ["foo", "bar"]].
module PathCache
extend self
# requiring libraries and defining constants is thread safe in JRuby so
# PathCache::CACHE will be correctly initialized, once, when accessors.rb
# is first required
CACHE = ThreadSafe::Cache.new
def get(field_reference)
# the "get_or_default(x, nil) || put(x, parse(x))" is ~2x faster than "get || put" because the get call is
# proxied through the JRuby JavaProxy op_aref method. the correct idiom here would be to use
# "compute_if_absent(x){parse(x)}" but because of the closure creation, it is ~1.5x slower than
# "get_or_default || put".
# this "get_or_default || put" is obviously non-atomic which is not really important here
# since all threads will set the same value and this cache will stabilize very quickly after the first
# few events.
CACHE.get_or_default(field_reference, nil) || CACHE.put(field_reference, parse(field_reference))
end
def parse(field_reference)
path = field_reference.split(/[\[\]]/).select{|s| !s.empty?}
[path.pop, path]
end
end
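The decomposition described in the PathCache comment can be exercised standalone; `parse_field_reference` below reproduces the one-line `parse` logic outside the module:

```ruby
# Mirrors PathCache.parse: split a field reference on brackets, drop the
# empty fragments, and return the [key, path] tuple, e.g.
# "[foo][bar][baz]" decomposes into ["baz", ["foo", "bar"]].
def parse_field_reference(field_reference)
  path = field_reference.split(/[\[\]]/).select { |s| !s.empty? }
  [path.pop, path]
end
```

A bare reference like `"message"` decomposes to `["message", []]`, which is why the Accessors lookup below can treat top-level and nested fields uniformly.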
# Accessors uses a lookup table to speedup access of a field reference of the form
# "[hello][world]" to the underlying store hash into {"hello" => {"world" => "foo"}}
class Accessors
# @param store [Hash] the backing data store field references point to
def initialize(store)
@store = store
# @lut is a lookup table between a field reference and a [target, key] tuple
# where target is the containing Hash or Array for key in @store.
# this allows us to directly access the containing object for key instead of
# walking the field reference path into the inner @store objects
@lut = {}
end
# @param field_reference [String] the field reference
# @return [Object] the value in @store for this field reference
def get(field_reference)
target, key = lookup(field_reference)
return nil unless target
target.is_a?(Array) ? target[key.to_i] : target[key]
end
# @param field_reference [String] the field reference
# @param value [Object] the value to set in @store for this field reference
# @return [Object] the value set
def set(field_reference, value)
target, key = lookup_or_create(field_reference)
target[target.is_a?(Array) ? key.to_i : key] = value
end
# @param field_reference [String] the field reference to remove
# @return [Object] the removed value in @store for this field reference
def del(field_reference)
target, key = lookup(field_reference)
return nil unless target
target.is_a?(Array) ? target.delete_at(key.to_i) : target.delete(key)
end
# @param field_reference [String] the field reference to test for inclusion in the store
# @return [Boolean] true if the store contains a value for this field reference
def include?(field_reference)
target, key = lookup(field_reference)
return false unless target
target.is_a?(Array) ? !target[key.to_i].nil? : target.include?(key)
end
private
# retrieve the [target, key] tuple associated with this field reference
# @param field_reference [String] the field reference
# @return [[Object, String]] the [target, key] tuple associated with this field reference
def lookup(field_reference)
@lut[field_reference] ||= find_target(field_reference)
end
# retrieve the [target, key] tuple associated with this field reference and create inner
# container objects if they do not exist
# @param field_reference [String] the field reference
# @return [[Object, String]] the [target, key] tuple associated with this field reference
def lookup_or_create(field_reference)
@lut[field_reference] ||= find_or_create_target(field_reference)
end
# find the target container object in store for this field reference
# @param field_reference [String] the field reference
# @return [Object] the target container object in store associated with this field reference
def find_target(field_reference)
key, path = PathCache.get(field_reference)
target = path.inject(@store) do |r, k|
return nil unless r
r[r.is_a?(Array) ? k.to_i : k]
end
target ? [target, key] : nil
end
# find the target container object in store for this field reference and create inner
# container objects if they do not exist
# @param field_reference [String] the field reference
# @return [Object] the target container object in store associated with this field reference
def find_or_create_target(accessor)
key, path = PathCache.get(accessor)
target = path.inject(@store) {|r, k| r[r.is_a?(Array) ? k.to_i : k] ||= {}}
[target, key]
end
end # class Accessors
end # module LogStash::Util
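The lookup behavior can be sketched without the class or its lookup table: walk the path into nested Hash/Array containers, creating inner hashes on write as `find_or_create_target` does with `inject`. `nested_set`/`nested_get` are illustrative helpers over a bare store, not part of the Accessors API:

```ruby
# Decompose "[a][b]" into [key, path], as PathCache.parse does.
def split_path(field_reference)
  path = field_reference.split(/[\[\]]/).select { |s| !s.empty? }
  [path.pop, path]
end

# Walk/create inner containers, then write the key, indexing Arrays by
# integer and Hashes by string, as Accessors#set does.
def nested_set(store, field_reference, value)
  key, path = split_path(field_reference)
  target = path.inject(store) { |r, k| r[r.is_a?(Array) ? k.to_i : k] ||= {} }
  target[target.is_a?(Array) ? key.to_i : key] = value
end

# Walk without creating; a missing intermediate container yields nil.
def nested_get(store, field_reference)
  key, path = split_path(field_reference)
  target = path.inject(store) do |r, k|
    return nil unless r
    r[r.is_a?(Array) ? k.to_i : k]
  end
  target.is_a?(Array) ? target[key.to_i] : (target && target[key])
end
```

The class itself adds the `@lut` memoization on top of this walk so repeated access to the same field reference skips the traversal.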


@@ -0,0 +1,23 @@
# -*- encoding: utf-8 -*-
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'logstash-core-event/version'
Gem::Specification.new do |gem|
gem.authors = ["Jordan Sissel", "Pete Fritchman", "Elasticsearch"]
gem.email = ["jls@semicomplete.com", "petef@databits.net", "info@elasticsearch.com"]
gem.description = %q{The core event component of logstash, the scalable log and event management tool}
gem.summary = %q{logstash-core-event - The core event component of logstash}
gem.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
gem.license = "Apache License (2.0)"
gem.files = Dir.glob(["logstash-core-event.gemspec", "lib/**/*.rb", "spec/**/*.rb"])
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "logstash-core-event"
gem.require_paths = ["lib"]
gem.version = LOGSTASH_CORE_EVENT_VERSION
if RUBY_PLATFORM == 'java'
gem.platform = RUBY_PLATFORM
end
end


@@ -1,5 +1,6 @@
# encoding: utf-8
require "spec_helper"
+require "json"

describe LogStash::Event do
@@ -184,7 +185,6 @@ describe LogStash::Event do
"type" => "new",
"message" => "foo bar",
)
-
subject.overwrite(new_event)
expect(subject["message"]).to eq("foo bar")
@@ -198,7 +198,7 @@ describe LogStash::Event do
context "#append" do
it "should append strings to an array" do
-what = subject.append(LogStash::Event.new("message" => "another thing"))
+subject.append(LogStash::Event.new("message" => "another thing"))
expect(subject["message"]).to eq([ "hello world", "another thing" ])
end
@@ -241,7 +241,6 @@ describe LogStash::Event do
expect(subject[ "field1" ]).to eq([ "original1", "append1" ])
end
end
-
context "when event field is an array" do
before { subject[ "field1" ] = [ "original1", "original2" ] }
@@ -319,46 +318,45 @@ describe LogStash::Event do
it "should coerce timestamp" do
t = Time.iso8601("2014-06-12T00:12:17.114Z")
-# expect(LogStash::Timestamp).to receive(:coerce).exactly(3).times.and_call_original
expect(LogStash::Event.new("@timestamp" => t).timestamp.to_i).to eq(t.to_i)
expect(LogStash::Event.new("@timestamp" => LogStash::Timestamp.new(t)).timestamp.to_i).to eq(t.to_i)
expect(LogStash::Event.new("@timestamp" => "2014-06-12T00:12:17.114Z").timestamp.to_i).to eq(t.to_i)
end

it "should assign current time when no timestamp" do
-# ts = LogStash::Timestamp.now
-# expect(LogStash::Timestamp).to receive(:now).and_return(ts)
-expect(LogStash::Event.new({}).timestamp.to_i).to be_within(1).of Time.now.to_i
+expect(LogStash::Event.new({}).timestamp.to_i).to be_within(1).of (Time.now.to_i)
end

-it "should tag and warn for invalid value" do
-ts = LogStash::Timestamp.now
-# TODO(talevy): make pass. bridge between error in Java to Ruby
-# expect(LogStash::Timestamp).to receive(:now).twice.and_return(ts)
-# expect(LogStash::Event::LOGGER).to receive(:warn).twice
+it "should tag for invalid value" do
event = LogStash::Event.new("@timestamp" => :foo)
-expect(event.timestamp.to_i).to eq(ts.to_i)
+expect(event.timestamp.to_i).to be_within(1).of Time.now.to_i
expect(event["tags"]).to eq([LogStash::Event::TIMESTAMP_FAILURE_TAG])
expect(event[LogStash::Event::TIMESTAMP_FAILURE_FIELD]).to eq(:foo)
event = LogStash::Event.new("@timestamp" => 666)
-expect(event.timestamp.to_i).to eq(ts.to_i)
+expect(event.timestamp.to_i).to be_within(1).of Time.now.to_i
expect(event["tags"]).to eq([LogStash::Event::TIMESTAMP_FAILURE_TAG])
expect(event[LogStash::Event::TIMESTAMP_FAILURE_FIELD]).to eq(666)
end

-it "should tag and warn for invalid string format" do
-ts = LogStash::Timestamp.now
-# TODO(talevy): make pass. bridge between error in Java to Ruby
-# expect(LogStash::Timestamp).to receive(:now).and_return(ts)
-# expect(LogStash::Event::LOGGER).to receive(:warn)
+it "should warn for invalid value" do
+expect(LogStash::Event::LOGGER).to receive(:warn).twice
+LogStash::Event.new("@timestamp" => :foo)
+LogStash::Event.new("@timestamp" => 666)
+end
+
+it "should tag for invalid string format" do
event = LogStash::Event.new("@timestamp" => "foo")
-expect(event.timestamp.to_i).to eq(ts.to_i)
+expect(event.timestamp.to_i).to be_within(1).of Time.now.to_i
expect(event["tags"]).to eq([LogStash::Event::TIMESTAMP_FAILURE_TAG])
expect(event[LogStash::Event::TIMESTAMP_FAILURE_FIELD]).to eq("foo")
end
+
+it "should warn for invalid string format" do
+expect(LogStash::Event::LOGGER).to receive(:warn)
+LogStash::Event.new("@timestamp" => "foo")
+end
end
context "to_json" do
@@ -369,7 +367,7 @@ describe LogStash::Event do
)
json = new_event.to_json
-expect(json).to eq( "{\"@timestamp\":\"2014-09-23T19:26:15.832Z\",\"@version\":\"1\",\"message\":\"foo bar\"}")
+expect(JSON.parse(json)).to eq( JSON.parse("{\"@timestamp\":\"2014-09-23T19:26:15.832Z\",\"message\":\"foo bar\",\"@version\":\"1\"}"))
end

it "should support to_json and ignore arguments" do
@@ -379,7 +377,7 @@ describe LogStash::Event do
)
json = new_event.to_json(:foo => 1, :bar => "baz")
-expect(json).to eq( "{\"@timestamp\":\"2014-09-23T19:26:15.832Z\",\"@version\":\"1\",\"message\":\"foo bar\"}")
+expect(JSON.parse(json)).to eq( JSON.parse("{\"@timestamp\":\"2014-09-23T19:26:15.832Z\",\"message\":\"foo bar\",\"@version\":\"1\"}"))
end
end


@@ -1,8 +1,17 @@
# encoding: utf-8
require "spec_helper"
-require "logstash/util/accessors"
-describe LogStash::Util::Accessors do
+# this is to skip specs when running against an alternate logstash-core-event implementation
+# that does not define the Accessors class. For example, in logstash-core-event-java
+# the Accessors class does not exist in the Ruby namespace.
+class_exists = begin
+  require "logstash/util/accessors"
+  true
+rescue LoadError
+  false
+end
+
+describe "LogStash::Util::Accessors", :if => class_exists do

context "using simple field" do

@@ -0,0 +1 @@
require "logstash-core/logstash-core"


@@ -1,6 +1,8 @@
# encoding: utf-8
# The version of logstash.
LOGSTASH_VERSION = "3.0.0.dev"
# The version of logstash core gem.
#
# Note to authors: this should not include dashes because 'gem' barfs if
# you include a dash in the version string.
LOGSTASH_CORE_VERSION = "3.0.0.dev"


@@ -1,18 +1,10 @@
# encoding: utf-8
require "logstash/errors"
-require "logstash/version"

module LogStash
module Environment
extend self

-# rehydrate the bootstrap environment if the startup was not done by executing bootstrap.rb
-# and we are in the context of the logstash package
-if !LogStash::Environment.const_defined?("LOGSTASH_HOME") && !ENV["LOGSTASH_HOME"].to_s.empty?
-$LOAD_PATH << ::File.join(ENV["LOGSTASH_HOME"], "lib")
-require "bootstrap/environment"
-end

LOGSTASH_CORE = ::File.expand_path(::File.join(::File.dirname(__FILE__), "..", ".."))
LOGSTASH_ENV = (ENV["LS_ENV"] || 'production').to_s.freeze
@@ -81,14 +73,6 @@ module LogStash
::Gem.win_platform?
end

-def vendor_path(path)
-return ::File.join(LOGSTASH_HOME, "vendor", path)
-end
-
-def pattern_path(path)
-return ::File.join(LOGSTASH_HOME, "patterns", path)
-end

def locales_path(path)
return ::File.join(LOGSTASH_CORE, "locales", path)
end


@@ -178,12 +178,16 @@ class LogStash::Filters::Base < LogStash::Plugin
LogStash::Util::Decorators.add_tags(@add_tag,event,"filters/#{self.class.name}")

+# note below that the tags array field needs to be updated then reassigned to the event.
+# this is important because a construct like event["tags"].delete(tag) will not work
+# in the current Java event implementation. see https://github.com/elastic/logstash/issues/4140
@remove_tag.each do |tag|
-break if event["tags"].nil?
+tags = event["tags"]
+break if tags.nil? || tags.empty?
tag = event.sprintf(tag)
-@logger.debug? and @logger.debug("filters/#{self.class.name}: removing tag",
-:tag => tag)
-event["tags"].delete(tag)
+@logger.debug? and @logger.debug("filters/#{self.class.name}: removing tag", :tag => tag)
+tags.delete(tag)
+event["tags"] = tags
end
end # def filter_matched
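The tag-removal change is a read-modify-reassign pattern: fetch the array once, mutate the local copy, then write it back through `event["tags"] = tags`. A standalone sketch with a plain Hash standing in for the event; note that on a plain Hash in-place deletion would also work, so this only demonstrates the pattern, not the Java-event failure mode it guards against:

```ruby
# Read-modify-reassign, as in the filter_matched diff: mutate a local
# reference to the tags array, then assign it back so implementations
# where event["tags"] is not the backing store still see the change.
def remove_tag(event, tag)
  tags = event["tags"]
  return if tags.nil? || tags.empty?
  tags.delete(tag)
  event["tags"] = tags
end

event = { "tags" => ["foo", "bar"] }
remove_tag(event, "foo")
# event["tags"] is now ["bar"]
```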
