Add SC and EB input TLS support for the Logstash ArcSight module (#10056)

* Add SC and EB input TLS support for the Logstash ArcSight module
Added SSL for the Smart Connector (TCP) input, and SSL and SASL for the
Event Broker input.
Needs verification on a current stack.

* So this is the actual extent of the changes (I thought it was too simple)
Fixed the docs omission
Fixed the load path issue
Fixed the ERB binding context problem
Added some basic happy path tests

* Remove module settings in logstash.yml

* Overwrite my logstash.yml with the content on master

* Add comments to yaml fixtures
Guy Boertje 2018-10-23 09:12:12 +01:00 committed by GitHub
parent 96ae3fc158
commit 3fdee027d9
9 changed files with 399 additions and 27 deletions


@@ -93,7 +93,7 @@
# ------------ Module Settings ---------------
# Define modules here. Modules definitions must be defined as an array.
# The simple way to see this is to prepend each `name` with a `-`, and keep
# all associated variables under the `name` they are associated with, and
# above the next, like this:
#
# modules:
@@ -103,7 +103,7 @@
# var.PLUGINTYPE2.PLUGINNAME1.KEY1: VALUE
# var.PLUGINTYPE3.PLUGINNAME3.KEY1: VALUE
#
# Module variable names must be in the format of
#
# var.PLUGIN_TYPE.PLUGIN_NAME.KEY
#


@@ -12,10 +12,10 @@ and is therefore free to use. Please contact
mailto:arcsight@elastic.co[arcsight@elastic.co] for questions or more
information.

The Logstash ArcSight module enables you to easily integrate your ArcSight data with the Elastic Stack.
With a single command, the module taps directly into the ArcSight Smart Connector or the Event Broker,
parses and indexes the security events into Elasticsearch, and installs a suite of Kibana dashboards
to get you exploring your data immediately.

[[arcsight-prereqs]]
==== Prerequisites
@@ -51,10 +51,10 @@ the Smart Connector directly.
image::static/images/arcsight-diagram-smart-connectors.svg[ArcSight Smart Connector architecture]

Smart Connector has been configured to publish ArcSight data (to TCP port `5000`) using the CEF syslog
destination.

NOTE: Logstash, Elasticsearch, and Kibana must run locally. Note that you can also run
Elasticsearch, Kibana and Logstash on separate hosts to consume data from ArcSight.
[[arcsight-instructions-smartconnector]]
@@ -66,9 +66,9 @@ Logstash install directory with your respective Smart Connector host and port:
["source","shell",subs="attributes"]
-----
bin/logstash --modules arcsight --setup \
-M "arcsight.var.input.smartconnector.bootstrap_servers={smart_connect_host}:{smart_connect_port}" \
-M "arcsight.var.input.smartconnector.port={smart_connect_port}" \
-M "arcsight.var.elasticsearch.hosts=localhost:9200" \
-M "arcsight.var.kibana.host=localhost:5601"
-M "arcsight.var.kibana.host=localhost:5601"
-----
+
--
@@ -105,9 +105,9 @@ image::static/images/arcsight-diagram-adp.svg[ArcSight Event Broker architecture
By default, the Logstash ArcSight module consumes from the Event Broker "eb-cef" topic.
For additional settings, see <<arcsight-module-config>>. Consuming from a
secured Event Broker port is possible, see <<arcsight-module-config>>.

NOTE: Logstash, Elasticsearch, and Kibana must run locally. Note that you can also run
Elasticsearch, Kibana and Logstash on separate hosts to consume data from ArcSight.
[[arcsight-instructions-eventbroker]]
@@ -204,7 +204,7 @@ like in the getting started. For more information about configuring modules, see
As an example, the following settings can be appended to `logstash.yml` to
configure your module:
["source","yaml",subs="attributes"]
["source","yaml",subs="attributes"]
-----
modules:
- name: arcsight
@@ -223,7 +223,7 @@ modules:
The ArcSight module provides the following settings for configuring the behavior
of the module. These settings include ArcSight-specific options plus common
options that are supported by all Logstash modules.

When you override a setting at the command line, remember to prefix the setting
with the module name, for example, `arcsight.var.inputs` instead of `var.inputs`.
@@ -243,6 +243,8 @@ Set the input(s) to expose for the Logstash ArcSight module. Valid settings are
"eventbroker", "smartconnector", or "eventbroker,smartconnector" (exposes both
inputs concurrently).
*ArcSight Module Event Broker specific Options*
*`var.input.eventbroker.bootstrap_servers`*::
+
--
@@ -265,15 +267,181 @@ servers. (You may want more than one in case a server is down.)
+
A list of Event Broker topics to subscribe to.
*`var.input.eventbroker.security_protocol`*::
+
--
* Value can be any of: `PLAINTEXT`, `SSL`, `SASL_PLAINTEXT`, `SASL_SSL`
* Default value is `"PLAINTEXT"`
--
Security protocol to use, which can be any of `PLAINTEXT`, `SSL`, `SASL_PLAINTEXT`, or `SASL_SSL`. If you specify anything other than `PLAINTEXT`, you also need to set some of the options listed below: when specifying `SSL` or `SASL_SSL`, supply values for the options prefixed with `ssl_`; when specifying `SASL_PLAINTEXT` or `SASL_SSL`, supply values for `jaas_path`, `kerberos_config`, `sasl_mechanism`, and `sasl_kerberos_service_name` (a short grouping sketch follows the Event Broker options below).
*`var.input.eventbroker.ssl_key_password`*::
+
--
* Value type is <<password,password>>
* There is no default value for this setting.
--
The password of the private key in the key store file.
*`var.input.eventbroker.ssl_keystore_location`*::
+
--
* Value type is <<path,path>>
* There is no default value for this setting.
--
If client authentication is required, this setting stores the keystore path.
*`var.input.eventbroker.ssl_keystore_password`*::
+
--
* Value type is <<password,password>>
* There is no default value for this setting.
--
If client authentication is required, this setting stores the keystore password.
*`var.input.eventbroker.ssl_keystore_type`*::
+
--
* Value type is <<string,string>>
* There is no default value for this setting.
--
The keystore type.
*`var.input.eventbroker.ssl_truststore_location`*::
+
--
* Value type is <<path,path>>
* There is no default value for this setting.
--
The JKS truststore path to validate the Kafka broker's certificate.
*`var.input.eventbroker.ssl_truststore_password`*::
+
--
* Value type is <<password,password>>
* There is no default value for this setting.
--
The truststore password.
*`var.input.eventbroker.ssl_truststore_type`*::
+
--
* Value type is <<string,string>>
* There is no default value for this setting.
--
The truststore type.
*`var.input.eventbroker.sasl_kerberos_service_name`*::
+
--
* Value type is <<string,string>>
* There is no default value for this setting.
--
The Kerberos principal name that Kafka broker runs as.
This can be defined either in Kafka's JAAS config or in Kafka's config.
*`var.input.eventbroker.sasl_mechanism`*::
+
--
* Value type is <<string,string>>
* Default value is `"GSSAPI"`
--
http://kafka.apache.org/documentation.html#security_sasl[SASL mechanism] used for client connections.
This may be any mechanism for which a security provider is available.
GSSAPI is the default mechanism.
*`var.input.eventbroker.jaas_path`*::
+
--
* Value type is <<path,path>>
* There is no default value for this setting.
--
The Java Authentication and Authorization Service (JAAS) API supplies user authentication and authorization
services for Kafka. This setting provides the path to the JAAS file. Sample JAAS file for Kafka client:
[source,java]
----------------------------------
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
renewTicket=true
serviceName="kafka";
};
----------------------------------
Please note that specifying `jaas_path` and `kerberos_config` here will add these
to the global JVM system properties. This means if you have multiple Kafka inputs,
all of them would be sharing the same `jaas_path` and `kerberos_config`.
If this is not desirable, you would have to run separate instances of Logstash on
different JVM instances.
*`var.input.eventbroker.kerberos_config`*::
+
--
* Value type is <<path,path>>
* There is no default value for this setting.
--
Optional path to the Kerberos config file, in `krb5.conf` format as detailed in https://web.mit.edu/kerberos/krb5-1.12/doc/admin/conf_files/krb5_conf.html
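
As an illustrative aid only (not part of the module itself), the grouping described under
`var.input.eventbroker.security_protocol` can be summarised in a small Ruby hash. The constant
name below is hypothetical, and the keystore entries apply only when the broker requires client
authentication.

[source,ruby]
----
# Hypothetical grouping of which var.input.eventbroker.* options typically
# accompany each security_protocol value, per the descriptions above.
REQUIRED_BY_PROTOCOL = {
  "SSL"            => %w[ssl_truststore_location ssl_truststore_password],
  "SASL_PLAINTEXT" => %w[jaas_path kerberos_config sasl_mechanism sasl_kerberos_service_name],
  "SASL_SSL"       => %w[ssl_truststore_location ssl_truststore_password
                         jaas_path kerberos_config sasl_mechanism sasl_kerberos_service_name]
}
# ssl_keystore_location, ssl_keystore_password and ssl_key_password are added
# on top of the SSL/SASL_SSL rows only when client authentication is required.
p REQUIRED_BY_PROTOCOL
----
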
*ArcSight Module Smart Connector specific Options*
*`var.input.smartconnector.port`*::
+
--
* Value type is <<number,number>>
* Default value is 5000
--
+
The TCP port to listen on when receiving data from Smart Connectors.
*`var.input.smartconnector.ssl_enable`*::
+
--
* Value type is <<boolean,boolean>>
* Default value is `false`
--
Enable SSL (must be set for the other `ssl_` options to take effect). A minimal sketch of these options follows the Smart Connector options below.
*`var.input.smartconnector.ssl_cert`*::
+
--
* Value type is <<path,path>>
* There is no default value for this setting.
--
SSL certificate path.
*`var.input.smartconnector.ssl_extra_chain_certs`*::
+
--
* Value type is <<array,array>>
* Default value is `[]`
--
An Array of paths to extra X509 certificates to be added to the certificate chain.
Useful when the CA chain is not necessary in the system store.
*`var.input.smartconnector.ssl_key`*::
+
--
* Value type is <<path,path>>
* There is no default value for this setting.
--
SSL key path.
*`var.input.smartconnector.ssl_key_passphrase`*::
+
--
* Value type is <<password,password>>
* Default value is `nil`
--
SSL key passphrase.
*`var.input.smartconnector.ssl_verify`*::
+
--
* Value type is <<boolean,boolean>>
* Default value is `true`
--
Verify the identity of the other end of the SSL connection against the CA.
For input, sets the field `sslsubject` to that of the client certificate.
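
As a minimal orientation sketch only (not taken from the module's documentation), the Smart
Connector TLS overrides above can be collected into a Ruby hash of setting-name/value pairs;
the paths and passphrase below are placeholders.

[source,ruby]
----
# Hypothetical minimal Smart Connector TLS overrides; each pair maps to a
# `-M "key=value"` flag or to a logstash.yml entry. Paths and passphrase are fake.
smartconnector_tls = {
  "arcsight.var.input.smartconnector.ssl_enable"         => true,
  "arcsight.var.input.smartconnector.ssl_cert"           => "/path/to/server.crt",
  "arcsight.var.input.smartconnector.ssl_key"            => "/path/to/server.key",
  "arcsight.var.input.smartconnector.ssl_key_passphrase" => "changeme"
}
smartconnector_tls.each { |key, value| puts %(-M "#{key}=#{value}") }
----
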
:smart_connect_host!:
:smart_connect_port!:
:event_broker_host!:


@@ -56,11 +56,25 @@ module LogStash module Modules class LogStashConfig
      get_setting(LogStash::Setting::NullableString.new(name, default.to_s))
    when Numeric
      get_setting(LogStash::Setting::Numeric.new(name, default))
    when true, false
      get_setting(LogStash::Setting::Boolean.new(name, default))
    else
      get_setting(LogStash::Setting::NullableString.new(name, default.to_s))
    end
  end

  def has_setting?(name)
    @settings.key?(name)
  end

  def raw_setting(name)
    @settings[name]
  end

  def fetch_raw_setting(name, default)
    @settings.fetch(name, default)
  end

  def elasticsearch_output_config(type_string = nil)
    hosts = array_to_string(get_setting(LogStash::Setting::SplittableStringArray.new("var.elasticsearch.hosts", String, ["localhost:9200"])))
    index = "#{@name}-#{setting("var.elasticsearch.index_suffix", "%{+YYYY.MM.dd}")}"


@@ -3,7 +3,10 @@
# you may not use this file except in compliance with the Elastic License.
require "logstash/runner" # needed for LogStash::XPACK_PATH
xpack_modules = ["azure", "arcsight"]
xpack_modules.each do |name|
  $LOAD_PATH << File.join(LogStash::XPACK_PATH, "modules", name, "lib")
end
require "logstash/plugins/registry"
require "logstash/modules/util"
require "monitoring/monitoring"
@@ -15,15 +18,14 @@ require "filters/azure_event"
LogStash::PLUGIN_REGISTRY.add(:input, "metrics", LogStash::Inputs::Metrics)
LogStash::PLUGIN_REGISTRY.add(:universal, "monitoring", LogStash::MonitoringExtension)
LogStash::PLUGIN_REGISTRY.add(:universal, "config_management", LogStash::ConfigManagement::Extension)
license_levels = Hash.new
license_levels.default = ["basic", "trial", "standard", "gold", "platinum"]

xpack_modules.each do |name|
  path = File.join(File.dirname(__FILE__), "..", "..", "modules", name, "configuration")
  LogStash::PLUGIN_REGISTRY.add(:modules, name,
    LogStash::Modules::XpackScaffold.new(name, path, license_levels[name]))
end

LogStash::PLUGIN_REGISTRY.add(:filter, "azure_event", LogStash::Filters::AzureEvent)
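
The per-module license levels above use a Hash with a default value, so a module picks up the
full license list unless an entry overrides it. A small stand-alone Ruby sketch of that pattern
(the `some_future_module` key is hypothetical):

[source,ruby]
----
# Sketch of the Hash-with-default pattern: unknown keys return the default list.
license_levels = Hash.new
license_levels.default = ["basic", "trial", "standard", "gold", "platinum"]
license_levels["some_future_module"] = ["platinum"]

p license_levels["arcsight"]            # => ["basic", "trial", "standard", "gold", "platinum"]
p license_levels["some_future_module"]  # => ["platinum"]
----
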


@@ -11,6 +11,7 @@ alias_settings_keys!(
"var.input.kafka" => "var.input.eventbroker",
"var.input.tcp" => "var.input.smartconnector"
})
require 'arcsight_module_config_helper'
%>
input {
@@ -19,6 +20,7 @@ input {
    codec => cef
    bootstrap_servers => <%= csv_string(get_setting(LogStash::Setting::SplittableStringArray.new("var.input.kafka.bootstrap_servers", String, "localhost:39092"))) %>
    topics => <%= array_to_string(get_setting(LogStash::Setting::SplittableStringArray.new("var.input.kafka.topics", String, ["eb-cef"]))) %>
    <%= LogStash::Arcsight::ConfigHelper.kafka_input_ssl_sasl_config(self) %>
    type => syslog
  }
<% end %>
@@ -28,6 +30,7 @@ input {
    # The delimiter config used is for TCP interpretation
    codec => cef { delimiter => "\r\n" }
    port => <%= setting("var.input.tcp.port", 5000) %>
    <%= LogStash::Arcsight::ConfigHelper.tcp_input_ssl_config(self) %>
    type => syslog
  }
<% end %>


@@ -0,0 +1,67 @@
# Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
# or more contributor license agreements. Licensed under the Elastic License;
# you may not use this file except in compliance with the Elastic License.
require 'logstash/namespace'

module LogStash
  module Arcsight
    module ConfigHelper
      extend self

      # Renders the SSL/SASL option lines for the kafka (Event Broker) input.
      # Returns an empty string when security_protocol is left unset.
      def kafka_input_ssl_sasl_config(bound_scope)
        security_protocol = bound_scope.setting("var.input.kafka.security_protocol", "unset")
        return "" if security_protocol == "unset"
        lines = ["security_protocol => '#{security_protocol}'"]
        lines.push("ssl_truststore_type => '#{bound_scope.setting("var.input.kafka.ssl_truststore_type", "")}'")
        ssl_truststore_location = bound_scope.setting("var.input.kafka.ssl_truststore_location","")
        lines.push("ssl_truststore_location => '#{ssl_truststore_location}'") unless ssl_truststore_location.empty?
        ["ssl_truststore_password", "ssl_keystore_password", "ssl_key_password"].each do |name|
          full_name = "var.input.kafka.".concat(name)
          lines.push("#{name} => '#{bound_scope.raw_setting(full_name)}'") if bound_scope.has_setting?(full_name)
        end
        lines.push("ssl_keystore_type => '#{bound_scope.setting("var.input.kafka.ssl_keystore_type", "")}'")
        ssl_keystore_location = bound_scope.setting("var.input.kafka.ssl_keystore_location","")
        lines.push("ssl_keystore_location => '#{ssl_keystore_location}'") unless ssl_keystore_location.empty?
        lines.push("sasl_mechanism => '#{bound_scope.setting("var.input.kafka.sasl_mechanism", "")}'")
        lines.push("sasl_kerberos_service_name => '#{bound_scope.setting("var.input.kafka.sasl_kerberos_service_name", "")}'")
        jaas_path = bound_scope.setting("var.input.kafka.jaas_path","")
        lines.push("jaas_path => '#{jaas_path}'") unless jaas_path.empty?
        kerberos_config = bound_scope.setting("var.input.kafka.kerberos_config","")
        lines.push("kerberos_config => '#{kerberos_config}'") unless kerberos_config.empty?
        lines.compact.join("\n ")
      end

      # Renders the SSL option lines for the tcp (Smart Connector) input.
      # Returns an empty string unless var.input.tcp.ssl_enable is truthy.
      def tcp_input_ssl_config(bound_scope)
        ssl_enabled = bound_scope.setting("var.input.tcp.ssl_enable", false)
        return "" unless ssl_enabled
        lines = ["ssl_enable => true"]
        verify_enabled = bound_scope.setting("var.input.tcp.ssl_verify", true)
        lines.push("ssl_verify => #{verify_enabled}")
        ssl_cert = bound_scope.setting("var.input.tcp.ssl_cert","")
        lines.push("ssl_cert => '#{ssl_cert}'") unless ssl_cert.empty?
        ssl_key = bound_scope.setting("var.input.tcp.ssl_key","")
        lines.push("ssl_key => '#{ssl_key}'") unless ssl_key.empty?
        lines.push("ssl_key_passphrase => '#{ bound_scope.setting("var.input.tcp.ssl_key_passphrase", "")}'")
        certs_array_as_string = bound_scope.array_to_string(
          bound_scope.get_setting(LogStash::Setting::SplittableStringArray.new("var.input.tcp.ssl_extra_chain_certs", String, []))
        )
        lines.push("ssl_extra_chain_certs => #{certs_array_as_string}")
        lines.compact.join("\n ")
      end
    end
  end
end
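
To see the kind of snippet the helper emits, here is a hedged usage sketch. It assumes
logstash-core and the module's `lib` directory are on the load path (both assumptions about
your environment), and it substitutes a tiny duck-typed stand-in for the ERB binding that the
template normally passes in as `self`.

[source,ruby]
----
# Usage sketch only. Assumes logstash-core and x-pack/modules/arcsight/lib are
# on $LOAD_PATH; StubScope is a hypothetical stand-in for the
# LogStash::Modules::LogStashConfig binding the ERB template passes as `self`.
require "logstash/namespace"
require "arcsight_module_config_helper"

class StubScope
  def initialize(values)
    @values = values
  end

  def setting(name, default)
    @values.fetch(name, default)
  end

  def raw_setting(name)
    @values[name]
  end

  def has_setting?(name)
    @values.key?(name)
  end
end

scope = StubScope.new(
  "var.input.kafka.security_protocol"       => "SSL",
  "var.input.kafka.ssl_truststore_location" => "/etc/pki/kafka/truststore.jks",
  "var.input.kafka.ssl_truststore_password" => "changeit"
)

# Prints the option lines that would be spliced into the generated kafka input
# block, e.g. security_protocol => 'SSL', ssl_truststore_location => '...'.
puts LogStash::Arcsight::ConfigHelper.kafka_input_ssl_sasl_config(scope)
----
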


@@ -0,0 +1,81 @@
# encoding: utf-8
# Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
# or more contributor license agreements. Licensed under the Elastic License;
# you may not use this file except in compliance with the Elastic License.
# require "logstash/devutils/rspec/spec_helper"
require 'x-pack/logstash_registry'
require 'logstash-core'
require 'logstash/settings'
require 'logstash/util/modules_setting_array'
require 'logstash/modules/scaffold'
require 'arcsight_module_config_helper'

describe LogStash::Arcsight::ConfigHelper do
  let(:sample_yaml_folder) { ::File.join(File.dirname(__FILE__), "yaml") }
  let(:settings) { settings = LogStash::SETTINGS.clone }
  let(:module_name) { "arcsight" }
  let(:module_hash) { Hash.new }
  let(:scaffolding) { LogStash::Modules::Scaffold.new(module_name, ::File.join(File.dirname(__FILE__), '..', '..', '..', 'modules', 'arcsight', 'configuration')) }
  let(:module_config) { LogStash::Modules::LogStashConfig.new(scaffolding, module_hash) }

  describe "`tcp_input_ssl_config` method" do
    context "when ssl_enabled is not enabled" do
      it "returns an empty string" do
        expect(described_class.tcp_input_ssl_config(module_config)).to be_empty
      end
    end

    context "when ssl_enabled is enabled" do
      before do
        settings.from_yaml(sample_yaml_folder, "smart_connector_with_ssl.yml")
      end
      let(:module_hash) { settings.get("modules").find {|m| m["name"] == module_name} }
      it "returns a tcp input ssl settings snippet" do
        actual = described_class.tcp_input_ssl_config(module_config)
        expect(actual).not_to be_empty
        expect(actual).to include("ssl_enable => true")
        expect(actual).to include("ssl_verify => false")
        expect(actual).to include("ssl_cert => '/some/path/to/a/cert.p12'")
        expect(actual).to include("ssl_key => '/some/path/to/a/cert.p12'")
        expect(actual).to include("ssl_key_passphrase => 'foobar'")
        expect(actual).to include("ssl_extra_chain_certs => ['/some/path/to/a/cert.p12']")
      end
    end
  end

  describe "`kafka_input_ssl_sasl_config` method" do
    context "when security_protocol is not set" do
      it "returns an empty string" do
        expect(described_class.kafka_input_ssl_sasl_config(module_config)).to be_empty
      end
    end

    context "when security_protocol is set" do
      before do
        settings.from_yaml(sample_yaml_folder, "event_broker_with_security.yml")
      end
      let(:module_hash) { settings.get("modules").find {|m| m["name"] == module_name} }
      it "returns a kafka input security settings snippet" do
        actual = described_class.kafka_input_ssl_sasl_config(module_config)
        expect(actual).not_to be_empty
        expect(actual).to include("security_protocol => 'SASL_SSL'")
        expect(actual).to include("ssl_truststore_type => 'foo'")
        expect(actual).to include("ssl_truststore_location => '/some/path/to/a/file'")
        expect(actual).to include("ssl_truststore_password => '<password>'")
        expect(actual).to include("ssl_keystore_password => '<password>'")
        expect(actual).to include("ssl_key_password => '<password>'")
        expect(actual).to include("ssl_keystore_type => 'bar'")
        expect(actual).to include("ssl_keystore_location => '/some/path/to/a/file'")
        expect(actual).to include("sasl_mechanism => 'GSSAPI'")
        expect(actual).to include("sasl_kerberos_service_name => 'baz'")
        expect(actual).to include("jaas_path => '/some/path/to/a/file'")
        expect(actual).to include("kerberos_config => '/some/path/to/a/file'")
      end
    end
  end
end


@@ -0,0 +1,22 @@
modules:
- name: arcsight
  var.inputs: "kafka"
  var.input.kafka.security_protocol: "SASL_SSL"
  var.input.kafka.ssl_truststore_type: "foo"
  var.input.kafka.ssl_truststore_location: "/some/path/to/a/file"
  var.input.kafka.ssl_truststore_password: "foobar"
  var.input.kafka.ssl_keystore_type: "bar"
  var.input.kafka.ssl_keystore_location: "/some/path/to/a/file"
  var.input.kafka.ssl_keystore_password: "barfoo"
  var.input.kafka.ssl_key_password: "foobaz"
  var.input.kafka.sasl_mechanism: "GSSAPI"
  var.input.kafka.sasl_kerberos_service_name: "baz"
  var.input.kafka.jaas_path: "/some/path/to/a/file"
  var.input.kafka.kerberos_config: "/some/path/to/a/file"
  var.input.kafka.bootstrap_servers: "kcluster.local:1234"
  var.input.kafka.topics: ["foo", "bar", "baz"]
# Although this file is named eventbroker..., the arcsight config ERB file
# duplicates the eventbroker-based settings as kafka-based ones; in addition,
# the arcsight_module_config_helper module methods expect kafka-based settings.


@@ -0,0 +1,15 @@
modules:
- name: arcsight
  var.inputs: "tcp"
  var.input.tcp.port: 5050
  var.input.tcp.ssl_enable: true
  var.input.tcp.ssl_cert: "/some/path/to/a/cert.p12"
  var.input.tcp.ssl_extra_chain_certs: ["/some/path/to/a/cert.p12"]
  var.input.tcp.ssl_key: "/some/path/to/a/cert.p12"
  var.input.tcp.ssl_key_passphrase: "foobar"
  var.input.tcp.ssl_verify: false
# Although this file is named smartconnector..., the arcsight config ERB file
# duplicates the smartconnector-based settings as tcp-based ones; in addition,
# the arcsight_module_config_helper module methods expect tcp-based settings.
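
As the comments in both fixtures note, the module's ERB template aliases the user-facing
`eventbroker`/`smartconnector` setting names onto the `kafka`/`tcp` names that the helper
methods expect. A rough, stand-alone Ruby sketch of that duplication follows; the real work is
done by `alias_settings_keys!` in the template, and the function below is purely illustrative.

[source,ruby]
----
# Hypothetical stand-alone illustration of the key aliasing described above;
# not the actual alias_settings_keys! implementation.
def duplicate_aliased_keys(settings, aliases)
  copies = settings.each_with_object({}) do |(key, value), acc|
    aliases.each do |user_facing, internal|
      acc[key.sub(user_facing, internal)] = value if key.start_with?(user_facing)
    end
  end
  settings.merge(copies)
end

aliases = {
  "var.input.eventbroker"    => "var.input.kafka",
  "var.input.smartconnector" => "var.input.tcp"
}
user_settings = { "var.input.eventbroker.security_protocol" => "SASL_SSL" }
p duplicate_aliased_keys(user_settings, aliases)
# => {"var.input.eventbroker.security_protocol"=>"SASL_SSL",
#     "var.input.kafka.security_protocol"=>"SASL_SSL"}
----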