- update to use proper markdown syntax. Not sure why it wasn't updated yet.
Jordan Sissel 2011-10-12 18:47:03 -07:00
parent a158101de4
commit ae7a24b300

@@ -50,7 +50,7 @@ backend; if you want to know more read the elasticsearch docs).
 To start the service, run `bin/elasticsearch`. If you want to run it in the
 foreground, use 'bin/elasticsearch -f'
-== AMQP Broker ==
+## AMQP Broker
 AMQP is a standard for message-based communication. It supports publish-subscribe, queues, etc.
 AMQP is a supported way to ship your logs between servers with logstash.
@@ -69,7 +69,7 @@ you can use, and you'll be ready to go to the next section.
 If you want/need to configure RabbitMQ, seek the rabbitmq docs.
-== grok ==
+## grok
 Site for download and install docs: <http://code.google.com/p/semicomplete/wiki/Grok>
@@ -93,11 +93,9 @@ and tokyocabinet (see above grok/INSTALL url)
 (logstash-%VERSION%.jar, etc) Once you have grok installed, you need to install the
 'jls-grok' rubygem, which you can do by running:
-{{{
-gem install jls-grok
-}}}
+    gem install jls-grok
-== logstash ==
+## logstash
 Once you have elasticsearch and rabbitmq (or any AMQP server) running, you're
 ready to configure logstash.
@@ -113,7 +111,7 @@ agent roles: a shipper and an indexer. You will ship logs from all servers to a
 single AMQP message queue and have another agent receive those messages, parse
 them, and index them in elasticsearch.
-=== logstash log shipper ===
+### logstash log shipper
 You will run this agent on all of the servers you want to collect logs from.
 Here's a good sample config:
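(The sample config itself is unchanged by this commit, so it does not appear in the diff hunk above. For context, a minimal sketch of what such a shipper config might look like in the logstash config language of this era follows; the file paths, the broker hostname, and the amqp output options shown are illustrative assumptions, not lines taken from the diff.)

    # Hypothetical shipper config: tail local logs and ship them to an AMQP exchange.
    input {
      file {
        type => "syslog"
        # Placeholder paths; point these at the logs you care about.
        path => [ "/var/log/messages", "/var/log/syslog" ]
      }
    }

    output {
      # Print events to stdout so you can watch the shipper work.
      stdout { }

      # Ship events to a fanout exchange on your AMQP broker (RabbitMQ here).
      amqp {
        host => "myamqpserver"
        exchange_type => "fanout"
        name => "rawlogs"
      }
    }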
@@ -158,7 +156,7 @@ This should start tailing the file inputs specified above and ships them out
 over amqp. If you included the 'stdout' output, you will see events written to
 stdout as they are found.
-=== logstash indexer ===
+### logstash indexer
 This agent will parse and index your logs as they come in over AMQP. Here's a
 sample config based on the previous section.
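(Again, the sample config lies outside the changed lines, so only the heading shows up in the diff. A rough sketch of an indexer config in the same spirit is below; the amqp input options, the grok pattern, and the date filter settings are assumptions for illustration, so check the plugin docs of your logstash version for the exact names.)

    # Hypothetical indexer config: pull raw events off AMQP, parse them, index them.
    input {
      amqp {
        type => "all"
        host => "myamqpserver"
        # Consume from the same fanout exchange the shipper writes to.
        exchange_type => "fanout"
        name => "rawlogs"
      }
    }

    filter {
      # Pull structure (including a timestamp field) out of raw syslog lines.
      grok {
        type => "syslog"
        pattern => "%{SYSLOGLINE}"
      }

      # Use the parsed timestamp as the event's real timestamp.
      date {
        type => "syslog"
        timestamp => "MMM dd HH:mm:ss"
      }
    }

    output {
      # Assumes a local elasticsearch; in a real setup you would point this at your cluster.
      elasticsearch { }
    }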
@@ -223,7 +221,7 @@ parse them to use as the real timestamp value for the event.
 The above config will take raw logs in over amqp, parse them with grok and date
 filters, and index them into elasticsearch.
-== logstash web interface ==
+## logstash web interface
 Run this on the same server as your elasticsearch server.