Logstash configuration and grok patterns for parsing postfix logging
A set of grok patterns for parsing postfix logging. Also included is a sample Logstash config file that applies the grok patterns as a filter.
Install the grok patterns file in your Logstash patterns directory, and add the sample config file to the Logstash configuration directory (or the `pipeline` dir for dockerized Logstash).
The included Logstash config file requires two fields to exist in input events:

- `program`: the name of the program that generated the log line, e.g. `postfix/smtpd` (called the `tag` in syslog lingo)
- `message`: the log message payload without additional fields (program, pid, etc.), e.g. `connect from 1234.static.ctinets.com[18.104.22.168]`
This event format is supported by the Logstash `syslog` input plugin out of the box, but several other plugins produce input that can be adapted fairly easily to produce these fields too. See ALTERNATIVE INPUTS for details.
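As a minimal sketch (the port number is an arbitrary choice, not a value prescribed by this repository), an input section using the `syslog` plugin could look like this:

```
input {
  syslog {
    # Listen for syslog messages from the postfix host;
    # the plugin parses each line into program, pid and message fields.
    port => 5140
  }
}
```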
An optional aggregation filter is available that combines fields from different log lines. The key on which log lines are aggregated is the postfix queue id. For example:
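The original example did not survive formatting; hypothetically, the filter operates on pairs of lines like the following, which share the queue id `4FB123ABCD` (the queue id, hostnames and addresses here are all made up):

```
postfix/qmgr[1234]: 4FB123ABCD: from=<sender@example.com>, size=1024, nrcpt=1 (queue active)
postfix/smtp[1235]: 4FB123ABCD: to=<user@example.org>, relay=mx.example.org[192.0.2.1]:25, delay=1.2, status=sent (250 ok)
```

The sender parsed from the first line can then be attached to the event generated from the second line, because both carry the same queue id.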
In this example, the `postfix_from` field from a `postfix/qmgr` log line is reused, and added to a later log line with the same queue id.
To use the aggregation filter, add its config file to the Logstash configuration directory as well (or the `pipeline` dir for dockerized Logstash).
In the `test/` directory, there is a test suite that tries to make sure that no previously supported log line breaks because of changes to common patterns and such. It also returns results a lot faster than doing `sudo service logstash restart` :-).
The test suite needs the patterns provided by Logstash; you can easily pull these from GitHub by running `git submodule update --init`. To run the test suite, you also need Ruby 2.2 or higher, and the `minitest` gem. Then simply run the test script in the `test/` directory.
Adding new test cases can easily be done by creating new yaml files in the test directory. Each file specifies a grok pattern to validate, a sample log line, and a list of expected results.
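As a sketch only (the exact keys expected by the test suite may differ; the pattern name and field names below are illustrative), such a yaml file could look like this:

```yaml
# Hypothetical test case: validate the smtpd connect line
pattern: '%{POSTFIX_SMTPD}'
line: 'connect from unknown[192.0.2.1]'
expected:
  postfix_client_hostname: unknown
  postfix_client_ip: 192.0.2.1
```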
Also, the example Logstash config file adds some informative tags that aid in finding grok failures and unparsed lines. If you're not interested in those, you can remove all occurrences of `tag_on_failure` from the config file.
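For reference, `tag_on_failure` is a standard option of Logstash's grok filter. A minimal filter using it could look like this (the pattern name, paths and tag value are illustrative, not taken from the shipped config):

```
filter {
  grok {
    # Directory holding the postfix grok patterns file
    patterns_dir   => ["/etc/logstash/patterns.d"]
    match          => { "message" => "%{POSTFIX_SMTPD}" }
    # Events that fail to match get this tag instead of
    # the default "_grokparsefailure"
    tag_on_failure => ["_grok_postfix_nomatch"]
  }
}
```

Searching for the tag value in Elasticsearch/Kibana then surfaces any lines the patterns failed to parse.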
I only have access to my own log samples, and my setup does not support or use every feature in postfix. If anything is missing, please open a pull request on GitHub. If you're not well versed in regular expressions, it's also fine to submit only samples of unsupported log lines.
Everything in this repository is available under the New (3-clause) BSD license.
I use postfix, logstash, elasticsearch and kibana in order to get everything working. For writing the grok patterns I depend heavily on grokdebug, and I looked a lot at antispin's useful logstash grok patterns.