Analysing Apache Log Files with Logstash and Kibana

Logstash doesn’t have to be that complicated. Last week’s example with log files from IIS looked so scary because the fields can vary from one IIS installation to another. But when you want to use logstash to parse a well-known file format, things can be much simpler. Today I will show you the configuration to parse log files from the Apache web server.

This post is part of the Improve Your Log Messages series. You can find the other parts here:

 

Configuration for Apache

A typical entry in the log files of Apache may look like this one:
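For example, an entry in Apache’s combined log format looks like this (the sample line below follows the format documented by the Apache project; the exact fields in your logs depend on the `LogFormat` directive in your Apache configuration):

```
127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"
```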

Since hardly anyone modifies how those messages are created, you can go with one of the default patterns that ship with logstash. All you need to parse an Apache log file is this configuration:
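A minimal configuration could look like the sketch below. It reads the access log with the file input, parses each line with the shipped `COMBINEDAPACHELOG` grok pattern, takes the timestamp from the log entry instead of the time of processing, and sends the result to Elasticsearch (the path is an assumption for a typical Debian/Ubuntu setup; adjust it to your system):

```
input {
  file {
    # Assumed location of the Apache access log — change this to your path
    path => "/var/log/apache2/access.log"
  }
}

filter {
  grok {
    # COMBINEDAPACHELOG is one of the default patterns shipped with logstash
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # Use the timestamp from the log line as the event timestamp
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { }
}
```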

All you have to do is modify the path to your log files. When you run logstash with the agent flag and your configuration, all your log entries are pushed into Elasticsearch:
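Assuming you saved the configuration above as `apache.conf` (the file name is just an example), the invocation looks like this:

```
bin/logstash agent -f apache.conf
```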

Apache logs inside Kibana

 

Next

If all log files were that simple, we would not need to know much about the grok syntax. Next week we will look at how SharePoint writes its log files and what tools you can use to read them.

 

Update 23 Nov 2015

The default patterns are now in the logstash-plugins repository on GitHub.
