The IIS log files record all the actions that occur on the web server. As explained last week, you can use Log Parser to filter the events. However, if you use Kibana for all your applications, you would prefer to have the IIS log events there as well. Today I will explain how you can use Logstash to read the log files from IIS and store the events in Elasticsearch.
With Logstash you can parse log files, extract the events and store them in any format that helps you work with them. If you don't like Elasticsearch, you can easily switch to another output and write the events to Redis or MongoDB, or send them out in an e-mail. The documentation explains all the possibilities and options in great detail. If you run into any kind of trouble, you should definitely go there and read this carefully prepared documentation.
This post is part of the Improve Your Log Messages series. You can find the other parts here:
- Part 1: The Missed Opportunities of Log Files
- Part 2: Structured Logging with Serilog
- Part 3: RavenDB as a Sink for Serilog
- Part 4: Seq as a Sink for Serilog
- Part 5: How to Influence the Output of Serilog
- Part 6: Monitor your Application with Seq
- Part 7: Debugging Serilog
- Part 8: Elasticsearch as a Sink for Serilog
- Part 9: Monitor your Applications with Kibana
- Part 10: Closing the Feedback Loop from Log Messages to Knowledge
- Part 11: How To Analyse IIS Log Files
- Part 12: Using Logstash to Analyse IIS Log Files with Kibana
- Part 13: Analysing Apache Log Files with Logstash and Kibana
- Part 14: How to Analyse SharePoint Log Files
Installation
All you need to do to install Logstash is to download and extract the *.zip file from the official web page. Logstash runs on the JVM and, like Elasticsearch, needs a Java runtime and the JAVA_HOME environment variable.
If you get an error message about too long lines or a max request size being reached when you use Logstash with Elasticsearch, try installing Ruby (with the RubyInstaller) and JRuby. That solved those problems for me on two different machines.
A Simple Configuration
Before you try to connect logstash with the IIS logs and Elasticsearch you should check that logstash works as expected. Create a simple configuration file with this content:
```
input {
  stdin { }
}

output {
  stdout { codec => rubydebug }
}
```
Logstash follows the idea of an ETL tool (Extract, Transform, Load) and needs an input, an output and, if you like, a filter to transform the data. This example reads from standard input and writes to standard output using the rubydebug codec.
To run this file, open the command line tool of your choosing, go to the bin folder of the extracted Logstash files and run the agent with this command:
```
C:\logstash-1.4.2\bin>logstash.bat agent -f "C:/simple.conf"
```
Attention: The path to the config file uses / (slash) and not the \ (backslash) as you normally do on Windows.

Attention (update on 6.4.2016): Recent versions switched this behavior and now only work with \ (backslash) in the path on Windows.
Logstash now waits for input. When you enter "hello" and hit enter, you should get an output like this one:
```
hello
{
       "message" => "hello\r",
      "@version" => "1",
    "@timestamp" => "2014-11-22T20:23:15.537Z",
          "host" => "jgraber-win8"
}
```
A Configuration for IIS
The configuration to read from IIS log files and write them to Elasticsearch is a bit more complex. To read the files we need a file input where we specify which files should be read. For the output we need to declare where our Elasticsearch server is and which protocol we want to use.
The main work, however, happens in the filter block. Using grok we can split a text into different parts. The syntax is a bit special, but it's not that hard to understand. The other filters are a nice addition to set the correct time zone and to parse the user agent field.
```
input {
  file {
    #type => "iis"
    path => "C:/logs/*.log"
    start_position => "beginning"
  }
}

filter {
  # ignore log comments
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    # check that fields match your IIS log settings
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}"]
  }

  # set the event timestamp from the log
  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UTC"
  }

  useragent {
    source => "useragent"
    prefix => "browser"
  }

  mutate {
    remove_field => [ "log_timestamp" ]
  }
}

# See documentation for different protocols:
# http://logstash.net/docs/1.4.2/outputs/elasticsearch
output {
  stdout { codec => rubydebug }

  elasticsearch {
    host => "127.0.0.1"
    port => "9200"
    protocol => "http"
  }
}
```
IIS lets you configure how the log files are written. If you modify the fields or change their order, you need to change the fields in the grok section as well.
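To get a feeling for what the grok pattern extracts, here is a rough Python sketch that mirrors it with named regex groups. The sample log line is made up for illustration (the IP addresses, page and timings are invented), and the regex is a simplification of the real grok patterns:

```python
import re

# Simplified equivalent of the grok pattern above: one named group per field.
# %{TIMESTAMP_ISO8601}, %{IPORHOST} etc. are more precise than this sketch.
pattern = re.compile(
    r"(?P<log_timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<site>\S+) (?P<method>\w+) (?P<page>\S+) (?P<querystring>\S+) "
    r"(?P<port>\d+) (?P<username>\S+) (?P<clienthost>\S+) "
    r"(?P<useragent>\S+) (?P<response>\d+) (?P<subresponse>\d+) "
    r"(?P<scstatus>\d+) (?P<time_taken>\d+)"
)

# An invented IIS W3C log line in the default field order.
line = ("2014-11-22 20:23:15 192.168.1.10 GET /index.html - 80 - "
        "10.0.0.5 Mozilla/5.0+(Windows+NT+6.3) 200 0 0 187")

fields = pattern.match(line).groupdict()
print(fields["method"], fields["page"], fields["response"])
```

Just like in grok, every field ends up as a named value on the event; if your IIS log settings add or reorder fields, the pattern has to change accordingly.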
To read the log files you go back to the command line and run this command:
```
logstash.bat agent -f 'C:/Your/Path/To/iis.conf'
```
If everything is configured correctly, you will see which events are detected (thanks to the stdout part in the output) and sent to Elasticsearch. If you open the Kibana interface, you should see your log messages:
Grok Debugger
If you run into any kind of trouble with grok, you should use the Grok Debugger. On this page you can paste a typical log message and your grok pattern to see what is detected. It doesn't show where exactly the error is, but you can figure that out by deleting tokens from the end of the pattern. As soon as your pattern no longer produces an error, the message is parsed and you can look at the resulting structure:
Next
A big configuration file like the one for IIS, with all the transformations in the filter block, is a bit scary. Next week we will look at how simple a Logstash configuration can be when the format of the log messages is well known.
I am really enjoying this series on the ELK stack because I am trying to introduce Elasticsearch in my environment. It helps a lot – thanks!
One question – how would you run logstash in a production environment? Would logstash be running as a service using srvany or similar? AFAIK unlike Elasticsearch, logstash doesn’t come with a service executable. Also does logstash keep track of what’s been processed so far? In other words, what happens when something goes wrong during processing a log file or a set of files? Does logstash know how to “continue”? How do you run logstash in “real time”? for IIS for example?
Hi Jiho,
It’s great when my posts can help. To your questions: Logstash keeps an internal „pointer“ on the messages it has processed. With that it can resume its work after a restart of your machine or after the agent was shut down.
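If you want to control where that pointer is stored, the file input has a sincedb_path setting. The path below is only an example; by default Logstash writes the file to the home directory of the user it runs as:

```
input {
  file {
    path => "C:/logs/*.log"
    # example path; pick a folder the Logstash user can write to
    sincedb_path => "C:/logstash/sincedb"
  }
}
```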
So far I copy the log files to a dedicated server and use Logstash only on this machine. I don’t need the data in real time and prefer to have one single place to update and to install (J)Ruby. That is, however, not an option if you need the data in real time. In this helpful post sbagmeijer shows how to use NSSM (the Non-Sucking Service Manager) to run Logstash as a service. I haven’t tried it myself, but maybe this tool can help you.
Regards,
Johnny
Hi Johnny,
when trying your config I found out it works better with “Etc/UTC” instead of “Etc/UCT” for the timezone field.
Regards
Yves
Hi Yves,
Thanks for the feedback. I’ve fixed the typo.
Regards
Johnny
> Attention: The path to the config file uses / (slash) and not the \ (backslash) as you do normally on Windows.
This is wrong for newer versions. You get an error for forward slash and it works for back.
Hi Luke,
Thanks for the input. I changed the note to reflect the new behavior.
Good article. Don’t forget that the last part (output) needs to be updated in the latest version of LogStash.
output {
stdout { codec => rubydebug }
elasticsearch {
hosts => "127.0.0.1:9200" # <== hosts instead of host
}
}
Use IIS Logs, marvellous
Can I use ASP.NET Performance Counters with LogStash ?
http://geekswithblogs.net/akraus1/archive/2009/05/27/132456.aspx
Is there an issue with different versions of IIS if I am looking at IIS logs to be in logstash ? and will Kibana work with different versions of IIS ?