When multiple systems continuously produce log messages, they accumulate quickly. Combined with a longer retention time, we can easily end up with millions of log messages that need to be stored and searched. A single system like Seq will reach its limits, and we need a bigger, more scalable solution.
Elasticsearch is such a system. Combined with Logstash and Kibana into the ELK stack, it can handle massive amounts of data. Even better, it is not limited to Serilog as a producer: you can use Logstash to include your legacy or third-party applications like SharePoint and IIS.
This post is part of the Improve Your Log Messages series. You can find the other parts here:
- Part 1: The Missed Opportunities of Log Files
- Part 2: Structured Logging with Serilog
- Part 3: RavenDB as a Sink for Serilog
- Part 4: Seq as a Sink for Serilog
- Part 5: How to Influence the Output of Serilog
- Part 6: Monitor your Application with Seq
- Part 7: Debugging Serilog
- Part 8: Elasticsearch as a Sink for Serilog
- Part 9: Monitor your Applications with Kibana
- Part 10: Closing the Feedback Loop from Log Messages to Knowledge
- Part 11: How To Analyse IIS Log Files
- Part 12: Using Logstash to Analyse IIS Log Files with Kibana
- Part 13: Analysing Apache Log Files with Logstash and Kibana
- Part 14: How to Analyse SharePoint Log Files
Installing Elasticsearch
Elasticsearch is a search server written in Java that stores its data as schema-free JSON documents. Behind the scenes it uses Apache Lucene, which you may already know from RavenDB.
You can use Elasticsearch for free under the terms of the Apache License, buy support from Elasticsearch.com, or use a cloud hosting provider like Facetflow.
For your first steps with Elasticsearch you can install it on your local machine. All you need is a computer with a Java runtime, a correctly set JAVA_HOME environment variable, and the *.zip file from Elasticsearch.org. After unpacking the file, start the script bin/elasticsearch.bat and you are done.
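The steps above can be sketched as a few commands. This is only a sketch: the archive name and version number are examples, and the curl check at the end assumes a running instance on the default port.

```shell
# A Java runtime and a correctly set JAVA_HOME are the only prerequisites:
echo "JAVA_HOME is: ${JAVA_HOME:-not set}"

# Unpack the downloaded archive and start the server
# (the file name below is an example, use the version you downloaded):
# unzip elasticsearch-1.3.2.zip
# elasticsearch-1.3.2/bin/elasticsearch        # Linux/macOS
# elasticsearch-1.3.2\bin\elasticsearch.bat    # Windows

# Once started, the root endpoint answers with cluster information:
# curl http://localhost:9200/
```

If the last curl call returns a small JSON document with the cluster name and version, your server is up and ready to receive log events.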
Using Elasticsearch as Sink
To use Elasticsearch with Serilog you need to install the NuGet package Serilog.Sinks.ElasticSearch.
If you prefer the command line, you can run this command in the Package Manager Console to install the same package:

```
PM> Install-Package Serilog.Sinks.ElasticSearch
```
Configure the Sink
When Elasticsearch is installed and you have started the elasticsearch.bat script, you only need this minimal configuration for your logger:
```csharp
// Create Logger
Log.Logger = new LoggerConfiguration()
    .WriteTo.ElasticSearch()
    .CreateLogger();
```
This will use localhost:9200 as the default address of your Elasticsearch server. If you use a cloud service or another machine, you can configure it accordingly:
```csharp
// Create URI to cloud service
var facetflow = new Uri("https://**key**@jgraber.east-us.azr.facetflow.io");

// Create Logger
Log.Logger = new LoggerConfiguration()
    .WriteTo.ElasticSearch(server: facetflow)
    .CreateLogger();
```
Query your Messages
Out of the box you don't get a GUI to query Elasticsearch, but there is a powerful HTTP API that you can call with a tool like Postman or curl:
```
curl "http://localhost:9200/logstash-2014.09.21/_search?q=JG"

{
  "took": 19,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 0.081366636,
    "hits": [
      {
        "_index": "logstash-2014.09.21",
        "_type": "logevent",
        "_id": "rFPiv20XS36qBFvytpJoQA",
        "_score": 0.081366636,
        "_source": {
          "@timestamp": "2014-09-21T20:58:03.2287957+02:00",
          "messageTemplate": "Processed order {orderId} by {customer}",
          "level": "Information",
          "message": "Processed order 123 by \"JG - Johnny Graber\"",
          "fields": {
            "orderId": 123,
            "customer": "JG - Johnny Graber"
          }
        }
      }
    ]
  }
}
```
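The URI search above is handy for quick checks, but the HTTP API also accepts a full query as a JSON body. Here is a minimal sketch of that approach; the field name is taken from the sample document above, and the index name is just an example:

```shell
# Write a Query DSL body that matches on the "customer" field
# of the indexed log events (field name from the sample document above)
cat > query.json <<'EOF'
{
  "query": {
    "match": {
      "fields.customer": "JG"
    }
  }
}
EOF

# Sanity-check the body locally before sending it
python3 -m json.tool query.json

# Send it to a running Elasticsearch instance (index name is an example):
# curl -XPOST "http://localhost:9200/logstash-2014.09.21/_search" -d @query.json
```

Separating the query into its own file makes it easy to grow it later with filters, date ranges, or aggregations without fighting shell quoting.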
While this works, it's a big step backwards from the simplicity that Seq offers. There is, however, Kibana, a web interface that gives you the same flexibility as Seq. You can use its dashboard to get an overview of what's going on in your applications:
From there you can drill down to the single event that you are looking for:
Next
Next week I will explain how you can install and configure Kibana. There are a few pitfalls you need to look out for before you can explore your events and enjoy the great feature set of Kibana.