Elasticsearch as a Sink for Serilog

When multiple systems continuously produce log messages, they accumulate quickly. Combined with a longer retention time, we can easily end up with millions of log messages that need to be stored and searched. A single system like Seq will reach its limits, and we need a bigger, more scalable solution.

Elasticsearch is such a system. Combined with Logstash and Kibana it forms the ELK stack, which can handle massive amounts of data. Even better, it is not limited to Serilog as a producer: you can use Logstash to include logs from legacy or third-party applications like SharePoint and IIS.

This post is part of the Improve Your Log Messages series. You can find the other parts here:

 

Installing Elasticsearch

Elasticsearch is a search server written in Java that stores its data as schema-free JSON documents. Behind the scenes it uses Apache Lucene, which you may already know from RavenDB.
You can use Elasticsearch for free under the terms of the Apache License, buy support from Elasticsearch.com, or use a cloud hosting provider like Facetflow.

For your first steps with Elasticsearch, you can install it on your local machine. All you need is a computer with a Java runtime, a correctly set JAVA_HOME environment variable, and the *.zip file from Elasticsearch.org. After unpacking the file, you only need to start the script bin/elasticsearch.bat and you are done.
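A quick way to verify that the node is running is to call its HTTP endpoint, which by default listens on port 9200:

    curl http://localhost:9200

If everything works, Elasticsearch answers with a small JSON document that contains the node name, cluster name and version.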

 

Using Elasticsearch as Sink

To use Elasticsearch with Serilog you need to install the NuGet package Serilog.Sinks.ElasticSearch:

NuGet package

If you prefer the command line, you can run a single command in the Package Manager Console to install the same package.
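Given the package name above, the command should look like this:

    Install-Package Serilog.Sinks.ElasticSearch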

 

Configure the Sink

Once Elasticsearch is installed and the elasticsearch.bat script is running, you only need a minimal configuration for your logger.
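A minimal sketch could look like the following; it assumes the sink registers a WriteTo.Elasticsearch() extension method (exact names may differ slightly between sink versions):

    using Serilog;

    // Minimal logger: the sink falls back to its default Elasticsearch address
    var log = new LoggerConfiguration()
        .WriteTo.Elasticsearch()
        .CreateLogger();

    log.Information("Hello from Serilog to Elasticsearch");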

This will use localhost:9200 as the default address of your Elasticsearch server. If you use a cloud service or another machine, you can configure the address accordingly.
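For example, pointing the sink at a hosted node might look like this (the URI is a placeholder for your own cluster, and the exact overload and option names depend on the version of the sink you installed):

    using System;
    using Serilog;
    using Serilog.Sinks.Elasticsearch;

    // Placeholder URI: replace with the address of your own Elasticsearch node
    var log = new LoggerConfiguration()
        .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("https://my-cluster.example.com:9200")))
        .CreateLogger();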

 

Query your Messages

Out of the box you don't get a GUI to query Elasticsearch, but there is a powerful HTTP API that you can call with a tool like Postman or curl.
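For example, assuming the sink writes to its default daily logstash-* indices, a search for error events could look like this (the index pattern and field name depend on how you configured the sink):

    curl "http://localhost:9200/logstash-*/_search?q=level:Error&pretty"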

While this works, it's a big step backwards from the simplicity that Seq offers. There is, however, Kibana, a web interface that gives you the same flexibility as Seq. You can use its dashboard to get an overview of what's going on with your applications:

Kibana dashboard

From there you can drill down to the single event that you are looking for:

Kibana filter by field

 

Next

Next week I will explain how you can install and configure Kibana. There are a few pitfalls you need to look out for before you can explore your events and enjoy the great feature set of Kibana.
