Posted 3 Dec 2017
Licensed CPOL

Elasticsearch, Kibana and Docker using .NET Standard 2

Advanced logging in modern .NET applications. A Sunday-morning proof of concept.

Introduction

In this article you'll learn how to set up a development environment that uses Elasticsearch and Kibana for logging purposes. Along the way you will use Docker, at a very basic level, and you will also see how easy it is to use .NET Standard libraries from a Windows Classic Desktop application.

Background

I will demonstrate that integrating Elasticsearch and Kibana into our systems is quite easy. To do so, I'll set up and configure a Docker network and two Docker images, one for Elasticsearch and another for Kibana. This infrastructure will be used by a Windows Classic Desktop console application that performs some randomized logging. Continuing with what I showed in my last article about proper logging in .NET Core, I'm going to configure three logging outputs: the console and, with Serilog, a rolling file sink and an Elasticsearch sink. These sinks are going to be used within a .NET Standard 2 library, which will be referenced by the console application using the latest Microsoft Dependency Injection extensions.

Setting up the infrastructure

First of all, we need Docker installed and properly configured on our machine; I'm using Windows 10 with Docker for Windows. (In case you don't have it, you can find it here: https://www.docker.com/docker-windows ). We only need to run three commands to set everything up with Docker. (This is awesome, isn't it?)

  • docker network create elk-logging-poc --driver=bridge
  • docker run -p 5601:5601 --name kibana -d --network elk-logging-poc kibana 
  • docker run -p 9200:9200 -p 9300:9300 --name elasticsearch -d --network elk-logging-poc elasticsearch

The first one creates a network to be used in this proof of concept, the second runs Kibana, pulling the image from Docker Hub if it doesn't exist locally, and the third one does the same with Elasticsearch.
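To sanity-check the setup (these commands assume Docker is running on your machine, so they are a sketch rather than something to copy blindly), something like the following confirms that the network and both containers are up:

```shell
# The network created in step one should be listed by name
docker network ls --filter name=elk-logging-poc

# Both containers should appear as running
docker ps --filter name=kibana --filter name=elasticsearch

# Elasticsearch should answer on port 9200 with a JSON banner
curl http://localhost:9200
```

If the curl call returns the cluster banner, Kibana on port 5601 should be able to reach Elasticsearch through the shared bridge network.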

Initial Status:

1.- Create Network:

2.- Install and run Kibana:

At this point Kibana is installed, but if we go to localhost:5601 we can see that Kibana is missing Elasticsearch:

3.- Install and run Elasticsearch:

And now Kibana is ready to be configured:

Setting up the Solution

Well, now it is time to set up our solution and, as I said, I am going to use a Windows Classic Desktop console application and a .NET Standard library. So, I'll create a new solution and the WCD console project:

Hint: Although this is a POC, I always try to use robust namespacing. I always keep in mind that a POC can become a prototype and, after that, a product. These things happen.

Now, let's create the .NET Standard library:

And the result will look like:

Console Files:

To save you some time: you need to install the NuGet packages listed in packages.config into the console application, and it is easier to do it manually.

packages.config

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Elasticsearch.Net" version="5.5.0" targetFramework="net462" />
  <package id="Microsoft.DotNet.InternalAbstractions" version="1.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Configuration" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Configuration.Abstractions" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Configuration.Binder" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Configuration.FileExtensions" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Configuration.Json" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.DependencyInjection" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.DependencyInjection.Abstractions" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.DependencyModel" version="1.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.FileProviders.Abstractions" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.FileProviders.Physical" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.FileSystemGlobbing" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Logging" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Logging.Abstractions" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Logging.Console" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Logging.Debug" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Options" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Options.ConfigurationExtensions" version="2.0.0" targetFramework="net462" />
  <package id="Microsoft.Extensions.Primitives" version="2.0.0" targetFramework="net462" />
  <package id="Newtonsoft.Json" version="10.0.1" targetFramework="net462" />
  <package id="Serilog" version="2.5.0" targetFramework="net462" />
  <package id="Serilog.Extensions.Logging" version="2.0.2" targetFramework="net462" />
  <package id="Serilog.Settings.Configuration" version="2.4.0" targetFramework="net462" />
  <package id="Serilog.Sinks.Elasticsearch" version="5.4.0" targetFramework="net462" />
  <package id="Serilog.Sinks.File" version="3.2.0" targetFramework="net462" />
  <package id="Serilog.Sinks.PeriodicBatching" version="2.1.0" targetFramework="net462" />
  <package id="Serilog.Sinks.RollingFile" version="3.3.0" targetFramework="net462" />
  <package id="System.Linq" version="4.1.0" targetFramework="net462" />
  <package id="System.Resources.ResourceManager" version="4.0.1" targetFramework="net462" />
  <package id="System.Runtime" version="4.1.0" targetFramework="net462" />
  <package id="System.Runtime.CompilerServices.Unsafe" version="4.4.0" targetFramework="net462" />
</packages>

appsettings.json

{
  "Logging": {
    "IncludeScopes": true,
 
    "Debug": {
      "LogLevel": {
        "Default": "Critical"
      }
    },
    "Console": {
      "LogLevel": {
        "Microsoft.AspNetCore.Mvc.Razor.Internal": "Warning",
        "Microsoft.AspNetCore.Mvc.Razor.Razor": "Debug",
        "Microsoft.AspNetCore.Mvc.Razor": "Error",
        "Default": "Critical"
      }
    },
    "LogLevel": {
      "Default": "Critical"
    }
  },
  "Serilog": {
    "WriteTo": [
      {
        "Name": "Elasticsearch",
        "Args": {
          "nodeUris": "http://localhost:9200;http://remotehost:9200/",
          "indexFormat": "elk-poc-index-{0:yyyy.MM}",
          "templateName": "myCustomTemplate",
          "typeName": "myCustomLogEventType",
          "pipelineName": "myCustomPipelineName",
          "batchPostingLimit": 50,
          "period": 2000,
          "inlineFields": true,
          "minimumLogEventLevel": "Trace",
          "bufferBaseFilename": "C:/Logs/docker-elk-serilog-web-buffer",
          "bufferFileSizeLimitBytes": 5242880,
          "bufferLogShippingInterval": 5000,
          "connectionGlobalHeaders": "Authorization=Bearer SOME-TOKEN;OtherHeader=OTHER-HEADER-VALUE"
        }
      }
    ],
    "LogFile": "C:/Logs/ElasticSearchPoc.log",
    "MinimumLevel": "Information"
  }
}

Program.cs

using ElasticSearchPoc.Domain.LogProducer;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Serilog;
using System;
 
namespace ElasticSearchPoc.Presentation.WCD.Console
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create service collection
            var serviceCollection = new ServiceCollection();
            ConfigureServices(serviceCollection);
 
            // Create service provider
            var serviceProvider = serviceCollection.BuildServiceProvider();
 
            // Run app (Every execution should create a new RunId)
            serviceProvider.GetService<BasicLogProducer>().Run();
        }
 
        private static void ConfigureServices(IServiceCollection serviceCollection)
        {
 
            // Build configuration
            var configuration = new ConfigurationBuilder()
                .SetBasePath(AppContext.BaseDirectory)
                .AddJsonFile("appsettings.json", false)
                .Build();
 
            // Add console logging
            serviceCollection.AddSingleton(new LoggerFactory()
                .AddConsole(configuration.GetSection("Logging"))
                .AddSerilog()
                .AddDebug());
            serviceCollection.AddLogging();
 
            // Add Serilog logging           
            Log.Logger = new LoggerConfiguration()
                .WriteTo.RollingFile(configuration["Serilog:LogFile"])
                .ReadFrom.Configuration(configuration)
                .CreateLogger();
 
            // Add access to generic IConfigurationRoot
            serviceCollection.AddSingleton(configuration);
 
            // Add the App
            serviceCollection.AddTransient<BasicLogProducer>();
        }
    }
}

I'd like to add a few comments about those files: take a look at the different log levels in appsettings.json and at the Elasticsearch options; settings are key in this POC. I'd also like to emphasize that the source code of our program is clean and SOLID, and I have to publicly thank the .NET Foundation and its contributors for these extremely useful libraries and extensions.

Let's take a look at the files in the .NET Standard library, LogProducer.

LogProducer Files:

As you will see, its only dependency is Microsoft.Extensions.Logging...

BasicLogProducer.cs

using Microsoft.Extensions.Logging;
using System;
using System.Threading;
 
namespace ElasticSearchPoc.Domain.LogProducer
{
    public class BasicLogProducer
    {
        private readonly ILogger<BasicLogProducer> _logger;
 
        public BasicLogProducer(ILogger<BasicLogProducer> logger)
        {
            _logger = logger;
        }
 
        public void Run()
        {
            var runDate = DateTime.Now;
            while (true)
            {
                // Let's randomize our logs...
                Array values = Enum.GetValues(typeof(LogLevel));
                Random random = new Random();
                LogLevel randomLogLevel = (LogLevel)values.GetValue(random.Next(values.Length));
 
                switch (randomLogLevel)
                {
                    case LogLevel.Trace:
                        _logger.LogTrace($"RunDate: {runDate}; Message Id: {Guid.NewGuid()}; LogLevel: Trace; LogLevelValue: {randomLogLevel.ToString("D")}");
                        break;
                    case LogLevel.Debug:
                        _logger.LogDebug($"RunDate: {runDate}; Message Id: {Guid.NewGuid()}; LogLevel: Debug; LogLevelValue: {randomLogLevel.ToString("D")}");
                        break;
                    case LogLevel.Information:
                        _logger.LogInformation($"RunDate: {runDate}; Message Id: {Guid.NewGuid()}; LogLevel: Information; LogLevelValue: {randomLogLevel.ToString("D")}");
                        break;
                    case LogLevel.Warning:
                        _logger.LogWarning($"RunDate: {runDate}; Message Id: {Guid.NewGuid()}; LogLevel: Warning; LogLevelValue: {randomLogLevel.ToString("D")}");
                        break;
                    case LogLevel.Error:
                        _logger.LogError($"RunDate: {runDate}; Message Id: {Guid.NewGuid()}; LogLevel: Error; LogLevelValue: {randomLogLevel.ToString("D")}");
                        break;
                    case LogLevel.Critical:
                        _logger.LogCritical($"RunDate: {runDate}; Message Id: {Guid.NewGuid()}; LogLevel: Critical; LogLevelValue: {randomLogLevel.ToString("D")}");
                        break;
                    case LogLevel.None:
                    default:
                        break;
                }
 
                Thread.Sleep(100);
 
            }
        } 
    }
}

Although I always try to write self-explanatory source code, this class deserves a brief explanation. It uses DI (Dependency Injection): the ILogger<T> is created because the host app registered ILoggerFactory and Serilog, so that logger inherits whatever we configured in the main application, in this case our three sinks: console, file and Elasticsearch. The Run method gets the initial run date and starts generating logs with a random log level on each iteration. It is very basic, but it will do its job.

Configure, Run & Test

I'm not going to give too many details; I guess that with some screenshots you'll want to try it out by yourself. The first thing we have to do is configure the main index pattern in Kibana. As you can see, that is one of the config parameters we have in our appsettings.json file; concretely, I used "indexFormat": "elk-poc-index-{0:yyyy.MM}", so we have to configure the Kibana index pattern as "elk-poc-index-*":
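The actual index name is just that indexFormat pattern filled in with the event's year and month, which is why a wildcard pattern in Kibana catches all of them. A quick sketch of the naming (the commented curl line assumes Elasticsearch is reachable on localhost:9200):

```shell
# The sink writes to one index per month, e.g. elk-poc-index-2017.12
INDEX="elk-poc-index-$(date +%Y.%m)"
echo "$INDEX"

# To list the matching indices directly from Elasticsearch:
# curl "http://localhost:9200/_cat/indices/elk-poc-index-*?v"
```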

Well, let's launch our app and see what happens...

The console is logging only Critical entries, according to the configuration parameters:

We are also logging into a file, as stated in the config file and in our Main method. I'll demonstrate with a few screenshots how to use the best tail tool I know, Tail Blazer. (If you have read this far and you don't know TailBlazer, you must go and try it right now, because you'll love it.)

1.- Plain output:

2.- Clicking on the top-right gear icon we can add some highlights:

3.- And also in the main window we can add some filters:

I'll type "Fatal" to see only the entries with the Fatal keyword:

And what about Elasticsearch and Kibana? Well, while I was taking these screenshots, the console application has been generating logs: in the console (Critical only), in files in the local file system at Information level, as we have just seen through TailBlazer, and to the Elasticsearch sink at Trace level. So, if we go to Kibana and select the Timelion item on the left menu, we should be able to see a graph indicating the number of logs received, something like this:

The graph makes sense: the code is sending around 10 logs per second because of the Thread.Sleep(100). Let's push the application a little and set the sleep to only 10 ms. If I run it again, the timeline looks like this:

And what if I remove the sleep and force my machine to generate as many logs as possible?

Well, the log file grows extremely fast, as expected (blue means recently created; look at the milliseconds in the timestamp):

The CPU reaches 100% (also as expected):

And Kibana receives a peak of 772 logs per second. Not too bad, considering that everything is running on the same machine while I'm debugging and monitoring log files; we could probably push it quite a lot further.
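Those earlier rates line up with simple back-of-the-envelope arithmetic: one log entry per loop iteration means roughly 1000 / sleep_ms entries per second (with no sleep at all, the rate is bounded only by the machine, hence the 772/s peak):

```shell
# Rough logging rate for each Thread.Sleep value used above
for sleep_ms in 100 10; do
  echo "Sleep ${sleep_ms} ms -> ~$((1000 / sleep_ms)) logs/s"
done
```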

And, well, that's all for today. I still don't know how to properly visualize data with Kibana, but the POC finishes here, because it logs as expected.

Points of Interest

  • .NET Standard libraries are the best solution whenever possible, because they are the most reusable choice nowadays.
  • Configuring Elasticsearch and Kibana with Docker takes only three commands.
  • Logging in modern .NET applications offers many possibilities, and we've learned how to:
    • Properly log to the console.
    • Properly log to files, and how to visualize them in real time with TailBlazer.
    • Configure a Serilog sink to log to Elasticsearch.

History

A Sunday POC at home. I probably won't add anything else, apart from some typo corrections or clarifications if requested.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Juan G. Carmona
Technical Lead
Spain
Father, husband, mountaineer, developer and software architect.

I enjoy solving problems and producing high quality software.

