Monitoring Java applications with ELK

Monitoring Java applications with ELK (Elasticsearch, Logstash and Kibana) will show you, step by step, how to properly centralize your Java application logs by sending all log messages to a remote ELK server. With this approach you can gather all the information generated by Java applications running across multiple servers in a single, centralized place. From there you can easily build dashboards and start analyzing your applications at a higher, more practical level.

You know it’s sad but true

Let’s think about a very common scenario in many companies: several Java applications running across multiple application servers, each one performing many operations per day and logging thousands upon thousands of lines that nobody checks unless a problem shows up. Sounds familiar, doesn’t it? The biggest issue here is that, unless we are debugging a production problem, those logs have no value at all. They tell us nothing about the aspects we should really care about, such as business process performance. And yet there’s gold within these logs!

How about building a better scenario?

Think back to the sad story I just told you. Now imagine all of your Java applications producing the same amount of logs, but sending them to a centralized place where all the received data is properly analyzed, transformed and finally presented in a truly accessible way. Would you like to know how many payments your system processed in the last minute, day or week? Or how many times a specific exception was thrown? The possibilities are endless.

Let’s see how to achieve this desired scenario using the ELK stack.

Proposed solution

Our proposed solution combines a Java application configured to use Logback (the successor of the famous Log4j), the specialized log appender LogstashTcpSocketAppender (provided by the Logstash team’s logstash-logback-encoder library) and an ELK server.

Tutorial – Monitoring Java applications with ELK

Step 1 – Setup the ELK stack

We have two detailed articles about how to set up the ELK stack on Ubuntu and on Windows; please check them by following the links below:

Step 2 – Configure Logstash to receive our logs

Within the ELK server, create a new configuration file (/etc/logstash/conf.d/logback-listener.conf on Ubuntu 16.04, or D:\ELK\logstash-2.3.4\conf.d\logback-listener.conf on Windows) with the following content:
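A minimal listener along the lines below is enough: a TCP input that decodes the JSON lines sent by the Logback appender, plus an Elasticsearch output. The port 4560 and the index name are only examples; adjust them to your environment and keep the port in sync with the appender we will configure in Step 4.

    input {
      tcp {
        # Port where Logstash listens for the JSON lines sent by LogstashTcpSocketAppender.
        # 4560 is only an example; it must match the appender's destination port.
        port => 4560
        codec => json_lines
      }
    }

    output {
      elasticsearch {
        # Address of the Elasticsearch instance and the index that will store the log entries.
        hosts => ["localhost:9200"]
        index => "logback-%{+YYYY.MM.dd}"
      }
    }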

Don’t forget to restart the Logstash service after editing the file above.
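On Ubuntu 16.04, assuming Logstash was installed from the official APT package, a command like the one below does the job:

    sudo service logstash restart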

Step 3 – Configure your Java application to use SLF4J + Logback

For those who are not familiar with SLF4J, it’s important to highlight that it serves as a simple facade, or abstraction, for various logging frameworks (e.g. java.util.logging, Logback, Log4j), allowing the end user to plug in the desired logging framework at deployment time. Logback, in turn, is a logging framework intended to be the successor of the popular Log4j project (not to be confused with Apache Log4j 2, which is a separate project).

The first thing we need to do is add a few dependencies to our project so we can use SLF4J and Logback.

Let’s modify our pom.xml to include the required dependencies:
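Something along these lines does the trick; the versions below are only examples from the Logstash 2.3.x era, so feel free to use more recent ones:

    <!-- SLF4J API: the logging facade our code will talk to -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.21</version>
    </dependency>

    <!-- Logback: the actual logging implementation bound to SLF4J -->
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>1.1.7</version>
    </dependency>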

Step 4 – Configure LogstashTcpSocketAppender

LogstashTcpSocketAppender is a specialized log appender created by the Logstash team and distributed as a Maven dependency (logstash-logback-encoder). It contains all the logic needed to send your log messages, formatted as JSON, to a remote server over TCP.

Let’s include the logstash-logback-encoder dependency in our pom.xml:
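Again, the version below is only an example; pick the latest one available:

    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>4.7</version>
    </dependency>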

Once that dependency is available on our classpath, we can configure Logback through a logback.xml file. Remember that this file also needs to be on the classpath; in a typical Maven project we place it in the src/main/resources folder.

Time to configure our logback.xml:
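A minimal configuration looks like the sketch below. The hostname elk-server and port 4560 are only placeholders: the destination must point to the Logstash TCP input we created in Step 2.

    <configuration>

        <!-- Ships every log event as JSON to the remote Logstash TCP input -->
        <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
            <!-- Replace elk-server:4560 with your ELK server address and Logstash TCP port -->
            <destination>elk-server:4560</destination>
            <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
        </appender>

        <!-- Optional console appender so we still see the messages locally -->
        <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
            <encoder>
                <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
            </encoder>
        </appender>

        <root level="INFO">
            <appender-ref ref="LOGSTASH" />
            <appender-ref ref="CONSOLE" />
        </root>

    </configuration>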

Step 5 – Add some logging statements

Now we can create a simple class with a few logging statements, placed somewhere you know they will be triggered as soon as you run the application.

For example:
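The PaymentService class below is just an illustration (the class name and the payment scenario are made up to match the earlier example); any class that logs through the SLF4J API will do:

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class PaymentService {

        // Logger obtained through the SLF4J facade; Logback is the implementation behind it
        private static final Logger LOGGER = LoggerFactory.getLogger(PaymentService.class);

        public void processPayment(String orderId) {
            LOGGER.info("Payment started for order {}", orderId);
            try {
                // The real payment logic would live here
                LOGGER.info("Payment completed for order {}", orderId);
            } catch (Exception e) {
                LOGGER.error("Payment failed for order {}", orderId, e);
            }
        }

        public static void main(String[] args) {
            new PaymentService().processPayment("42");
        }
    }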

After adding these statements, execute the application at least once so that log entries are generated and we can see them in the next step.

Step 6 – Check the log entries using Kibana

The last step is to check your log entries using Kibana. Open Kibana in your browser, create an index pattern matching the index used in your Logstash output, and use the Discover page to explore the entries as they arrive.

[Screenshot: checking the generated log entries using Kibana]

[Screenshot: a simple dashboard example]

Conclusion

Monitoring Java applications with ELK is a very interesting approach to improving the visibility of your logs, making them much more useful for developers and non-technical people alike. I’d say the effort to implement this mechanism in your projects is really small compared to the benefits it can bring.