Introduction
A few days back I ran into a simple but painful issue. I use ELK to parse my application logs and build some meaningful views. The problem was that Logstash inserted my logs into Elasticsearch with the current timestamp at processing time, instead of the actual time the log was generated.
This made it a mess to build graphs with correct time values in Kibana.
So I dug around and found a way to solve it: I made some changes in my Logstash configuration to replace Logstash's default timestamp with the actual timestamp of my logs.
Logstash Filter
Add the following snippet to the filter plugin section of your Logstash configuration file. It makes Logstash index logs into Elasticsearch with the actual timestamp of your logs, instead of Logstash's own timestamp (the current time at processing).
date {
  locale => "en"
  timezone => "GMT"                                  # timezone the logs are written in
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss Z" ]  # must mirror the format of the "timestamp" field
}
In my case, the timezone of my logs was GMT. You need to replace the pattern "yyyy-MM-dd HH:mm:ss Z" with one that matches the actual timestamp format of your logs; note that the date filter takes Joda-Time format tokens (MM is the month, mm is the minute, Z is a +0000 style offset), not a regex.
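For completeness, here is a minimal sketch of a full filter block. It assumes hypothetical log lines such as "2017-03-21 14:05:32 +0000 INFO Application started"; the grok pattern, sample format and field names are illustrative only and must be adapted to your own logs.
filter {
  grok {
    # Extract the leading timestamp (including its +0000 offset) into a "timestamp" field
    match => { "message" => "(?<timestamp>%{TIMESTAMP_ISO8601} %{ISO8601_TIMEZONE}) %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
  date {
    locale => "en"
    timezone => "GMT"                                  # used when no offset is present in the value
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss Z" ]  # mirrors the format captured above
  }
}
The key point is that the field name passed to the date filter ("timestamp" here) must be a field that actually exists on the event, and the date pattern must mirror that field's format exactly.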
Description
The date filter overrides Logstash's @timestamp field with the timestamp parsed from your logs. Now you can easily adjust the timezone in Kibana, and it will show your logs at the correct time.
(Note: Kibana adjusts UTC time to your browser's timezone.)
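To see the effect, here is a hypothetical before/after of the indexed event (the values are made up for illustration):
# Without the date filter: @timestamp is the time Logstash processed the line
"@timestamp" => "2017-03-25T09:12:44.123Z"
# With the date filter: @timestamp is parsed from the log line itself
"@timestamp" => "2017-03-21T14:05:32.000Z"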