Monday, 30 December 2013

Logstash for weblogic - Part III - Using GROK patterns

This post explains the GROK filter, which gives more flexibility in parsing and analyzing logs. WebLogic SOA logs contain information such as severity, host details, composite details and timestamps, and this information becomes even more useful when we use Logstash as a centralized logging solution across multiple environments.

Using grok, we can parse unstructured log data into structured, queryable data. Logstash ships with 120 patterns, which are available at this link: https://github.com/logstash/logstash/tree/v1.3.2/patterns

Documentation for the grok filter is available at this link - http://logstash.net/docs/1.3.2/filters/grok

The picture below explains the structure of the admin log in WebLogic and how we can identify a pattern to parse it using grok.


This is the format of the admin logs, and you can build your own pattern using the list of 120 Logstash patterns at https://github.com/logstash/logstash/tree/v1.3.2/patterns

"####<%{DATA:wls_timestamp}> <%{WORD:severity}> <%{DATA:wls_topic}> <%{HOST:hostname}> <(%{WORD:server})?> %{GREEDYDATA:logmessage}"

The uppercase names (DATA, WORD, HOST, GREEDYDATA) are predefined grok patterns, and the lowercase names after each colon (wls_timestamp, severity, wls_topic, hostname, server, logmessage) are the field names under which the matched values are indexed when a log entry matches the pattern.
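Grok patterns compile down to named regular expressions, so the mapping from pattern to fields can be sketched in plain Python. The regex below is a simplified emulation of the pattern above; the sample log line and the simplified sub-expressions standing in for DATA, WORD, HOST and GREEDYDATA are illustrative assumptions, not the exact grok definitions:

```python
import re

# Simplified stand-ins: DATA -> .*? (lazy), WORD -> \w+,
# HOST -> hostname characters, GREEDYDATA -> .* (greedy)
pattern = re.compile(
    r"####<(?P<wls_timestamp>.*?)> "
    r"<(?P<severity>\w+)> "
    r"<(?P<wls_topic>.*?)> "
    r"<(?P<hostname>[\w.\-]+)> "
    r"<(?P<server>\w*)> "
    r"(?P<logmessage>.*)"
)

# Hypothetical admin-log line in the #### format shown above
sample = ("####<Dec 30, 2013 10:15:42 AM IST> <Notice> <WebLogicServer> "
          "<soahost1> <AdminServer> Server started in RUNNING mode")

match = pattern.match(sample)
if match:
    for field, value in match.groupdict().items():
        print(f"{field} = {value}")  # e.g. severity = Notice
```

Each named group plays the role of a grok field, which is exactly how the values end up as searchable fields in Elasticsearch.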

Please use the config below to apply the grok filter and index the fields; you can adapt it for different logs. You also need the multiline filter to merge the continuation lines of Java exceptions into a single event. Using the multiline filter is discussed in this post.

input {
  stdin {
    type => "stdin-type"
  }
  file {
    type => "ADMdomainlog"
    path => [ "D:/Logstash/Log/soa_domain.log" ]
  }
}

filter {
  multiline {
    type => "ADMdomainlog"
    pattern => "^####"
    negate => true
    what => "previous"
  }
  grok {
    type => "ADMdomainlog"
    pattern => ["####<%{DATA:wls_timestamp}> <%{WORD:severity}> <%{DATA:wls_topic}> <%{HOST:hostname}> <(%{WORD:server})?> %{GREEDYDATA:logmessage}"]
    add_field => ["Log", "Admin Domain Log"]
  }
}

output {
  elasticsearch { embedded => true }
}
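The effect of the multiline filter above (pattern "^####" with negate => true and what => "previous") can be sketched in a few lines of Python: any line that does not start with "####" is folded into the preceding event. This is only an illustration of the filter's logic, with made-up sample lines, not Logstash code:

```python
def merge_multiline(lines, marker="####"):
    """Fold continuation lines (e.g. Java stack traces) into the
    previous event, mimicking negate => true / what => "previous"."""
    events = []
    for line in lines:
        if line.startswith(marker) or not events:
            events.append(line)          # a new event begins
        else:
            events[-1] += "\n" + line    # continuation of the last event
    return events

# Hypothetical input: one error with a stack trace, then a second event
raw = [
    "####<...> <Error> <...> something failed",
    "java.lang.NullPointerException",
    "\tat com.example.Foo.bar(Foo.java:42)",
    "####<...> <Notice> <...> next event",
]
print(len(merge_multiline(raw)))  # 2
```

Without this step, every line of a stack trace would be indexed as a separate event and the grok pattern would fail to match most of them.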

Run Logstash with this configuration and open Kibana to view the logs. You will see new fields created as per your grok pattern, and you can filter on those fields as shown in the screenshots below.
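For reference, Logstash 1.3.x ships as a single flat jar, so a typical invocation (the jar file name is assumed to match your download) that runs the agent with the config above and the embedded Kibana web UI in one process looks like:

```shell
# Run the agent with our config and start the embedded web UI (Kibana),
# which is then reachable at http://localhost:9292
java -jar logstash-1.3.2-flatjar.jar agent -f logstash.conf -- web
```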





Grok adds more flexibility in analyzing logs, and we can use it even more effectively when we define our own dashboards in Kibana, which will be discussed in further posts...
