For those who wonder what Logstash is: it is an open-source tool for managing events and logs. It can be used to collect logs, parse them, and store them for later use (for example, for searching). Speaking of searching, Logstash comes with a web interface for searching and drilling into all of your logs.
All the information required to deploy Logstash is available on the official Logstash website (http://logstash.net/), but this post shares my experience and learnings while we tried to use it in our project, especially for analyzing the WebLogic and SOA logs.
Step 0:
- Download https://download.elasticsearch.org/logstash/logstash/logstash-1.2.2-flatjar.jar
- It works well on Windows as well, so no worries for a POC
- This one seems to be very important; I spent hours before it clicked. Logstash only reads live or new log entries, not the existing content already sitting in your log folders. There is a way to read those existing logs, which will be shared at the end of this post.
- Logstash is shipped along with Elasticsearch as the storage engine and Kibana to display the parsed logs
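One quick check after Step 0: the flat jar needs only a Java runtime, so before going further confirm that Java is available on your machine:

java -version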
Another important thing in Logstash is the configuration file:
- The configuration file is a simple .conf file that is used to read and parse the logs
- It primarily deals with inputs, outputs and filters
- Inputs tell Logstash what to read, which can be a file path to the logs
- Outputs deal with how and where the parsed logs need to be stored or viewed
To configure Logstash on a standalone server, create a configuration file along the lines of the sketch below.
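Here is a minimal configuration sketch, not the exact file we used; the log path and the type name are assumptions for illustration, so point them at wherever your soa_domain.log lives:

input {
  file {
    # assumed location of the WebLogic/SOA domain log - change to your path
    path => "/u01/dev/logs/soa_domain.log"
    type => "weblogic"
  }
}
output {
  # store events in the embedded Elasticsearch and also print them to the console
  elasticsearch { embedded => true }
  stdout { codec => rubydebug }
}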
Save it as logstash_sample.conf. Now we need to run Logstash using the syntax below, which is very simple. Before that, if you are trying this on Windows, set your Java home in the environment variables; if it is on Linux, you need to run it with the Java path. Both examples are provided below.
java -jar logstash-1.2.2-flatjar.jar agent -f logstash_sample.conf --- For Windows
/u01/dev/fmwadm/Middleware/jrockit_160_29_D1.2.0-10/jre/bin/java -jar logstash-1.2.2-flatjar.jar agent -f logstash_sample.conf --- For Linux
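On Windows, setting the Java home from the same command prompt before running the jar looks roughly like this (the JRE path is only a placeholder):

set JAVA_HOME=C:\Program Files\Java\jre6
set PATH=%JAVA_HOME%\bin;%PATH%
java -jar logstash-1.2.2-flatjar.jar agent -f logstash_sample.conf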
Make sure that you have placed an empty soa_domain.log file in the path mentioned in the configuration. After executing this command, paste your log entries into the soa_domain.log file so that Logstash parses them as new entries in the log file :).
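For example, on Linux you could append a made-up test entry; the path and the log line below are only samples:

echo '####<Nov 20, 2013 10:15:01 AM IST> <Notice> <WebLogicServer> <soa_server1> <Test entry>' >> /u01/dev/logs/soa_domain.log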
These logs will be captured as events and stored in Elasticsearch, and Kibana is used to view them in a UI. To achieve this, stop the above command with Ctrl+C and execute the command below, which will start Logstash along with the Kibana web service.
java -jar logstash-1.2.2-flatjar.jar agent -f logstash_sample.conf -- web
Kibana Web by default uses port 9292, so make sure nothing else is running on that port, then open http://localhost:9292/, which will kick-start Kibana for you. There are a lot of options in Kibana which I will share in a separate post. For now, use the default dashboard and view your logs, which are parsed with default fields like _id, _index, type, @timestamp and @version. Now you can search the logs as you wish...
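For example, a Lucene-style query typed into the Kibana search box narrows the events down; the field and value below are only illustrative:

type:"weblogic" AND message:"Error"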
The same steps work on Linux or Unix as well. Here we do face a problem when your log entries have blank lines or spaces in between them, and that is where we need to use a filter, which I will share in the next post.
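As a rough preview (this is only a sketch of one possible approach, not necessarily what the next post will use): WebLogic server log entries start with ####, so the multiline filter can glue continuation lines back onto the entry they belong to:

filter {
  multiline {
    # any line that does not start with #### belongs to the previous entry
    pattern => "^####"
    negate => true
    what => "previous"
  }
}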
The purpose of this post is to give a basic introduction to Logstash and to show how it can be used for analyzing WebLogic logs... Feel free to contact me if you face any difficulties.