Thursday 19 June 2014

Logstash for Weblogic - Part IV - Working across clustered environments

This is where Logstash acts like a real champ!

Logstash can be installed on any server, where it acts as a shipper; all the logs are accumulated in a single place and can be used for real-time monitoring.
For this, additional components like Redis and Elasticsearch are used.

Logstash acts as a log collector, processor, shipper and indexer.
It needs to be installed and running on every server whose logs it should read as events; based on the conf file configuration, Logstash acts there as a log collector and shipper.
On the main server, it acts as the indexer and processor.

Refer to the previous posts for more details: Post 1, Post 2, Post 3.

Logstash ships with Kibana, a dashboard for data visualization.

Redis acts as a broker on the main server: it receives the captured events from the other servers (where Logstash is acting as a shipper).

Elasticsearch acts as persistent storage and a search medium for all captured events.


The flow chart below is self-explanatory: shippers send events to the Redis broker, the indexer reads them from Redis and stores them in Elasticsearch, and Kibana visualizes them.


Now I shall explain the steps to configure it across the servers.

For easy understanding, assume we have SOA Server1 and SOA Server2, where SOA Server1 acts as the main server: logs are captured as events on SOA Server2 and shipped to SOA Server1. The same setup scales to as many servers as you need.

Redis and Elasticsearch need to be downloaded and running before any shipper starts shipping events. As per our use case, we install them on SOA Server1.

Redis: 
  •       Download Redis and extract the source.
  •       Start the Redis server: src/redis-server --loglevel verbose 
You will see output like the picture below.


Elastic Search:

  • Download the latest version of Elasticsearch.
  • Execute the command: bin/elasticsearch -f

Logstash as Shipper:

Now run Logstash as a shipper on SOA Server2.
In the config, make sure the output is directed to the Redis instance running on Server1.

Like this,
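For illustration, a minimal shipper config could look like the sketch below (the log path, type name and Redis host are example placeholders; adjust them for your environment):

```
# shipper_server2.conf -- runs on SOA Server2
input {
  file {
    # Example path only -- point this at your SOA/WebLogic server logs
    path => "/u01/oracle/domains/soa_domain/servers/soa_server2/logs/*.log"
    type => "soa-logs"
  }
}

output {
  redis {
    # Hostname of SOA Server1, where the Redis broker runs (placeholder)
    host => "soaserver1"
    data_type => "list"
    key => "logstash"
  }
}
```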



Save this as, say, shipper_server2.conf and run Logstash using the command below.

java -jar logstash-1.2.0-flatjar.jar agent -f shipper_server2.conf

Now check your Redis output; the connected clients value will be 2, which indicates that the shipper has successfully connected to the broker.
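One way to verify this (assuming you kept the Redis source build from the earlier step) is to query the running server with redis-cli:

```
# Run from the extracted Redis directory on SOA Server1
src/redis-cli INFO clients
# The "Clients" section should show a line such as: connected_clients:2
```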

Once the indexer described next is running, all the captured events will be stored in Elasticsearch.

Logstash as Indexer:

To run Logstash as an indexer, make sure the input is read from Redis.
The same list details used in the shipper's output will be used here in the input.

The conf file will look like the one below.
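For example, a minimal indexer config might be sketched as follows (the hosts shown are placeholders; the data_type and key must match the shipper's redis output):

```
# indexer.conf -- runs on SOA Server1
input {
  redis {
    host => "127.0.0.1"      # Redis broker runs locally on SOA Server1
    data_type => "list"
    key => "logstash"        # must match the key used by the shipper
  }
}

output {
  elasticsearch {
    host => "127.0.0.1"      # local Elasticsearch instance
  }
}
```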


  
Save this as indexer.conf and run it using the command below,

java -jar logstash-1.2.0-flatjar.jar agent -f indexer.conf

Kibana:

Use a separate terminal to run Kibana using the command below,

java -jar logstash-1.2.0-flatjar.jar web

Open Kibana using this URL: http://localhost:9292

Logstash is so modular that each of the above components can be installed on a dedicated server to distribute the load; just specify the hosts accordingly.
