Saturday, February 18, 2017

Sitecore - Diagnosing Performance Issues: A Complete Guide to Logs and Visualization with ELK (Elasticsearch, Logstash, and Kibana)

That's a heavy headline, isn't it?
Ok, let me ask one simple question - I am assuming most senior Sitecore folks have faced these challenges.
Scenario #1 - One day (or one week) you discover that Sitecore editorial performance has gone down: it is very slow even to open a tree node of 10 items, or to process a single click. In the worst case, the application is completely choked.
[Image: Kibana time series]
This may not happen at any particular time, so what should be the first step to diagnose the root cause (RCA)?
Of course, this may not be caused by Sitecore itself - it can happen for any number of reasons - but the blame usually arrives as: "Sitecore is not working, it's very slow, very uncertain behavior."
Here are the details of how to find the RCA and fix these issues, so that next time no one will complain and everyone will say "Happy Sitecorios!!"
Example - Let's say an admin user published the whole site by mistake. Of course, you will start getting calls about performance and slowness. You will look into the event queue, publish queue, connections, error logs, common logs, network latency, publish activities, item history, and so on.
Good so far. You may find an issue such as a high volume of data in the event queue, someone having published the whole site, an error - there could be any number of reasons.
Problem statement - Let's say you found these issues during off-hours (2 AM), and the application is set up to clean the entire event queue every 4 hours and the publishing queue every day (with a keep-alive of, say, 15 days).
All your previous records have already been cleaned out by these services.
Here is the lifesaver that will work for you - set up ELK and watch the pattern of historical data, mostly in these tables:
  1. EventQueue Table
  2. Publish Queue Table
  3. Connection details - active, sleeping, etc.
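The detailed setup steps come later in this post; to make the goal concrete first, here is a minimal sketch of a Logstash pipeline that polls the EventQueue table into Elasticsearch on a schedule, so its contents survive the cleanup jobs. The server name, database name, credentials, and driver path are placeholders, and the column names are from a typical Sitecore EventQueue table - verify them against your own schema.

```conf
input {
  jdbc {
    # Path to the Microsoft SQL Server JDBC driver jar (placeholder)
    jdbc_driver_library => "C:/drivers/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=Sitecore_Core"
    jdbc_user => "logstash_reader"
    jdbc_password => "changeme"
    # Poll every 5 minutes, fetching only rows newer than the last run
    schedule => "*/5 * * * *"
    statement => "SELECT [EventType], [InstanceName], [UserName], [Created] FROM [EventQueue] WHERE [Created] > :sql_last_value"
    use_column_value => true
    tracking_column => "created"
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sitecore-eventqueue"
  }
}
```

A similar pipeline per table (publish queue, connection stats) gives you a retained copy of each in Elasticsearch that Kibana can chart over weeks, long after the SQL rows have been purged.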
Results:-
You may find that some job runs daily at 2 AM and publishes all content.
You may find that the DevOps team sometimes applies a patch and publishes the site without notifying anyone.
You may find that editorial users with high access, when publishing a single content item, always choose a full publish with all related items.
You may find some different reason entirely, but this ELK stack will help a lot to diagnose the issue and find the RCA.
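Once the table data is flowing into Elasticsearch, spotting a recurring spike like that 2 AM job is just an hour-of-day aggregation, which Kibana does visually. As a plain illustration of the idea, here is a small Python sketch (the timestamps are made-up sample data) that buckets event-creation times by hour:

```python
from collections import Counter
from datetime import datetime

# Hypothetical sample: "Created" timestamps exported from the EventQueue table.
timestamps = [
    "2017-02-18 02:01:13", "2017-02-18 02:02:47", "2017-02-18 02:05:09",
    "2017-02-18 09:14:22", "2017-02-19 02:03:51", "2017-02-19 02:08:30",
]

# Bucket events by hour of day to expose a recurring spike (e.g. a 2 AM job).
hourly = Counter(datetime.strptime(t, "%Y-%m-%d %H:%M:%S").hour for t in timestamps)

for hour, count in sorted(hourly.items()):
    print(f"{hour:02d}:00  {'#' * count}  ({count} events)")
```

In Kibana itself you would build the equivalent with a date histogram visualization on the Created field.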
Now, how do we implement this ELK stack?
Start all the configuration and steps from here.
  1. Setup Kibana - Steps
Download Kibana from https://www.elastic.co/products/kibana
Extract the folder, open a command prompt, and go to the root folder. (Make sure the Elasticsearch service is running at the same time.)
kibana-5.5.0-windows-x86\kibana-5.5.0-windows-x86\bin\kibana
Ref:- https://www.elastic.co/guide/en/kibana/current/setup.html
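Before starting Kibana, it is worth checking config/kibana.yml. For Kibana 5.x, a minimal local setup only needs Elasticsearch's address; the values below are the defaults, so adjust them if your ports differ:

```yaml
# config/kibana.yml - minimal local setup (defaults shown)
server.port: 5601
elasticsearch.url: "http://localhost:9200"
```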
  2. Setup Elasticsearch - Steps
Download the latest version from the URL below:
https://www.elastic.co/products/elasticsearch
There are a few options available - a zip file or an MSI installer. We will go with the MSI installer. Start the installation.
If you are using a Windows machine and have not set up Java, install the Java SDK first and set the environment variables; the two steps below cover this.
  1. Download the Java SDK and install it.
  2. Open Advanced System Settings (type it into the Windows search box to find it), go to Environment Variables, select Advanced, and add JAVA_HOME. Check your current Java version (java -version) to ensure it has been installed correctly.
Ref: https://www.mkyong.com/java/how-to-set-java_home-on-windows-10/
Set up the path - you can change it based on your current setup.
Install Elasticsearch as a service - it will keep running in the background so you do not have to open and start it manually.
Set up the cluster details here.

Select the required plugins - in this case, please check X-Pack. If it gets missed, you can install it later through the command prompt.

Installation in progress... you are on track.

You can also run it through the command prompt; make sure all required permissions have been granted.

Run Kibana from the command prompt and wait some time for the setup to complete.
You may see "No living connections" errors until Elasticsearch is reachable.
  3. Setup Logstash - Steps
Download it from https://www.elastic.co/products/logstash
Run the command below at the root level:
bin\logstash-plugin update logstash-input-beats
A Beat (such as Filebeat) sends events to Logstash. Logstash receives these events through the Beats input plugin and then sends them to Elasticsearch through the Elasticsearch output plugin. The Elasticsearch output plugin uses the bulk API, making indexing very efficient.
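For file-based Sitecore logs, that Beats-to-Elasticsearch flow can be expressed as a short pipeline config; the port and index name below are common defaults, not requirements:

```conf
input {
  beats {
    port => 5044              # Filebeat ships log lines to this port
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sitecore-logs-%{+YYYY.MM.dd}"   # one index per day
  }
}
```

On the Sitecore server, Filebeat would typically be pointed at the application's log folder (usually under the Data folder, e.g. App_Data\logs).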
Install the JDBC driver for the connection. You can download it from here.
[Image: loading the first screen of Kibana]
Important note:-
Elasticsearch rivers are deprecated, and it is recommended to use the JDBC input instead.
I found one issue on a Windows machine during installation.
After a lot of research, I found this is a known bug: #6369
You can pull the data through SQL Server as well; here is a good link which discusses this in detail:
http://blog.remcam.net/index.php/2017/06/07/sqlserver-to-elk/
This is a high-level view; I am sure you will hit plenty of challenges while doing this setup. Feel free to message me if you need any input on it.
Many Thanks!
Jitendra