Saturday 14 February 2015

Setup ELK on Linux (Elasticsearch 1.4.2 / Logstash 1.4.2 / Kibana 3.1.2)

Below are instructions to set up the ELK stack in 8 simple steps.

1. Install JDK and httpd
2. Download and extract the necessary components
3. Configure, start, and verify the httpd and elasticsearch servers
4. Set up Kibana on the httpd path
5. Test Kibana and get it working with a few changes to elasticsearch
6. Add a logstash configuration
7. Run logstash to push data to Elasticsearch
8. Advanced logstash configurations to parse access_log




Install JDK and Httpd

Make sure the appropriate yum repos are configured.

yum install java-1.7.0-openjdk
yum install httpd

Disable the firewall:
service iptables stop
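
Before moving on, it's worth confirming that the JDK is on the PATH; a quick check (the exact version string depends on the OpenJDK build yum installed):

java -version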


Downloads:

Download the following archives and copy them to the /root folder of the Linux machine:

elasticsearch-1.4.2.zip
kibana-3.1.2.tar.gz
logstash-1.4.2.tar.gz
elasticsearch-head-master.zip (the mobz/elasticsearch-head plugin, packaged as a zip)
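
If the archives were downloaded on a local workstation, scp is one simple way to get them onto the server (the <ip/hostname> below is a placeholder for your machine):

scp elasticsearch-1.4.2.zip kibana-3.1.2.tar.gz logstash-1.4.2.tar.gz elasticsearch-head-master.zip root@<ip/hostname>:/root/

Extract each component: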

ElasticSearch: unzip elasticsearch-1.4.2.zip
Kibana: tar -zxvf kibana-3.1.2.tar.gz
Logstash: tar -zxvf logstash-1.4.2.tar.gz
Head Plugin: elasticsearch-1.4.2/bin/plugin --url file:///root/elasticsearch-head-master.zip --install mobz/elasticsearch-head

Configure Elasticsearch

vi /root/elasticsearch-1.4.2/config/elasticsearch.yml
Uncomment cluster.name and give the cluster a unique name; don't use the default.

################################### Cluster ###################################

# Cluster name identifies your cluster for auto-discovery. If you're running
# multiple clusters on the same network, make sure you're using unique names.
#

cluster.name: vidhya-elk



Start Servers

service httpd restart


Verify the Server Installation

Httpd: http://<IP/hostname>
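
From the server itself, the same check can be done with curl; a working Apache install answers with an HTTP status line and headers (the exact response depends on your configuration):

curl -I http://localhost/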




Start Elasticsearch
/root/elasticsearch-1.4.2/bin/elasticsearch
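
The command above runs Elasticsearch in the foreground, which is handy for watching the startup logs. To run it as a background daemon instead, pass the -d flag:

/root/elasticsearch-1.4.2/bin/elasticsearch -d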



Verify Elasticsearch :   http://<ip/hostname>:9200/

Verify Elasticsearch head :  http://<ip/hostname>:9200/_plugin/head
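
The root endpoint can also be checked with curl; the JSON response should report version 1.4.2 and end with the "You Know, for Search" tagline:

curl 'http://localhost:9200/?pretty'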



Kibana Setup


mkdir /var/www/kibana3
cp -r /root/kibana-3.1.2/*   /var/www/kibana3/

vi /etc/httpd/conf/httpd.conf

# Serve the Kibana files at http://<ip/hostname>/kibana
Alias /kibana /var/www/kibana3
<Directory /var/www/kibana3>
  AllowOverride All
  Require all granted
</Directory>
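
Apache only picks up the new alias after a restart, so restart it once the stanza has been added:

service httpd restart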

Verify Kibana:

http://<ip/hostname>/kibana

Kibana loads, but it reports that it cannot connect to Elasticsearch. The Kibana page runs in the browser and talks to Elasticsearch directly on port 9200, and Elasticsearch 1.4 rejects such cross-origin (CORS) requests by default. To fix this error, add the line below at the end of elasticsearch.yml.

vi /root/elasticsearch-1.4.2/config/elasticsearch.yml

http.cors.enabled: true

Restart elasticsearch
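
Since Elasticsearch was started straight from the shell, restarting simply means stopping the running process (Ctrl+C in its terminal, or kill the PID if it was daemonized with -d) and launching it again:

/root/elasticsearch-1.4.2/bin/elasticsearch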





Logstash Setup

Create a configuration file:
vi  /root/logstash-1.4.2/conf/es.conf
input { stdin { } }
output {
        stdout { }
        elasticsearch {
                host => "127.0.0.1"
                protocol => "http"
        }
}

The above configuration reads anything typed on standard input and publishes it to elasticsearch, as well as printing it on the command line.

Verify Logstash

/root/logstash-1.4.2/bin/logstash agent -f /root/logstash-1.4.2/conf/es.conf --configtest
-- This verifies the configuration file
./logstash-1.4.2/bin/logstash agent -f logstash-1.4.2/conf/es.conf
-- This pushes whatever is typed on the command line to elasticsearch; you can watch the index getting created using the elasticsearch head plugin.

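The same index can also be confirmed from the command line with the _cat API (available in Elasticsearch 1.x); a daily logstash-YYYY.MM.DD index should appear once the first event has been sent:

curl 'http://localhost:9200/_cat/indices?v'
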
Advanced Logstash configuration


1. Parse the access_log and publish to elasticsearch for log analysis 

vi  /root/logstash-1.4.2/conf/access_log.conf
input {
  file {
    path => "/var/log/httpd/access_log"
    start_position => "beginning"    # read the file from the start, not just new lines
    sincedb_path => "/dev/null"      # don't persist the read position, so the file is re-read on every run
    type => "apache-access"
  }
}

output {
        stdout { }
        elasticsearch {
              host => "127.0.0.1"
              protocol => "http"
        }
}
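
Run this configuration the same way as before; logstash stays in the foreground, so generate a few requests against the web server from a second terminal to produce access_log lines for it to pick up:

/root/logstash-1.4.2/bin/logstash agent -f /root/logstash-1.4.2/conf/access_log.conf
curl -s http://localhost/ > /dev/null     # run from another terminal; adds a line to access_log
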
2. Parse the access_log and publish to elasticsearch for log analysis, with custom grok filters

vi  /root/logstash-1.4.2/conf/access_grok_log.conf

input {
  file {
    path => "/root/log/access_log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "apache-access"
  }
}
filter {
  # Drop lines whose client address starts with "::" (e.g. the IPv6 loopback ::1)
  if [message] =~ "^::" {
      drop {}
  }
  # Parse the Apache combined log format into individual fields
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}" ]
  }
  # Use the request timestamp from the log line as the event's @timestamp
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
        stdout { }
        elasticsearch {
                host => "127.0.0.1"
                protocol => "http"
        }
}
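
Once events start flowing, the fields produced by COMBINEDAPACHELOG (clientip, verb, request, response, bytes, and so on) can be spot-checked with a quick search; the query below simply pulls back one parsed document:

curl 'http://localhost:9200/logstash-*/_search?q=type:apache-access&size=1&pretty'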







