ELK: Getting Started

Liu Jing

Steps

  • Install Logstash
  • Install Elasticsearch
  • Install Kibana
  • More fun with Logstash

Logstash

Prerequisite: Java

 

Download Logstash

$ java  -version
java version "1.7.0_21"
Java(TM) SE Runtime Environment (build 1.7.0_21-b12)
Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
curl -O https://download.elasticsearch.org/logstash/logstash/logstash-1.5.0.rc2.tar.gz

tar -zxvf logstash-1.5.0.rc2.tar.gz


Up and running

 

Fancier example

cd logstash-1.5.0.rc2

bin/logstash -e 'input { stdin { } } output { stdout {} }'
bin/logstash -e 'input { stdin { } } output { stdout { codec => rubydebug } }'
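With the rubydebug codec, each line typed at stdin is echoed back as a structured event, roughly like this (the timestamp and host values below are illustrative):

```
{
       "message" => "hello world",
      "@version" => "1",
    "@timestamp" => "2015-03-01T12:00:00.000Z",
          "host" => "example.local"
}
```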

Elasticsearch

Download

Install plugins

curl -O https://download.elasticsearch.org/elasticsearch/elasticsearch-1.4.4.tar.gz

tar -zxvf elasticsearch-1.4.4.tar.gz
cd elasticsearch-1.4.4/
bin/plugin --install mobz/elasticsearch-head

Running


bin/elasticsearch    
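Once it starts, you can check from another terminal that the node is up (a sketch; assumes the default HTTP port 9200) — it should answer with a small JSON blob containing the cluster name and version:

```
curl 'http://localhost:9200/?pretty'
```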

Integrate with Elasticsearch

Store logs using Elasticsearch

See how it looks


bin/logstash -e 'input { stdin { } } output { elasticsearch { host => localhost  protocol => http} }'

curl 'http://localhost:9200/_search?pretty'           
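By default the elasticsearch output writes events into date-stamped indices named `logstash-YYYY.MM.dd`, so the search can also be restricted to just those (sketch, using the default index pattern):

```
curl 'http://localhost:9200/logstash-*/_search?pretty'
```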


Multiple outputs

Multiple inputs


bin/logstash -e 'input { stdin { } } 
output { elasticsearch { host => localhost } stdout { } }'

bin/logstash -e 'input { stdin { } 
 file { path => "absolute_path_to_your_file" start_position => beginning} } 
 output { stdout { } }'

Kibana

Download

Up and running

curl -O https://download.elasticsearch.org/kibana/
kibana/kibana-4.0.1-darwin-x64.tar.gz

tar -zxvf kibana-4.0.1-darwin-x64.tar.gz

  • Open config/kibana.yml and set the elasticsearch_url to point at your Elasticsearch instance
  • Run ./bin/kibana
  • Point your browser at http://yourhost.com:5601
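The kibana.yml edit can also be scripted. The sketch below rewrites the elasticsearch_url key in place; the stand-in config file and the es.example.com host are hypothetical — in a real install you would edit the file shipped in the Kibana tarball:

```shell
# Create a minimal stand-in kibana.yml just to demonstrate the edit
# (in a real install this file ships with the Kibana tarball).
mkdir -p config
cat > config/kibana.yml <<'EOF'
port: 5601
host: "0.0.0.0"
elasticsearch_url: "http://localhost:9200"
EOF

# Rewrite elasticsearch_url to point at your Elasticsearch instance
# (es.example.com is a hypothetical host); .bak keeps the original.
sed -i.bak 's|^elasticsearch_url:.*|elasticsearch_url: "http://es.example.com:9200"|' config/kibana.yml

grep '^elasticsearch_url' config/kibana.yml
```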

More fun with Logstash

Persistent Configuration files

Save the following configuration as logstash-simple.conf, then run it:

input { stdin { } }
output {
  elasticsearch { host => localhost  protocol => http }
  stdout { codec => rubydebug }
}

bin/logstash -f logstash-simple.conf
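Before wiring this into an init script, the file can be sanity-checked first — Logstash 1.5 accepts a --configtest (-t) flag that parses the config and exits without starting the pipeline:

```
bin/logstash -f logstash-simple.conf --configtest
```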


Filters

input { stdin { } }

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { host => localhost  protocol => http }
  stdout { codec => rubydebug }
}
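To exercise the filter, run the config and paste a combined-format Apache line at stdin — for example this classic sample from the Apache docs (host, path, and user are illustrative):

```
127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"
```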

Useful example

input {
    s3 {
        'bucket' => "rea-screw-the-spreadsheet"
        'prefix' => "myLogs"
        'temporary_directory' => "bin"
        'region' => "us-east-1"
        'delete' => false
        'interval' => '30'
        'type' => 'cloudfront_logs'
    }
}
filter {
    csv {
      separator => "	"   # the separator is a literal tab: CloudFront logs are tab-delimited
      columns => [ "date", "time", "x-edge-location", "sc-bytes", "c-ip", "cs-method", "Host", "cs-uri-stem", "sc-status", "Referer", "User-Agent", "cs-uri-query", "Cookie", "x-edge-result-type", "x-edge-request-id", "x-host-header", "cs-protocol", "cs-bytes", "time-taken" ]
      add_field => [ "listener_timestamp", "%{date} %{time}" ]
    }
    date {
      match => [ "listener_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    }

    urldecode {
        field => "cs-uri-query"
    }
}
output {
  elasticsearch {
    host => localhost
    protocol => http
  }
  stdout { codec => rubydebug }
}