LIU JING
Ships logs from any source
Parses them
Normalizes the data into your destinations
Process Any Data, From Any Source
input {
  s3 { bucket => "your_bucket_which_store_logs" }
}
filter {
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { host => "localhost" protocol => "http" }
}
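Assuming a standard Logstash install layout and that the pipeline above is saved as logstash.conf (both paths are assumptions), it can be started with:

```shell
# Validate the config first, then run the pipeline in the foreground
bin/logstash -t -f logstash.conf
bin/logstash -f logstash.conf
```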
Elasticsearch
head
marvel
elasticsearch-cloud-aws
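A sketch of installing the plugins listed above with the Elasticsearch 1.x-era plugin tool (the exact flag syntax and the cloud-aws version number vary by ES release, so treat these as assumptions):

```shell
# Cluster overview UI
bin/plugin -install mobz/elasticsearch-head
# Monitoring
bin/plugin -install elasticsearch/marvel/latest
# S3/EC2 discovery and snapshot support (pin to your ES version)
bin/plugin -install elasticsearch/elasticsearch-cloud-aws/2.4.1
```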
Visualize data from ES server with search queries
MAD-walking-skeleton way, except for the bake-an-AMI step
Ansible is run through the AWS EC2 UserData scripts.
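A minimal UserData sketch of that approach, assuming an Amazon Linux AMI; the repo URL and playbook name are hypothetical placeholders:

```shell
#!/bin/bash
# EC2 UserData: install Ansible on first boot, then pull and apply
# the provisioning playbook instead of baking it into an AMI.
yum install -y ansible git
ansible-pull -U https://example.com/your/ansible-repo.git site.yml
```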
Needs to fetch data from 2 S3 buckets
Sends processed data to ES via the private ES ELB
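Under those two requirements, the earlier pipeline could be extended along these lines (bucket names and the ELB hostname are placeholders, not values from the deployment):

```
input {
  s3 { bucket => "first_log_bucket" }
  s3 { bucket => "second_log_bucket" }
}
output {
  elasticsearch { host => "internal-es-elb.example.com" protocol => "http" }
}
```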
Node-to-node communication (port 9300 by default)
HTTP traffic (port 9200 by default)
Public ELB for query and Kibana
Private ELB for logstash to send data in
Install the head and marvel plugins during boot
Need to enable logging