$ git clone https://github.com/hillar/vagrant_moloch_bro_suricata.git
$ cd vagrant_moloch_bro_suricata
$ vagrant up
$ open http://localhost:8005
$ open http://localhost:3003
$ open http://localhost:9200/_plugin/head
$ vagrant ssh suricata
see also http://slides.com/hillar/vagrant#/
National Institute of Standards and Technology
Guide to Integrating Forensic Techniques into Incident Response
http://csrc.nist.gov/publications/nistpubs/800-86/SP800-86.pdf
International Association of Computer Investigative Specialists
NETWORK FORENSIC ANALYSIS
http://www.iacis.com/SiteAssets/Documents/NFA%20Program%20Description%20and%20Syllabus%202014.pdf
The SANS Institute
FOR572: Advanced Network Forensics and Analysis
http://www.sans.org/course/advanced-network-forensics-analysis
NETWORK FORENSICS: BLACK HAT RELEASE
https://www.blackhat.com/us-14/training/network-forensics-black-hat-release.html
When an event of interest has been identified, analysts assess, extract, and analyze network traffic data with the goal of determining what has happened and how the organization's systems and networks have been affected. This process might be as simple as reviewing a few log entries on a single data source and determining that the event was a false alarm, or as complex as sequentially examining and analyzing dozens of sources (most of which might contain no relevant data), manually correlating data among several sources, then analyzing the collective data to determine the probable intent and significance of the event. However, even the relatively simple case of validating a few log entries can be surprisingly involved and time-consuming.
In addition to understanding the tools, analysts should also have reasonably comprehensive knowledge of:
If analysts understand the organization's normal computing baseline, such as typical patterns of usage on systems and networks across the enterprise, they should be able to perform their work more easily and quickly. Analysts should also have a firm understanding of each of the network traffic data sources, as well as access to supporting materials, such as intrusion detection signature documentation. Analysts should understand the characteristics and relative value of each data source so that they can locate the relevant data quickly.
Typically, this identification is made through one of two methods:
timelined data
..
..
ElasticSearch
Kibana
Moloch*
Suricata
BRO
* has VIEWER
https://github.com/hillar/vagrant_moloch_bro_suricata/blob/master/Vagrantfile
Moloch is an open source, large scale IPv4 packet capturing (PCAP), indexing and database system.
A simple web interface is provided for PCAP browsing, searching, and exporting.
APIs are exposed that allow PCAP data and JSON-formatted session data to be downloaded directly.
Simply put, Moloch is a search tool for PCAP repositories
The Moloch system is composed of three components:
Capture
A C application that sniffs the network interface, parses the traffic and creates the Session Profile Information (aka SPI-Data) and writes the packets to disk
Database
Elasticsearch is used for storing and searching through the SPI-Data generated by the capture component
Viewer
A web interface that allows for GUI and API access from remote hosts to browse/query SPI-Data and retrieve stored PCAP
ElasticSearch
Viewer
Capture
* has VIEWER
raw pcap files
Moloch parses various protocols to create SPI-Data:
This is not an all-inclusive list
Suricata is an open source, free, and high performance Network IDS, IPS, and Network Security Monitoring engine.
Suricata implements a complete signature language to match on known threats, policy violations and malicious behaviour. Suricata will also detect many anomalies in the traffic it inspects.
Suricata will automatically detect protocols such as HTTP on any port and apply the proper detection and logging logic.
Suricata can log HTTP requests, log and store TLS certificates, extract files from flows and store them to disk.
With 2.0 we introduced “Eve”, our all JSON event and alert output. This allows for easy integration with Logstash and similar tools.
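Each EVE record is one JSON object per line, so plain Unix tools can already slice the output before Logstash or fluentd ever get involved. The record below is hand-written for illustration (field names follow the EVE JSON docs; the values are invented):

```shell
# Hand-written sample EVE record -- field names follow the EVE JSON docs,
# values are invented for illustration, not captured output.
cat <<'EOF' > /tmp/eve-sample.json
{"timestamp":"2014-11-15T14:12:12.000000","event_type":"alert","src_ip":"192.168.1.2","dest_ip":"10.0.0.5","proto":"TCP","alert":{"signature":"EXAMPLE test rule","severity":2}}
EOF
# One JSON object per line means grep/awk work on it directly:
grep '"event_type":"alert"' /tmp/eve-sample.json
```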
# suricata --list-app-layer-protos
=========Supported App Layer Protocols=========
http
ftp
smtp
tls
ssh
imap
msn
smb
dcerpc
dns
Bro is a powerful network analysis framework that is much different from the typical IDS you may know.
"We emphasize in particular that Bro is not a classic signature-based intrusion detection system (IDS)."
https://www.bro.org/sphinx/intro/index.html
Bro provides a comprehensive platform for network traffic analysis, with a particular focus on semantic security monitoring at scale. While often compared to classic intrusion detection/prevention systems, Bro takes a quite different approach by providing users with a flexible framework that facilitates customized, in-depth monitoring far beyond the capabilities of traditional systems.
The most immediate benefit that a site gains from deploying Bro is an extensive set of log files that record a network’s activity in high-level terms. These logs include not only a comprehensive record of every connection seen on the wire, but also application-layer transcripts such as, e.g., all HTTP sessions with their requested URIs, key headers, MIME types, and server responses; DNS requests with replies; SSL certificates; key content of SMTP sessions; and much more. By default, Bro writes all this information into well-structured tab-separated log files suitable for post-processing with external software.
Users can however also choose from a set of alternative output formats and backends to interface directly with, e.g., JSON or external databases.
In addition to the logs, Bro comes with built-in functionality for a range of analysis and detection tasks, including extracting files from HTTP sessions, detecting malware by interfacing to external registries, reporting vulnerable versions of software seen on the network, identifying popular web applications, detecting SSH brute-forcing, validating SSL certificate chains, and much more.
Elasticsearch is a search server based on Lucene.
It provides a distributed, multitenant-capable full-text search engine with a RESTful web interface and schema-free JSON documents.
$ curl -XPOST 'http://localhost:9200/pcaps/dns/' -d '{
    "hostname" : "www.kimchy.org",
    "timestamp" : "2009-11-15T14:12:12",
    "ip" : "192.168.1.2"
}'
$ curl -XGET 'http://localhost:9200/pcaps/dns/_search' -d '{
    "query" : {
        "wildcard" : { "hostname" : "*.kimchy.*" }
    }
}'
{
"_shards":{
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits":{
"total" : 1,
"hits" : [
{
"_index" : "pcaps",
"_type" : "dns",
"_id" : "1",
"_source" : {
"hostname" : "www.kimchy.org",
"timestamp" : "2009-11-15T14:12:12",
"ip" : "192.168.1.2"
}
}
]
}
}
Elasticsearch works seamlessly with Kibana to let you see and interact with your data
Kibana 3 talks directly to Elasticsearch from the browser.
This means that your browser communicates directly with Elasticsearch, not via an intermediary. You may wish to configure a reverse proxy to restrict access to Elasticsearch.
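One way to do that, sketched below with nginx (the port, the basic-auth approach, and the htpasswd path are assumptions for illustration, not from this deck), is to front Elasticsearch with an authenticating proxy and point Kibana at the proxy instead of port 9200:

```nginx
server {
    listen 8080;
    location / {
        # hypothetical credentials file -- create it with htpasswd
        auth_basic           "Elasticsearch";
        auth_basic_user_file /etc/nginx/es.htpasswd;
        proxy_pass           http://localhost:9200;
    }
}
```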
This is beta software. It needs Java and Elasticsearch version 1.4.0 or later, and has its own backend server, written in Ruby using Sinatra and the Puma rack server.
The power of Elasticsearch’s nested aggregations on the click of a mouse.
Facets: Removal from master. #7337
https://github.com/elasticsearch/elasticsearch/pull/7337
done
This writer plugin is still in testing and is not yet recommended for production use!
https://www.bro.org/sphinx/frameworks/logging-elasticsearch.html
Aug 14, Removing ElasticSearch from configure script.
Lua scripting
https://gist.github.com/hillar/aeae0b6d12de4ccd8ced#file-suricata_flow2ela-lua
https://gist.github.com/hillar/409a18e1604c70bb3804#file-suricata_tagger-js
https://redmine.openinfosecfoundation.org/projects/suricata/wiki/EveJSONOutput
outputs:
- eve-log:
enabled: yes
type: file #file|syslog|unix_dgram|unix_stream
filename: eve.json
types:
- alert
- http:
extended: yes # enable this for extended logging information
- dns
- tls:
extended: yes # enable this for extended logging information
- files:
force-magic: no # force logging magic on all logged files
force-md5: no # force logging of md5 checksums
#- drop
- ssh
https://www.bro.org/sphinx-git/scripts/policy/tuning/json-logs.bro.html
$ echo "@load tuning/json-logs" >> /usr/local/bro/share/bro/site/local.bro
$ echo "redef LogAscii::json_timestamps = JSON::TS_ISO8601;" >> /usr/local/bro/share/bro/site/local.bro
http://docs.fluentd.org/recipe/json/elasticsearch
<source>
type tail
format json
path /var/log/suricata/eve.json #...or where you placed your log
tag suricata.events
</source>
<match **>
type elasticsearch
logstash_format true
host <hostname> #(optional; default="localhost")
port <port> #(optional; default=9200)
index_name suricata #(optional; default=fluentd)
type_name events #(optional; default=fluentd)
</match>
http://logstash.net/docs/1.4.2/inputs/file
http://logstash.net/docs/1.4.2/codecs/json
input {
file {
path => ["/var/log/suricata/eve.json"]
codec => json
type => "SuricataIDPS-logs"
}
}
filter {
if [type] == "SuricataIDPS-logs" {
date {
match => [ "timestamp", "ISO8601" ]
}
}
}
output {
elasticsearch {
host => localhost
}
}
https://gist.github.com/hillar/4b014ba3abcc07a8c5c9
$ while read line; do curl -XPOST 'http://localhost:9200/indice/type/' -d "$line"; done <some.json
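One POST per line gets slow for large files. Elasticsearch's _bulk API takes many documents per request, each preceded by an action line; the sketch below (keeping the same placeholder 'indice/type' names as above) builds the bulk body with awk:

```shell
# Sketch: convert a one-JSON-object-per-line file into _bulk format.
# 'indice/type' mirror the placeholder names used above.
printf '%s\n' '{"hostname":"www.kimchy.org"}' '{"hostname":"www.example.org"}' > some.json
# _bulk wants an action line before every document; {} uses the URL defaults:
awk '{print "{\"index\":{}}"; print}' some.json > bulk.json
# then, against a running cluster:
#   curl -XPOST 'http://localhost:9200/indice/type/_bulk' --data-binary @bulk.json
```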
~$ /usr/local/bin/suricata -h
Suricata 2.1dev (rev 9a5bf82)
USAGE: suricata [OPTIONS] [BPF FILTER]
-r <path> : run in pcap file/offline mode
~$ /usr/local/bro/bin/bro --help
bro version 2.3-230
usage: /usr/local/bro/bin/bro [options] [file ...]
-r|--readfile <readfile> | read from given tcpdump file
~$ /usr/local/moloch/bin/moloch-capture --help
Usage:
moloch-capture [OPTION...] - capture
Application Options:
-r, --pcapfile Offline pcap file
-R, --pcapdir Offline pcap directory, all *.pcap files will be processed
http://digitalcorpora.org/corp/nps/packets/2008-nitroba/scenario.txt
$ cd /tmp/
$ wget http://digitalcorpora.org/corp/nps/packets/2008-nitroba/nitroba.pcap
$ /usr/local/moloch/bin/moloch-capture -c /usr/local/moloch/etc/config.ini -r /tmp/nitroba.pcap --tag=nitroba
What was the user’s username and password?
$ cd /tmp
$ wget https://www.bro.org/static/workshop-11/traces/illauth.pcap
$ /usr/local/bro/bin/bro -r illauth.pcap /usr/local/bro/share/bro/site/local.bro
$ less http.log
Who sent the email?
https://www.bro.org/bro-workshop-2011/exercises/incident-response/index.html#part-3-emailleakage
$ cd /tmp
$ wget https://www.bro.org/static/workshop-11/traces/email.pcap
$ sudo suricata -r email.pcap -l log
$ less log/eve.json