It has been a great few months since the release of the ElastAlert Kibana plugin. Even better, we have received many contributions, most notably support for Kibana 5.0 through Kibana 5.6.
BitSensor has another gift for the community: Rule Templates.
Templates allow you to set up rules quickly. In this case we set up a Frequency rule that triggers when a tool is hammering your website, perhaps for data exfiltration. What if you could get a Slack alert the moment that happens, with the IP address and the name of the tool already in it?
Let’s set this up!
Pick either Kibana 4 or Kibana 5. For example, if you have Kibana 5.6.4 installed, use elastalert-5.6.4-latest.zip?job=build.
# Kibana 4
./bin/kibana plugin -i elastalert -u https://git.bitsensor.io/front-end/elastalert-kibana-plugin/builds/artifacts/master/raw/build/elastalert-latest.zip?job=build

# Kibana 5
./bin/kibana-plugin install https://git.bitsensor.io/front-end/elastalert-kibana-plugin/builds/artifacts/kibana5/raw/artifact/elastalert-<replace-with-the-kibana-version>-latest.zip?job=build
Clone the configuration, and start the Docker image running ElastAlert:
git clone https://github.com/bitsensor/elastalert.git; cd elastalert
docker run -d -p 3030:3030 \
    -v `pwd`/config/elastalert.yaml:/opt/elastalert/config.yaml \
    -v `pwd`/config/config.json:/opt/elastalert-server/config/config.json \
    -v `pwd`/rules:/opt/elastalert/rules \
    -v `pwd`/rule_templates:/opt/elastalert/rule_templates \
    --net="host" \
    --name elastalert bitsensor/elastalert:latest
So, let's run it all.
Done! Now go to http://localhost:5601/app/elastalert
Let's find tools on your website with the Volumetric Alert. You can play around with the number of events required to trigger, and with the timeframe. You will also want to change the query: either to ‘*’ to search all documents, or perhaps to filter on a per-application tag.
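To build intuition for how num_events and the timeframe interact, here is a small sketch of the counting logic a frequency rule performs: events are grouped per query key (IP plus user agent) and the alert fires once enough of them land inside the sliding window. The field names mirror the template below, but the code itself is an illustration, not ElastAlert internals.

```python
from collections import defaultdict, deque

# Tuning knobs mirroring the rule (num_events lowered for the example).
NUM_EVENTS = 5   # alert after this many events per key
TIMEFRAME = 20   # ...within this many seconds

def check_event(windows, key, timestamp):
    """Record one event for `key`; return True if the rule would fire."""
    window = windows[key]
    window.append(timestamp)
    # Drop events that fell out of the timeframe.
    while window and timestamp - window[0] > TIMEFRAME:
        window.popleft()
    return len(window) >= NUM_EVENTS

windows = defaultdict(deque)
# A scanner hammering the site: same IP and user agent, one event per second.
key = ("203.0.113.7", "sqlmap/1.1")
fired = [check_event(windows, key, t) for t in range(10)]
print(fired.index(True))  # → 4: the fifth event trips the threshold
```

Raising num_events or shrinking the timeframe makes the rule less sensitive; legitimate bursty traffic from a single user is the usual source of false positives.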
By default, a Slack alerter is added. To make it work for your channel, change the webhook.
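For context, this is roughly the kind of JSON body a Slack incoming webhook receives when the alert fires. The username and text fields are standard Slack webhook fields; the message text here is made up for illustration and the exact wording ElastAlert sends will differ.

```python
import json

def slack_payload(ip, user_agent, username="ElastAlert"):
    """Build an example Slack incoming-webhook body for a triggered alert."""
    return json.dumps({
        "username": username,  # matches slack_username_override in the rule
        "text": "Bad/Bot behavior: 100 events from %s (%s) in 20s" % (ip, user_agent),
    })

body = slack_payload("203.0.113.7", "sqlmap/1.1")
print(body)
```

ElastAlert POSTs a body like this to slack_webhook_url, so swapping in your own webhook URL is all the template needs.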
Give it a name:

# Alert when there are n events coming from the same ip, userAgent within m seconds.
# Rule name, must be unique
name: Bad/Bot behavior

# Type of alert.
type: frequency

Tweak the number of events and timeframe:

# Alert when this many documents matching the query occur within a timeframe
num_events: 100

# num_events must occur within this amount of time to trigger an alert
timeframe:
  seconds: 20

Set the correct index name and fields:

# Index to search, wildcard supported
index: bitsensor
timestamp_field: endpoint.localtime
query_key:
  - context.ip
  - context.http.userAgent

And use a sensible query, in this case matching all events:

# A list of Elasticsearch filters used to find events
# These filters are joined with AND and nested in a filtered query
# For more info: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl.html
filter:
  - query:
      query_string:
        query: "*"

And finally, set the Slack webhook:

# The alert is used when a match is found
alert:
  - slack
slack_webhook_url: "https://hooks.slack.com/services/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"
slack_username_override: "ElastAlert"
Now, press play in the top-right corner to see how many alerts would be triggered. If you feel confident you can handle that volume, press save and enjoy your bots chatting to you.
And don’t forget to commit your templates to the rule templates! We would love to see more template use cases, such as integration with firewalls through webhooks, and with ticketing systems!