Community update: Elastalert alerting in Kibana 5.6

It has been a great few months since the release of the ElastAlert Kibana plugin. Even better, we have received many contributions, most notably support for Kibana 5.0 up to Kibana 5.6.

BitSensor has another gift for the community: Rule Templates.

Kibana 5.6 Template

Templates allow you to set up rules quickly. In this case we set up a Frequency rule that triggers when a tool is hammering your website, perhaps for data exfiltration. What if you could get a Slack alert the moment that happens, with the IP address and the name of the tool already included?

Let’s set this up!

Pick either Kibana 4 or Kibana 5. For example, if you have Kibana 5.6.4 installed, use the Kibana 5 command:

# Kibana 4
./bin/kibana plugin -i elastalert -u <plugin-url>

# Kibana 5
./bin/kibana-plugin install <replace-with-the-kibana-version>

Clone the configuration, and start the Docker image running ElastAlert:

git clone https://github.com/bitsensor/elastalert.git; cd elastalert
docker run -d -p 3030:3030 \
    -v `pwd`/config/elastalert.yaml:/opt/elastalert/config.yaml \
    -v `pwd`/config/config.json:/opt/elastalert-server/config/config.json \
    -v `pwd`/rules:/opt/elastalert/rules \
    -v `pwd`/rule_templates:/opt/elastalert/rule_templates \
    --net="host" \
    --name elastalert bitsensor/elastalert:latest
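If you prefer Docker Compose, the same container can be described declaratively. This is a sketch assuming the repository layout cloned above; the service name and file name are our own choices, not part of the repository:

```yaml
# docker-compose.yml - a sketch equivalent to the `docker run` command above.
# Assumes you run it from the cloned `elastalert` directory.
version: '2'
services:
  elastalert:
    image: bitsensor/elastalert:latest
    container_name: elastalert
    network_mode: host   # with host networking, the 3030 port mapping is implicit
    volumes:
      - ./config/elastalert.yaml:/opt/elastalert/config.yaml
      - ./config/config.json:/opt/elastalert-server/config/config.json
      - ./rules:/opt/elastalert/rules
      - ./rule_templates:/opt/elastalert/rule_templates
```

Start it with `docker-compose up -d`; the ElastAlert server should then answer on port 3030 as before.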

So, let's run it all.


Done! Now go to http://localhost:5601/app/elastalert


Let's find tools on your website with the Volumetric Alert. You can play around with the number of events required to trigger an alert, and with the timeframe. You will also want to change the query: either to ‘*’ to search all documents, or, if you tag events by application, to a filter on that tag.

By default, a Slack alerter is added. In order to make that work for your channel, change the webhook.

Give it a name
    # Alert when there are n events coming from the same ip and userAgent within m seconds.
    # Rule name, must be unique
    name: Bad/Bot behavior
    # Type of rule
    type: frequency

Tweak the number of events and timeframe

    # Alert when this many documents matching the query occur within a timeframe
    num_events: 100
    # num_events must occur within this amount of time to trigger an alert
    timeframe:
      seconds: 20

Set the correct index name and fields

    # Index to search, wildcard supported
    index: bitsensor
    timestamp_field: endpoint.localtime

    # Keep counts separately for each combination of these fields
    query_key:
      - context.ip
      - context.http.userAgent

And use a sensible query, in this case all events

    # A list of Elasticsearch filters used to find events
    # These filters are joined with AND and nested in a filtered query
    # For more info:
    filter:
    - query:
        query_string:
          query: "*"

And finally set the Slack webhook

    # The alert is used when a match is found
    alert:
      - slack
    slack_webhook_url: ""
    slack_username_override: "ElastAlert"
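Putting the fragments together, a complete rule file could look like the sketch below. The nesting of `timeframe`, `query_key`, `filter`, and `alert` follows standard ElastAlert rule syntax; the index and field names are the examples used above, and the file name is our own suggestion:

```yaml
# rules/bad_bot_behavior.yaml - the frequency rule assembled from the snippets
# above. Drop it into the rules directory mounted into the container.
name: Bad/Bot behavior
type: frequency

# Index to search and where to find the event timestamp
index: bitsensor
timestamp_field: endpoint.localtime

# Trigger when 100 matching events occur within 20 seconds
num_events: 100
timeframe:
  seconds: 20

# Count events separately per ip + userAgent combination
query_key:
  - context.ip
  - context.http.userAgent

# Match all documents; narrow this down for production use
filter:
- query:
    query_string:
      query: "*"

# Send matches to Slack
alert:
  - slack
slack_webhook_url: ""   # paste your Slack incoming-webhook URL here
slack_username_override: "ElastAlert"
```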

Now, press play in the top-right corner to see how many alerts would be triggered. If you feel confident that you can handle that volume of alerts, press save and enjoy your bots chatting to you.

Slack Attack

If you don’t have an application sending logs to Elasticsearch yet, you might want to try out the BitSensor plugin for Java, PHP, Drupal, Node.js, Browser, Apache, Nginx or IIS.

And don’t forget to contribute your templates to the rule templates! We would love to see more template use cases, such as integration with firewalls through webhooks, and with ticketing systems!

Or… work on alerting in a weekly sprint at BitSensor. Feel free to contribute to the ElastAlert REST interface or the Kibana plugin.

Special thanks to CodingSpiderFox for helping out with the community, Baoban for starting Kibana 5 support, and Shahar Davidson for taking care of licensing.