Configuring custom events from lambdas

Dashbird can catch custom events from Lambda invocation logs.
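For example, a Lambda function can emit such an event simply by writing a structured line to stdout, which ends up in the function's CloudWatch invocation log. This is a minimal sketch; the "CUSTOM_EVENT" prefix and the payload shape are hypothetical, so use whatever your filter rules will match on:

```python
import json

def handler(event, context):
    # Anything printed from a Lambda handler is written to the function's
    # CloudWatch invocation log. The "CUSTOM_EVENT" prefix and payload
    # fields below are made up for illustration.
    payload = {"name": "order-created", "orderId": event.get("orderId")}
    print("CUSTOM_EVENT " + json.dumps(payload))
    return {"ok": True}
```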

Creating a filter

After you have loaded your logs into Dashbird by configuring a collector, you will need to create a Filter on the log sources to see relevant events.

Dashbird allows creating filters from scratch or from a predefined template.

Each filter specifies a set of rules to filter the logs by, as well as the log sources: the logs you want the filter applied to.

After the filter is created you will be able to add rules and log sources.

Adding a rule

Click the ‘Add rule’ button next to ‘Rules’.

Query language

The query language is a powerful way to parse your log lines and capture data for your events.

It is possible to pipe multiple parsers together and pick only the result you need.

// foo walks into a bar {"foo":"bar","abc":"xyz"}

| parse "bar *" as data
| json field=data abc
| fields abc

// abc: xyz
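The pipeline above can be sketched in Python to show what each stage does. This is a rough illustration of the semantics, not Dashbird's actual implementation:

```python
import json
import re

line = 'foo walks into a bar {"foo":"bar","abc":"xyz"}'

# | parse "bar *" as data  -- capture everything after the "bar " anchor
data = re.search(r"bar (.*)", line).group(1)

# | json field=data abc  -- parse the captured text as JSON and pick "abc"
abc = json.loads(data)["abc"]

# | fields abc  -- keep only the abc field in the result
print(abc)  # xyz
```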

You can match log lines using a complex query or just simple keywords. A match expression can only appear as the first part of a query, and it does not produce any capture groups.

keyword AND keyword OR keyword NOT keyword
"This is a phrase"
// Matches all lines containing the phrase "This is a phrase"

("billed duration" and end) or error
// Matches all lines containing both "billed duration" and "end", or containing "error"

test and not unit
// Matches all lines containing "test" but not "unit"

foo bar xyz
// Matches all lines containing all of these words
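As a rough illustration, the query ("billed duration" and end) or error behaves like this boolean check (a hypothetical Python restatement, not how Dashbird evaluates queries internally):

```python
def matches(line: str) -> bool:
    # A line matches when it contains both "billed duration" and "end",
    # or when it contains "error".
    return ("billed duration" in line and "end" in line) or "error" in line

print(matches("REPORT billed duration: 104 ms end"))  # True
print(matches("task error: timeout"))                 # True
print(matches("billed duration: 104 ms"))             # False
```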

Using start and stop anchors, you can capture data from log lines.


| parse [field=<field_name>] "<start_anchor>*<stop_anchor>" as <field>


parse "GET * HTTP/1.1 * * " as url, code, size

  parse "GET * " as url
| parse "HTTP/1.1 * " as code

Whenever simple anchor parsing isn't enough, you can use a more powerful regular expression. The expression must contain at least one named capture group.


| parse regex [field=<field_name>] "<start_expression>(?<field_name><field_expression>)<stop_expression>"


parse regex "GET (?<url>(.*?)) "
// parses url from the logline


| json [field=<field_name>] "<name_or_key>"[, "<name_or_key>", ...] [as <field> ...]


// parses logline as JSON and exports it as fields

  json log
| json field=log name, value

json accountId, accountName as id, name
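The last example can be sketched in Python to show the intended result; the key names and values are hypothetical, matching the example above:

```python
import json

line = '{"accountId": "123", "accountName": "prod-account"}'

# json accountId, accountName as id, name  -- parse the line as JSON
# and export the two keys under the aliases id and name
doc = json.loads(line)
result = {"id": doc["accountId"], "name": doc["accountName"]}
print(result)  # {'id': '123', 'name': 'prod-account'}
```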

It is possible to filter based on captured groups.

Available operators are: =, <>, <=, >=, <, >

For comparing strings, use quotes: field = "value"


| where <boolean expression>


| where foo = "bar"
| where status = 200
| where size < 30
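Conceptually, each where stage keeps only the events whose captured fields satisfy the comparison. A sketch with hypothetical captured values:

```python
# Captured results from earlier parse stages (hypothetical values)
events = [
    {"foo": "bar", "status": 200, "size": 25},
    {"foo": "baz", "status": 500, "size": 40},
]

# | where foo = "bar"
# | where status = 200
# | where size < 30
kept = [e for e in events
        if e["foo"] == "bar" and e["status"] == 200 and e["size"] < 30]
print(kept)  # [{'foo': 'bar', 'status': 200, 'size': 25}]
```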

Fields are the capture groups from the parsed results. If you don't need all of them, you can select only the ones to keep.


| fields <name>[, <name>, ...]


| fields name
// Capture group only contains name instead of the whole JSON document

  parse "GET * HTTP/1.1 * * " as url, code, size
| fields code
// Capture groups only contain code

Adding a log source

Click the ‘Manage’ button next to ‘Log Sources’. A list of log sources will be shown, and you can search through them.

Make sure you have checked all the resources you want the Filter to be applied to, then hit ‘Save’.
