Log drains

Log drains are currently in beta, and minor details may change. We'd love to hear any feedback or requests at hello@airplane.dev.
Log drains allow you to automatically stream all audit logs for your team to one or more destinations in your organization's observability stack.
To configure log drains, click on "Log drains" on the left side of the "Team settings" page. From there, you can set up one or more destination types, described in more detail below.

Audit log structure

The following example shows how audit logs are structured for export to log drains:
```javascript
{
  id: "aud20230509a4haxz",
  createdAt: "2023-05-09T23:14:33Z",
  actorID: "usr20230509hd1hfg",
  actorEmail: "a.user@airplane.dev",
  targetID: "run202305093jasjk",
  context: {
    ipAddress: "1.2.3.4",
    userAgent: "Mozilla/5.0 (X11; Linux x86_64) ..."
  },
  event: {
    type: "run.finished",
    payload: {
      runFinished: {
        envID: "env20220314jasd",
        envSlug: "prod",
        runDurationMs: 5241,
        runSource: "form",
        runStatus: "Succeeded",
        taskID: "tsk20221104sj4ja",
        taskName: "My task",
        taskSlug: "my_task"
      }
    }
  }
}
```
Note that the structure of the payload field will vary based on the event type.
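In the example above, the payload is keyed by a camelCase form of the event type (`run.finished` → `runFinished`). Assuming that pattern holds for other event types (an inference from this single example, not a documented guarantee), a consumer could reach the type-specific payload with a small helper:

```javascript
// Sketch: reach the type-specific payload of an audit log, assuming the
// payload key is the camelCased event type (inferred from the example
// above; verify against your own drained events).
function eventPayload(auditLog) {
  const { type, payload } = auditLog.event;
  const key = type.replace(/\.(\w)/g, (_, c) => c.toUpperCase());
  return payload[key];
}

const log = {
  event: {
    type: "run.finished",
    payload: { runFinished: { runStatus: "Succeeded", taskSlug: "my_task" } },
  },
};
eventPayload(log).runStatus; // "Succeeded"
```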

Destination types

Datadog

If the Datadog log drain is enabled, each audit log will be sent to the Datadog log collection API using the configured API key. These logs will be indexed in Datadog with the source name airplane-audit-logs and service name airplane.
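As a rough mental model, a drained audit log arrives in Datadog shaped like a standard log-intake entry. The sketch below builds such an entry from the audit log example above; the `toDatadogLog` helper is hypothetical and only illustrates the source/service attribution, not Airplane's actual delivery code.

```javascript
// Hypothetical helper: shape an audit log like a Datadog log-intake entry,
// using the source and service names documented for the drain.
function toDatadogLog(auditLog) {
  return {
    ddsource: "airplane-audit-logs", // source name used by the drain
    service: "airplane",             // service name used by the drain
    message: JSON.stringify(auditLog),
  };
}

// Sending one yourself (hypothetical API key) would look roughly like:
// fetch("https://http-intake.logs.datadoghq.com/api/v2/logs", {
//   method: "POST",
//   headers: { "DD-API-KEY": "<your-key>", "Content-Type": "application/json" },
//   body: JSON.stringify([toDatadogLog(auditLog)]),
// });
```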

OpenTelemetry

If an OpenTelemetry log drain is enabled, each audit log will be sent to an OpenTelemetry collector in your team's infrastructure using the configured URL. The collector can then filter and transform the logs before forwarding them on to other destinations in your observability stack, such as Datadog, AWS CloudWatch, or Splunk.
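For reference, a minimal collector configuration that accepts logs over OTLP/HTTP and prints them might look like the fragment below. This is a sketch under the assumption that the drain delivers via OTLP/HTTP; component names vary by collector version, so consult your collector's documentation before using it.

```yaml
# Hypothetical minimal collector config: receive logs over OTLP/HTTP
# and print them. Swap the debug exporter for your real destinations.
receivers:
  otlp:
    protocols:
      http:

processors:
  batch:

exporters:
  debug:

service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
```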

Webhook

If a webhook log drain is enabled, each audit log will be sent as an HTTP POST request to the configured webhook URL. The body will consist of a single audit log in JSON format.

Errors and retries

Any non-200 HTTP response from a downstream destination will be considered an error and the associated request will be retried later. After 5 consecutive delivery errors, the corresponding event will be dropped from the log drain export pipeline and not sent again. However, it will still be visible in the Activity page and exportable via CSV.
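The delivery policy above can be sketched as a simple loop; `send` here is a hypothetical delivery function returning an HTTP status code, standing in for the actual export pipeline.

```javascript
// Sketch of the documented delivery policy: retry on any non-200 status,
// drop the event after 5 consecutive delivery errors. `send` is a
// hypothetical delivery function that returns an HTTP status code.
function deliverWithRetries(event, send, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (send(event) === 200) return "delivered";
  }
  // Dropped from the drain, but still visible in the Activity page
  // and exportable via CSV.
  return "dropped";
}
```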

Troubleshooting

If logs aren't reaching your team's configured drain(s), then please contact support@airplane.dev for assistance.

Creating Datadog dashboards from log drain data

If you're sending log drain data to Datadog, you can use the log events to create dashboards and alerts around your organization's Airplane activity.
To create an "Airplane runs" dashboard:
  1. Ensure that log drain data is correctly arriving in your Datadog account by searching for source:airplane-audit-logs in the Logs search page
  2. Download our Datadog dashboard JSON template and save it in an accessible place
  3. Create facets for the following attribute paths by clicking the "+ Add" button in the left panel of the Logs search page:
    1. @data.event.payload.runCreated.runSource
    2. @data.event.payload.runCreated.taskSlug
    3. @data.event.payload.runFinished.runStatus
    4. @data.event.payload.runFinished.taskSlug
    5. @data.event.type
  4. Create a facet measure for the following attribute (by selecting "Measure" instead of "Facet" in the facet creation panel):
    1. @data.event.payload.runFinished.runDurationMs
  5. Create a new, empty dashboard
  6. Click "Configure" then "Import dashboard JSON..." and select the file downloaded in step (2)
After following these steps, you should see your dashboard populated with your log data within a few minutes.
Once the initial dashboard is in place, you can extend it by adding facets and plots for other types of log drain events (e.g. task updates).