CPU and memory limits

Understand CPU and memory limits and usage for task runners
Each Airplane task runs in a container with CPU and memory limits to prevent exhausting resources on the host. Users on self-hosted agents can adjust these limits to match the demands of the tasks being run.

Default container limits

The default run container limits vary based on the agent type:
| Agent type | Default CPU limit | Default memory limit | Adjustable? |
| --- | --- | --- | --- |
| Airplane-hosted | 1 vCPU core | 2GB | No |
| Self-hosted on AWS ECS | 0.5 vCPU cores | 1GB | Yes (docs) |
| Self-hosted on Kubernetes | 1 vCPU core | 2GB | Yes (docs) |
In most cases, these limits are soft: briefly going above them will not immediately cause problems. However, staying above them for extended periods can lead to CPU throttling and/or out-of-memory (OOM) errors that cause your task run to fail.
As noted in the table above, limits for self-hosted agents are adjustable. See the linked pages for more details.
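Before adjusting limits, it helps to know how much a task actually uses. As a rough starting point, you can log peak memory from inside the task itself. This is a minimal sketch using Python's standard-library `resource` module (Unix only); the 2GB limit and 80% warning threshold here are illustrative assumptions, not values Airplane exposes:

```python
import resource
import sys

def peak_memory_mb() -> float:
    """Return this process's peak resident set size in MB."""
    ru_maxrss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is reported in kilobytes on Linux, but in bytes on macOS.
    if sys.platform == "darwin":
        return ru_maxrss / 1e6
    return ru_maxrss / 1e3

# Example: warn when peak usage approaches a hypothetical 2GB limit.
LIMIT_MB = 2000
peak = peak_memory_mb()
if peak > 0.8 * LIMIT_MB:
    print(f"warning: peak memory {peak:.0f}MB is near the {LIMIT_MB}MB limit")
```

Logging a line like this at the end of a run gives you a concrete number to compare against the limits in the table above.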

Viewing CPU and memory usage in the UI

After a run starts, the Airplane agent will periodically poll the run container to get its current CPU and memory usage and report these back to Airplane. This polling starts out at once-per-second and then gradually gets less frequent as the run continues.
If there are at least two samples, the UI shows charts of CPU and memory usage over time in a CPU & Memory tab at the bottom of the run details page. CPU usage is reported in cores, and memory in MB.
If usage gets close to the limits, the limits are also drawn on the charts.
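Under the hood, raw container stats are typically exposed as cumulative CPU time and memory in bytes; converting them to the chart's units (cores and MB) looks roughly like this. The input fields are generic cgroup-style counters, not Airplane's actual schema:

```python
def cpu_cores(cpu_ns_delta: int, wall_ns_delta: int) -> float:
    """CPU usage in cores: CPU-nanoseconds consumed per nanosecond of
    wall-clock time between two samples. 1.0 means one full core."""
    return cpu_ns_delta / wall_ns_delta

def memory_mb(memory_bytes: int) -> float:
    """Memory usage in MB (decimal megabytes)."""
    return memory_bytes / 1_000_000

# Example: 500ms of CPU time over a 1s sampling window is 0.5 cores.
usage = cpu_cores(500_000_000, 1_000_000_000)
mem = memory_mb(1_500_000_000)
```

Note that under this convention a multi-threaded task can report more than 1.0 cores if its limit allows it.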

Chart availability

CPU and memory usage charts are not available for all task runs. They will be missing in the following cases:
  1. Very short runs: if a run finishes in under ~3 seconds, there isn't enough time to collect the samples needed for usage charts.
  2. Old runs: usage stats are retained for 30 days and then cleaned out to keep storage volumes reasonable. Usage data was also not collected for any runs before June 20, 2023.
  3. Workflows: collecting CPU and memory data from the workflow runtime is not yet supported. However, any standard runs spun up by a workflow will have their usage reported in the UI.
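The rules above can be summarized as a small predicate. The function name, parameters, and exact thresholds are illustrative assumptions drawn from the list, not an actual Airplane API:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION = timedelta(days=30)
COLLECTION_START = datetime(2023, 6, 20, tzinfo=timezone.utc)
MIN_DURATION_S = 3.0

def charts_available(started_at: datetime, duration_s: float,
                     is_workflow: bool,
                     now: Optional[datetime] = None) -> bool:
    """Rough check of whether usage charts should exist for a run."""
    now = now or datetime.now(timezone.utc)
    if is_workflow:                    # workflow runtime not yet supported
        return False
    if duration_s < MIN_DURATION_S:    # too short to gather enough samples
        return False
    if started_at < COLLECTION_START:  # predates usage data collection
        return False
    if now - started_at > RETENTION:   # stats cleaned out after 30 days
        return False
    return True
```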