Most CI/CD pipelines are still glued together with YAML and shell scripts. Dagger takes a different approach: you write your pipelines in a real programming language (Python, Go, or TypeScript), and the Dagger Engine executes every step inside containers. The same pipeline runs on your laptop, in GitHub Actions, GitLab CI, or Jenkins, with zero changes.
Created by Solomon Hykes (Docker co-founder), Dagger treats pipeline steps as typed functions that connect through a GraphQL API to an engine built on BuildKit. Each function runs in its own container, with automatic caching and parallel execution handled by the engine. This guide walks through installing Dagger on Ubuntu 24.04 (and 22.04), setting up the Python SDK, writing real pipeline functions, and running builds locally.
Tested April 2026 on Ubuntu 24.04 LTS (Noble Numbat) with Docker 29.4.0, Dagger v0.20.3, Python 3.12.3
Prerequisites
- Ubuntu 24.04 LTS or Ubuntu 22.04 LTS server with at least 2 GB RAM and 2 vCPUs
- Root or sudo access
- Internet access (Dagger pulls container images during execution)
- Docker Engine (installed in the steps below)
Dagger requires a container runtime. Docker is the most common choice, but Podman and nerdctl are also supported. This guide uses Docker Engine from Docker’s official APT repository.
1. Install Docker Engine
Dagger runs an engine container in Docker. If you already have Docker installed on Ubuntu, skip to the next section.
Add Docker’s official GPG key and APT repository:
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo $VERSION_CODENAME) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt update
Install Docker Engine, CLI, and plugins:
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
Add your user to the docker group so you can run Docker commands without sudo:
sudo usermod -aG docker $USER
newgrp docker
Verify Docker is running:
docker --version
Expected output:
Docker version 29.4.0, build 9d7ad9f
Confirm the daemon is active:
sudo systemctl is-active docker
This should return active.
2. Install the Dagger CLI
Dagger provides an install script that downloads the latest stable binary. The script auto-detects your architecture and verifies checksums.
curl -fsSL https://dl.dagger.io/dagger/install.sh | BIN_DIR=/usr/local/bin sudo -E sh
The installer downloads the binary, verifies its checksum, and places it in /usr/local/bin. Its output also prints hints for enabling shell completion (covered in section 3 below) and ends with the install location:
installed /usr/local/bin/dagger
Verify the installed version:
dagger version
You should see the version, engine image reference, and platform:
dagger v0.20.3 (image://registry.dagger.io/engine:v0.20.3) linux/amd64
For a user-local install (no sudo), use BIN_DIR=$HOME/.local/bin instead and make sure ~/.local/bin is in your PATH.
3. Enable Shell Completion (Optional)
Tab completion speeds up working with dagger call and function names. For Bash:
sudo apt install -y bash-completion
mkdir -p ~/.local/share/bash-completion/completions
dagger completion bash > ~/.local/share/bash-completion/completions/dagger
source ~/.local/share/bash-completion/completions/dagger
For Zsh:
sudo mkdir -p /usr/local/share/zsh/site-functions
dagger completion zsh | sudo tee /usr/local/share/zsh/site-functions/_dagger > /dev/null
How Dagger Works: Engine, Modules, and Functions
Before writing pipelines, it helps to understand the three core concepts.
The Dagger Engine runs as a container inside Docker. The first time you run a dagger command, it pulls the engine image (registry.dagger.io/engine:v0.20.3) and starts it automatically. Every subsequent call reuses this running engine. You can see it with docker ps:
docker ps --filter name=dagger --format "table {{.Names}}\t{{.Image}}\t{{.Status}}"
Output:
NAMES IMAGE STATUS
dagger-engine-v0.20.3 registry.dagger.io/engine:v0.20.3 Up 4 minutes
A Module is a collection of related functions packaged together. Think of it as a project that contains your pipeline logic. Each module has a dagger.json config file and source code in your chosen SDK language.
Functions are the building blocks. Each function takes typed inputs, runs operations inside containers, and returns typed outputs. Functions are composable, which means you chain them together into pipelines. The engine builds a directed acyclic graph (DAG) from your function calls and executes them with automatic caching and parallelism.
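The DAG-plus-caching model can be illustrated in a few lines of plain Python. This is a conceptual sketch only, not Dagger's actual engine code: each step's result is keyed by a hash of the step and its resolved inputs, so shared or repeated steps run only once.

```python
import hashlib

class Step:
    """One node in a pipeline DAG: a named operation plus its dependencies."""
    def __init__(self, name, fn, deps=()):
        self.name, self.fn, self.deps = name, fn, list(deps)

def run(step, cache):
    """Resolve dependencies first, then run the step unless a cached result exists."""
    inputs = [run(dep, cache) for dep in step.deps]
    # The cache key is derived from the step and its inputs (content-addressed)
    key = hashlib.sha256((step.name + repr(inputs)).encode()).hexdigest()
    if key not in cache:
        cache[key] = step.fn(*inputs)
    return cache[key]

fetch = Step("fetch", lambda: "source")
build = Step("build", lambda src: f"binary({src})", deps=[fetch])
test = Step("test", lambda src: f"passed({src})", deps=[fetch])

cache = {}
print(run(build, cache))  # binary(source)
print(run(test, cache))   # passed(source) -- "fetch" ran once; its result was reused
```

Dagger's engine does the same thing at the level of filesystem snapshots and container operations, which is why re-running an unchanged pipeline is close to instant.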
4. Create Your First Dagger Module
Initialize a new module with the Python SDK. Dagger supports Go, Python, and TypeScript as official SDKs, plus community SDKs for PHP, Java, Rust, and .NET.
mkdir ~/my-project && cd ~/my-project
dagger init --name=my-project --sdk=python
The first run takes a few seconds because Dagger pulls the engine image (if not cached) and generates the SDK scaffolding. The output shows the engine connecting and the Python codegen running:
connect
┆ starting engine
┆ ┆ create container
┆ connecting to engine
┆ INF connected client-version=v0.20.3 server-version=v0.20.3
┆ starting session
load module: .
┆ finding module configuration
┆ initializing module
┆ module SDK: run codegen
WRN no LICENSE file found; generating one for you, feel free to change or remove
Initialized module my-project in /home/ubuntu/my-project
Check the generated project structure:
ls -la
Dagger creates the following files:
dagger.json # Module configuration (name, SDK, engine version)
pyproject.toml # Python project metadata
uv.lock # Dependency lock file
sdk/ # Auto-generated Dagger Python SDK
src/my_project/ # Your pipeline code lives here
__init__.py # Module docstring and exports
main.py # Function definitions
LICENSE # Auto-generated Apache 2.0 license
The generated main.py includes two example functions. Take a look:
cat src/my_project/main.py
The two functions, container_echo and grep_dir, look like this:
import dagger
from dagger import dag, function, object_type
@object_type
class MyProject:
@function
def container_echo(self, string_arg: str) -> dagger.Container:
"""Returns a container that echoes whatever string argument is provided"""
return dag.container().from_("alpine:latest").with_exec(["echo", string_arg])
@function
async def grep_dir(self, directory_arg: dagger.Directory, pattern: str) -> str:
"""Returns lines that match a pattern in the files of the provided Directory"""
return await (
dag.container()
.from_("alpine:latest")
.with_mounted_directory("/mnt", directory_arg)
.with_workdir("/mnt")
.with_exec(["grep", "-R", pattern, "."])
.stdout()
)
The @object_type decorator registers the class as a Dagger module. Each @function method becomes callable from the dagger CLI. The dag object is the entry point to the Dagger API, providing access to containers, directories, files, and other primitives.
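One naming detail worth noting before calling functions: Python's snake_case function names are exposed in kebab-case on the CLI (and camelCase in the underlying GraphQL API, as the execution traces below show). The snake-to-kebab mapping is trivial; this one-liner is purely illustrative, not Dagger's code:

```python
def cli_name(python_name: str) -> str:
    # Dagger exposes snake_case Python function names as kebab-case CLI commands
    return python_name.replace("_", "-")

print(cli_name("container_echo"))  # container-echo
print(cli_name("grep_dir"))        # grep-dir
```

So the Python method container_echo is invoked as dagger call container-echo.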
5. Run Your First Dagger Function
List all available functions in the module:
dagger functions
Output:
Name Description
container-echo Returns a container that echoes whatever string argument is provided
grep-dir Returns lines that match a pattern in the files of the provided Directory
Call the container-echo function and capture its stdout:
dagger call container-echo --string-arg="Hello from Dagger" stdout
Dagger connects to the engine, loads the module, pulls the Alpine image, runs the echo command, and returns the output. The TUI shows the execution trace in real time:
connect DONE [0.3s]
load module: . DONE [9.9s]
myProject: MyProject! DONE [2.1s]
MyProject.containerEcho(stringArg: "Hello from Dagger"): Container!
┆ Container.from(address: "alpine:latest"): Container! CACHED [1.0s]
┆ withExec echo 'Hello from Dagger'
┆ | Hello from Dagger
MyProject.containerEcho DONE [3.4s]
Container.stdout: String! DONE [0.2s]
Hello from Dagger
Notice the CACHED label on Container.from. The Dagger Engine caches every operation by content hash. On subsequent runs, unchanged steps are skipped entirely, which makes repeated pipeline runs significantly faster.
6. Write a Real Build Pipeline
The generated example is useful for understanding the basics, but a production pipeline needs build, test, and publish steps. Create a small Go application and a Dagger module that builds, tests, and publishes it as a container image.
Create the project directory and a simple Go web server:
mkdir ~/build-demo && cd ~/build-demo
Create the Go source file:
cat > main.go << "EOF"
package main
import (
"fmt"
"net/http"
)
func main() {
http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
fmt.Fprintln(w, "Hello from Dagger-built app!")
})
fmt.Println("Server starting on :8080")
http.ListenAndServe(":8080", nil)
}
EOF
Create the Go module file:
cat > go.mod << "EOF"
module build-demo
go 1.22
EOF
Initialize a Dagger module in this project:
dagger init --name=build-demo --sdk=python
Dagger creates a .dagger/ directory inside your project. This keeps the Dagger pipeline code separate from your application code. Open the generated pipeline file:
vi .dagger/src/build_demo/main.py
Replace the contents with three pipeline functions for build, test, and publish:
import dagger
from dagger import dag, function, object_type
@object_type
class BuildDemo:
@function
async def build(self, source: dagger.Directory) -> dagger.Container:
"""Build a Go application inside a container"""
return (
dag.container()
.from_("golang:1.22-alpine")
.with_mounted_directory("/src", source)
.with_workdir("/src")
.with_exec(["go", "build", "-o", "app", "."])
)
@function
async def test(self, source: dagger.Directory) -> str:
"""Run tests for the Go application"""
return await (
dag.container()
.from_("golang:1.22-alpine")
.with_mounted_directory("/src", source)
.with_workdir("/src")
.with_exec(["go", "vet", "./..."])
.stdout()
)
@function
async def publish(self, source: dagger.Directory, tag: str = "latest") -> str:
"""Build and publish a container image"""
build = (
dag.container()
.from_("golang:1.22-alpine")
.with_mounted_directory("/src", source)
.with_workdir("/src")
.with_exec(["go", "build", "-o", "/usr/local/bin/app", "."])
)
return await (
dag.container()
.from_("alpine:latest")
.with_file("/usr/local/bin/app", build.file("/usr/local/bin/app"))
.with_entrypoint(["/usr/local/bin/app"])
.with_exposed_port(8080)
.publish(f"ttl.sh/dagger-demo-{tag}")
)
Each function chains Dagger API calls to build a pipeline. The build function mounts your source code into a Go container and compiles it. The test function runs go vet to catch common issues. The publish function builds the binary, copies it into a minimal Alpine image, and pushes to a container registry.
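Because Dagger functions are ordinary methods, they also compose with each other. As a sketch of how you might extend the class above (an illustrative addition, not part of the generated module), a single ci function could gate publishing on the tests passing; this fragment belongs inside the BuildDemo class:

```python
    @function
    async def ci(self, source: dagger.Directory, tag: str = "latest") -> str:
        """Run tests, then build and publish only if they pass (sketch)"""
        await self.test(source)               # raises on failure, stopping the pipeline
        return await self.publish(source, tag)
```

You would then run the whole pipeline with a single dagger call ci --source=. invocation.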
7. Run the Pipeline
Verify the module registered all three functions:
dagger functions
Output confirms the three functions are available:
Name Description
build Build a Go application inside a container
publish Build and publish a container image
test Run tests for the Go application
Run the test function first. The --source=. flag passes the current directory as the source code input:
dagger call test --source=.
The engine mounts your source code, starts a Go container, and runs go vet. Clean output (no errors printed) means the code passed:
BuildDemo.test(source: ...): String!
┆ Container.from(address: "golang:1.22-alpine"): Container! DONE [1.0s]
┆ withExec go vet ./...
┆ Container.withExec DONE [20.0s]
BuildDemo.test DONE [23.3s]
Now publish the container image. This example uses ttl.sh, a free anonymous registry that automatically deletes images after at most 24 hours (the image tag can encode a shorter TTL), which makes it convenient for testing:
dagger call publish --source=. --tag=v1.0
After building and pushing, Dagger returns the full image reference with digest:
ttl.sh/dagger-demo-v1.0@sha256:c25b9a7716dee53a6bd1d185afc5ac2ded870a2e15955b6303eb523686c96f27
The Go binary was compiled inside a golang container, copied into a minimal Alpine image, and pushed to the registry. Every step (pulling base images, compiling, pushing) ran in containers managed by the Dagger Engine; no Go toolchain was needed on the host.
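In automation you often need the digest on its own, for example to pin a deployment to the exact image that was just pushed. The returned reference has the form name@sha256:&lt;hash&gt;, which splits cleanly in plain Python:

```python
# The reference returned by `dagger call publish` (example value from above)
ref = "ttl.sh/dagger-demo-v1.0@sha256:c25b9a7716dee53a6bd1d185afc5ac2ded870a2e15955b6303eb523686c96f27"

name, digest = ref.split("@", 1)   # registry path and content digest
print(name)                                  # ttl.sh/dagger-demo-v1.0
print(digest.removeprefix("sha256:")[:12])   # short digest: c25b9a7716de
```

Pulling by name@digest instead of by tag guarantees you run exactly the bytes that were published.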
8. Dagger CLI Quick Reference
Here are the commands you will use most often when working with Dagger:
| Command | Description |
|---|---|
| dagger init --sdk=python | Initialize a new module with the Python SDK |
| dagger functions | List all functions in the current module |
| dagger call <function> | Execute a function |
| dagger call <function> --help | Show function arguments and types |
| dagger install <module> | Add a dependency from the Daggerverse |
| dagger develop | Regenerate SDK after changing dependencies |
| dagger version | Show installed CLI and engine version |
| dagger login | Authenticate with Dagger Cloud for traces |
Available SDKs
Dagger supports multiple programming languages for writing pipeline functions. All SDKs are version-locked to the engine release:
| SDK | Status | Init Command |
|---|---|---|
| Python | Official (GA) | dagger init --sdk=python |
| Go | Official (GA) | dagger init --sdk=go |
| TypeScript | Official (GA) | dagger init --sdk=typescript |
| PHP | Community | dagger init --sdk=php |
| Java | Community | dagger init --sdk=java |
| Rust | Community | dagger init --sdk=rust |
All SDKs communicate with the same GraphQL API, so the choice of language is a matter of team preference. The generated code is fully type-safe with editor autocompletion in all three official SDKs.
Using Daggerverse Modules
The Daggerverse is Dagger's module registry with over 1,500 public modules. You can install any module as a dependency and call its functions directly. For example, to add the Helm module:
dagger install github.com/dagger/dagger/modules/helm
After installing, the module's functions become available through dagger call. You can also use modules from within your own functions by calling dag.helm() (or whatever the module provides) in your pipeline code.
Run Dagger in CI Systems
One of Dagger's core strengths is portability. The same pipeline code that runs locally works in any CI system that can run Docker. The CI configuration just calls dagger call:
For GitHub Actions, add this to your workflow:
steps:
- uses: actions/checkout@v4
- name: Run pipeline
uses: dagger/dagger-for-github@v7
with:
verb: call
args: test --source=.
For GitLab CI, add this to .gitlab-ci.yml:
test:
image: docker:latest
services:
- docker:dind
before_script:
- curl -fsSL https://dl.dagger.io/dagger/install.sh | BIN_DIR=/usr/local/bin sh
script:
- dagger call test --source=.
Your pipeline logic stays in your Dagger module (Python/Go/TypeScript code), not in CI-specific YAML. Switching CI providers means changing only the thin wrapper, not rewriting your pipeline.
Uninstall Dagger
To remove Dagger completely, delete the CLI binary and stop the engine container:
sudo rm /usr/local/bin/dagger
docker stop dagger-engine-v0.20.3
docker rm dagger-engine-v0.20.3
Troubleshooting
Error: "Cannot connect to the Docker daemon"
This means Docker is not running or your user lacks permission. Check the Docker service and group membership:
sudo systemctl start docker
sudo usermod -aG docker $USER
newgrp docker
Error: "failed to run command [docker image inspect registry.dagger.io/engine:v0.20.3]"
This is normal on the first run. Dagger tries to inspect the engine image locally, fails because it hasn't been pulled yet, then automatically pulls it. You will see this error followed by a docker pull step. No action needed.
Slow first run
The first dagger call in a module takes longer because Dagger must pull the engine image (~200 MB), generate SDK code, and pull any base images referenced in your functions. Subsequent runs are faster thanks to content-addressed caching. The engine container stays running between calls.
What is Dagger Cloud?
Dagger Cloud is an optional observability platform. It provides pipeline visualization, trace debugging, cache visibility, and run history. The free tier includes 1 million events per month for a single user. Dagger Cloud does not host your runners. Your Dagger Engine runs on your own infrastructure, and Dagger Cloud receives telemetry for visualization. Sign up at dagger.cloud and run dagger login to connect.