# Docker Configuration Example

This example demonstrates all Docker configuration features in KDeps v2.
## Features Demonstrated

- **Base OS Selection** - Choose Alpine, Ubuntu, or Debian
- **OS Package Installation** - Install system-level packages
- **Python Package Management** - Specify Python packages
- **Auto-backend Installation** - Automatically install the Ollama LLM backend
## Configuration via Workflow

### Base OS Selection

```yaml
agentSettings:
  # Options: alpine, ubuntu, debian
  # Default: alpine
  baseOS: "alpine"
```
Can be overridden via the CLI:

```shell
# Use the workflow's baseOS (alpine)
kdeps build .

# Override with Ubuntu
kdeps build . --os ubuntu

# Override with Debian
kdeps build . --os debian
```
### OS Packages

Install OS-level packages (git, vim, curl, etc.):

```yaml
agentSettings:
  osPackages:
    - git
    - vim
    - curl
    - jq
    - postgresql-client  # Database client
    - redis-tools        # Redis CLI
```
Package managers by OS:

- **Alpine**: `apk` (e.g., `git`, `vim`, `curl`)
- **Ubuntu/Debian**: `apt` (e.g., `git`, `vim`, `curl`)
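The mapping above can be sketched as a small helper that builds the Dockerfile `RUN` line for a given base OS. This is an illustrative sketch, not part of kdeps itself; `PKG_MANAGERS` and `install_command` are hypothetical names:

```python
# Illustrative mapping from baseOS choice to its package install command.
PKG_MANAGERS = {
    "alpine": "apk add --no-cache",
    "ubuntu": "apt-get install -y",
    "debian": "apt-get install -y",
}

def install_command(base_os, packages):
    """Build the Dockerfile RUN line for the given base OS and package list."""
    return f"RUN {PKG_MANAGERS[base_os]} {' '.join(packages)}"

print(install_command("alpine", ["git", "vim", "curl"]))
# RUN apk add --no-cache git vim curl
```

(In a real Debian/Ubuntu image you would typically prepend `apt-get update &&`, as the generated Dockerfiles do for base dependencies.)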
### Python Packages

```yaml
agentSettings:
  pythonVersion: "3.12"
  pythonPackages:
    - requests
    - numpy
    - pandas
    - scikit-learn
  # Or use a requirements file
  requirementsFile: "requirements.txt"
```
### LLM Backend Installation

Control Ollama installation with the `installOllama` flag:

```yaml
agentSettings:
  installOllama: true   # Install Ollama for local LLM support
  # or
  installOllama: false  # Disable Ollama (for cloud-only workflows)
```
When `installOllama: true`, Ollama is installed via the official install script and you can use local LLM resources:

```yaml
resources:
  - metadata:
      actionId: llm
    run:
      chat:
        backend: "ollama"
        model: "llama3.2:1b"
        prompt: "{{ get('q') }}"
```
Ollama is automatically installed when:

- `installOllama: true` is explicitly set
- Models are configured in `agentSettings.models` (implies Ollama usage)
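The detection rules above can be restated as a short sketch. This is assumed logic for illustration only (`should_install_ollama` is a hypothetical name, not a kdeps function); it treats an explicit flag as authoritative, then falls back to the implicit signals:

```python
def should_install_ollama(agent_settings, resources):
    """Sketch of the auto-detection rules: explicit flag wins, then models,
    then any resource that uses the ollama backend."""
    if "installOllama" in agent_settings:
        return agent_settings["installOllama"]   # explicit setting wins
    if agent_settings.get("models"):
        return True                              # configured models imply Ollama
    # Otherwise, look for resources that use the ollama backend
    return any(
        r.get("run", {}).get("chat", {}).get("backend") == "ollama"
        for r in resources
    )
```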
## Examples

### Example 1: Lightweight Alpine with Ollama

```yaml
agentSettings:
  baseOS: "alpine"
  installOllama: true
  pythonPackages:
    - requests
  osPackages:
    - curl
```

Build:

```shell
kdeps build .
# Or explicitly:
kdeps build . --os alpine
```

Result: ~200MB image with Ollama
### Example 2: Ubuntu with Cloud Providers

```yaml
agentSettings:
  baseOS: "ubuntu"
  installOllama: false  # No local LLM needed
  pythonPackages:
    - openai
    - anthropic
  osPackages:
    - git
    - build-essential

resources:
  - run:
      chat:
        backend: "openai"
        model: "gpt-4o"
        apiKey: "{{ get('OPENAI_API_KEY', 'env') }}"
```

Build:

```shell
kdeps build .
```

Result: Ubuntu image without Ollama (uses cloud APIs)
### Example 3: Debian with Database Tools

```yaml
agentSettings:
  baseOS: "debian"
  pythonPackages:
    - psycopg2-binary
    - sqlalchemy
  osPackages:
    - postgresql-client
    - redis-tools
    - git
```

Build:

```shell
kdeps build .
```

Result: Debian image with PostgreSQL client and Redis tools
### Example 4: Data Science Stack

```yaml
agentSettings:
  baseOS: "ubuntu"
  pythonVersion: "3.11"
  pythonPackages:
    - numpy
    - pandas
    - scikit-learn
    - matplotlib
    - jupyter
  osPackages:
    - git
    - vim
    - graphviz
```

Build:

```shell
kdeps build . --tag datascience:latest
```
## Generated Dockerfile Preview

```shell
# Preview what will be generated
kdeps build . --show-dockerfile

# Preview with a different OS
kdeps build . --show-dockerfile --os ubuntu
```
Example output (Alpine + Ollama):

```dockerfile
FROM alpine:latest

# Set environment variables
ENV PYTHONUNBUFFERED=1 \
    PATH=/opt/venv/bin:$PATH \
    OLLAMA_HOST=127.0.0.1 \
    OLLAMA_PORT=11434 \
    BACKEND_PORT=11434

# Install base dependencies
RUN apk add --no-cache \
    zstd \
    python3 \
    py3-pip \
    curl \
    bash \
    supervisor \
    ca-certificates \
    libstdc++ \
    rsync

# Install kdeps via the official install script
RUN curl -LsSf https://raw.githubusercontent.com/kdeps/kdeps/main/install.sh | sh -s -- -b /usr/local/bin

# Install OS packages
RUN apk add --no-cache git vim curl jq

# Install Ollama
RUN curl -fsSL https://ollama.com/install.sh | sh

# Install uv for Python package management
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
RUN chmod +x /usr/local/bin/uv

# Create virtual environment
RUN uv venv /opt/venv

# Install Python packages
RUN --mount=type=cache,target=/root/.cache/uv \
    uv pip install requests numpy pandas

# Copy workflow files
COPY workflow.yaml /app/workflow.yaml
COPY resources/ /app/resources/
COPY data/ /app/data/

# Copy entrypoint and supervisor config
COPY entrypoint.sh /entrypoint.sh
COPY supervisord.conf /etc/supervisord.conf
RUN chmod +x /entrypoint.sh

WORKDIR /app

# Expose ports
EXPOSE 16395 11434

# Use entrypoint for backend management
ENTRYPOINT ["/entrypoint.sh"]
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
```
## Build Process

1. **Workflow configuration takes precedence**

   ```yaml
   agentSettings:
     baseOS: "debian"
   ```

   Result: builds with Debian unless overridden.

2. **CLI override**

   ```shell
   kdeps build . --os ubuntu
   ```

   Result: builds with Ubuntu (overrides the workflow).

3. **Default behavior**

   No `baseOS` in the workflow and no CLI flag = Alpine (the default).
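The precedence order above reduces to a one-line rule; a minimal sketch (`resolve_base_os` is an illustrative name, not a kdeps API):

```python
def resolve_base_os(workflow_base_os=None, cli_os=None):
    """Precedence sketch: CLI flag > workflow baseOS > alpine default."""
    return cli_os or workflow_base_os or "alpine"

print(resolve_base_os())                     # alpine
print(resolve_base_os("debian"))             # debian
print(resolve_base_os("debian", "ubuntu"))   # ubuntu
```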
## OS Comparison

### Alpine

- **Size**: Smallest (~50-100MB less than Ubuntu/Debian)
- **Best for**: Lightweight APIs, simple workflows
- **Package manager**: `apk`
- **Use when**: Image size is critical

### Ubuntu

- **Size**: Largest (but the most packages available)
- **Best for**: Complex applications, data science
- **Package manager**: `apt`
- **Use when**: You need maximum compatibility

### Debian

- **Size**: Medium (between Alpine and Ubuntu)
- **Best for**: Production workloads, stability
- **Package manager**: `apt`
- **Use when**: You need a balance of size and features
## Hybrid Local + Cloud Workflows

You can combine local Ollama with cloud providers:

```yaml
agentSettings:
  installOllama: true  # Enable local Ollama

resources:
  # Fast local inference for simple tasks
  - metadata:
      actionId: quickChat
    run:
      chat:
        backend: "ollama"
        model: "llama3.2:1b"
        prompt: "Quick answer: {{ get('q') }}"

  # Cloud provider for complex tasks
  - metadata:
      actionId: deepAnalysis
    run:
      chat:
        backend: "anthropic"
        model: "claude-3-5-sonnet-20241022"
        apiKey: "{{ get('ANTHROPIC_API_KEY', 'env') }}"
        prompt: "Detailed analysis: {{ get('q') }}"
```

Result: Ollama is installed locally; cloud APIs are used for specific resources.
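A caller deciding which of the two resources to hit could use a simple dispatch rule. This is purely hypothetical client-side logic (the threshold and function name are invented for illustration; kdeps itself does no such routing):

```python
def route_request(question):
    """Hypothetical dispatch: short questions go to the local quickChat
    resource, longer ones to the cloud-backed deepAnalysis resource."""
    return "quickChat" if len(question) <= 60 else "deepAnalysis"

print(route_request("What is 2+2?"))  # quickChat
```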
## Testing

```shell
# Run the example
kdeps run workflow.yaml --dev

# Test the API
curl -X POST 'http://localhost:16395/api/v1/chat?q=Hello'

# Build the Docker image
kdeps build . --tag docker-config:latest

# Run the Docker image
docker run -p 16395:16395 docker-config:latest
```
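The curl call above can also be issued from Python. A minimal sketch using only the standard library (the port and path come from this example's API):

```python
from urllib.parse import urlencode

# Build the request URL for the chat endpoint (port 16395, as above)
base = "http://localhost:16395/api/v1/chat"
url = f"{base}?{urlencode({'q': 'Hello'})}"
print(url)  # http://localhost:16395/api/v1/chat?q=Hello

# With the agent running, the POST itself would be:
# import urllib.request
# resp = urllib.request.urlopen(urllib.request.Request(url, method="POST"))
# print(resp.read())
```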
## Tips

- Start with Alpine for the simplest workflows
- Use Ubuntu when you need specific packages
- Use Debian for production stability
- Install only what you need to keep images small
- Test locally before building the Docker image
- Use `--show-dockerfile` to preview before building
- Use `installOllama: false` for cloud-only workflows
## Notes

- `baseOS` can be specified in the workflow AND overridden via the CLI
- Ollama is auto-detected from resources or explicitly controlled via `installOllama`
- OS packages use the appropriate package manager (`apk`/`apt`)
- Python packages are installed via `uv` (fast and reliable)
- All images include the kdeps binary for execution
## Versions

| Version | Published | Status |
|---|---|---|
| 1.0.0 | 4/11/2026 | active |

## Details

- **Author**: kdeps
- **License**: Apache-2.0
- **Latest Version**: 1.0.0
- **Published**: 4/11/2026