# docker-openclaw

Docker Compose deployment for OpenClaw with Ollama integration. This project provides a containerized environment for running OpenClaw (an autonomous AI assistant) alongside Ollama for local LLM inference.
Based on the structure of silvaengine_docker and incorporating patterns from mythrantic/ollama-docker.
## Services

| Service | Description | Port |
|---|---|---|
| openclaw | OpenClaw Gateway (Node.js + Python) | 18789 |
| ollama | Ollama LLM Server | 11434 |
| ollama-webui | Open WebUI Interface | 8080 |
## Quick Start

- Clone this repository and navigate to the directory:

  ```sh
  cd docker-openclaw
  ```

- Create the required directories:

  ```sh
  mkdir www/logs
  mkdir www/projects
  ```

- (Optional) Place SSH keys in `python/.ssh/` for GitHub access.

- Create a `.env` file from the example:

  ```sh
  cp .env.example .env
  ```

  Configure your environment variables:

  ```
  PIP_INDEX_URL=https://pypi.org/simple/
  PYTHON=python3.12
  NODE_VERSION=22
  PROJECTS_FOLDER={path to the projects folder}
  ANTHROPIC_API_KEY=your_anthropic_api_key
  OPENAI_API_KEY=your_openai_api_key
  ```

  Example:

  - PIP_INDEX_URL: https://pypi.org/simple/
  - PROJECTS_FOLDER: "C:/Users/developer/GitHubRepos/openclaw_projects"
  - PYTHON: python3.12

- Build the Docker images:

  ```sh
  docker compose build
  ```

- Start the containers (a quick verification sketch follows this list):

  ```sh
  docker compose up -d
  ```
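Once the stack is up, you can sanity-check the services. A minimal sketch, assuming the default ports from the table above:

```sh
# Confirm all three services are running
docker compose ps

# Ollama should answer on its API port
curl http://localhost:11434/api/tags

# Open WebUI should respond on its HTTP port
curl -I http://localhost:8080

# The OpenClaw gateway listens on 18789; the exact response depends
# on your OpenClaw configuration
curl -I http://localhost:18789
```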
## Python Dependencies

Manage Python packages through `python/requirements.txt`. Make sure it lists every dependency your OpenClaw projects need.
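For instance, a minimal `requirements.txt` might look like the sketch below. The package names are real PyPI packages, but the contents here are illustrative, not the exact file shipped with this repo:

```
# LLM client SDKs (see the container contents listed below)
ollama
anthropic
openai

# Add your project-specific dependencies here, e.g.:
# requests>=2.31
```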
## OpenClaw Container

The OpenClaw container includes:

- Node.js 22 with OpenClaw installed globally via pnpm
- Python 3.12 with a virtual environment at `/var/python3.12/openclaw/env`
- Ollama, Anthropic, and OpenAI Python SDKs
- Supervisor for process management
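Since Supervisor manages the processes inside the container, you can inspect them from the host. A minimal sketch, assuming the compose service is named `openclaw` as in the table above and that program names match those defined in `python/openclaw.ini`:

```sh
# Show the state of all Supervisor-managed programs in the container
docker compose exec openclaw supervisorctl status

# Follow the stdout log of one program (replace "openclaw" with a
# program name actually defined in python/openclaw.ini)
docker compose exec openclaw supervisorctl tail -f openclaw
```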
## Ollama Models

Pull models via the API or Open WebUI:

```sh
# Pull a model
curl http://localhost:11434/api/pull -d '{"name": "llama3.2"}'

# List models
curl http://localhost:11434/api/tags
```

Access the web interface at http://localhost:8080 to interact with Ollama models.
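Once a model has been pulled, you can run a quick completion against the same API. A minimal sketch, assuming the `llama3.2` model pulled above:

```sh
# Request a single, non-streamed completion from Ollama
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Say hello in one sentence.",
  "stream": false
}'
```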
## GPU Support

The ollama service is configured for NVIDIA GPU support. Ensure you have:

- NVIDIA drivers installed
- NVIDIA Container Toolkit installed

For CPU-only mode, remove the `deploy` section from the `ollama` service in `docker-compose.yaml`.
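For reference, a compose-file GPU reservation typically looks like the block below. The exact contents of the `deploy` section in this repo's `docker-compose.yaml` may differ, so treat this as a sketch of what to remove for CPU-only mode:

```yaml
services:
  ollama:
    # ... image, ports, volumes ...
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or a specific GPU count
              capabilities: [gpu]
```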
## Project Structure

```
docker-openclaw/
├── docker-compose.yaml
├── .env.example
├── .gitignore
├── README.md
├── python/
│   ├── Dockerfile
│   ├── openclaw.ini
│   ├── requirements.txt
│   └── .ssh/
└── www/
    ├── logs/
    └── projects/
```
## License

MIT