What I learned setting up OpenClaw (Docker + Discord + LiteLLM)

This post is a bit of a field-notes dump from getting OpenClaw running in my own Docker environment, wiring it up to Discord/GitHub, and then realizing I wanted Azure AI Foundry models – so I ended up introducing LiteLLM as a proxy.

I will be honest: I expected to spend more time fighting glue code than actually using the thing. Instead, once I got Discord and GitHub dialed in (and stopped being surprised by what was blocked vs what was actually broken), it stopped feeling like a demo and started feeling like something I can actually keep around.


TL;DR

  • I run OpenClaw in Docker, but I had to extend the image to include tools needed by certain skills (e.g., gh, ffmpeg, etc.).
  • I wanted Azure OpenAI / AI Foundry models; OpenClaw didn’t support them directly in my setup, so I added a LiteLLM container and pointed OpenClaw at it.
  • Discord setup works, but the terminology is a little quirky (“guild” == server) and you often need channel IDs, not friendly names.

Once I had the Discord + GitHub pieces working, and the container had a modern .NET SDK plus the usual build tools, it could clone repos, compile, and generally do the ‘go run the boring stuff’ loop pretty well. I can see myself using it to iterate on apps I’ve already written (or to triage bugs when I’m feeling lazy).

1) Running OpenClaw in Docker

I’m running OpenClaw in Docker. Out of the box, OpenClaw starts fine, but I quickly hit a non-obvious issue: many skills are initially marked blocked because the container is missing required binaries. Some skills make it obvious why; others are less explicit.

The fix was simple: install the missing tools in the Docker image. In my case, I primarily needed GitHub CLI and a newer .NET SDK for build/test workflows.

Dockerfile additions

# Install GitHub CLI via APT in Docker (non-interactive, root)
USER root
RUN apt-get update && apt-get install -y curl gpg \
    && curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg \
    && chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg \
    && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" > /etc/apt/sources.list.d/github-cli.list \
    && apt-get update && apt-get install -y gh \
    && rm -rf /var/lib/apt/lists/*

# --------------------------
# Install .NET 10 SDK
# --------------------------
RUN apt-get update && apt-get install -y --no-install-recommends ca-certificates curl bash unzip ffmpeg \
    && rm -rf /var/lib/apt/lists/*

ENV DOTNET_ROOT=/usr/share/dotnet
ENV PATH="${DOTNET_ROOT}:${PATH}"
ENV DOTNET_CLI_TELEMETRY_OPTOUT=1

RUN curl -fsSL https://builds.dotnet.microsoft.com/dotnet/scripts/v1/dotnet-install.sh -o /tmp/dotnet-install.sh \
    && bash /tmp/dotnet-install.sh --channel 10.0 --install-dir "${DOTNET_ROOT}" --no-path \
    && rm /tmp/dotnet-install.sh

# Optional: verify installation
RUN dotnet --info && dotnet --list-sdks && dotnet --list-runtimes

I inserted these changes right before the WORKDIR is set.

2) A simple build + “portable image” workflow

I also added a build.sh to my OpenClaw clone to build the image and save it to a known location as a compressed tarball. This makes it easy to move/load the image elsewhere.

#!/bin/bash
set -euo pipefail

IMAGE_NAME="${1:-openclaw:latest}"
BACKUP_PATH="${2:-/mnt/scratch/openclaw.tar.gz}"
REPO_PATH="${3:-$(pwd)}"
PRUNE_CONFIRM="${4:-no}"

echo "Image: $IMAGE_NAME"
echo "Backup: $BACKUP_PATH"
echo "Repository path: $REPO_PATH"

if ! command -v docker &> /dev/null; then
  echo "Error: docker command not found"
  exit 1
fi

cd "$REPO_PATH"

# Only prune when explicitly asked (4th argument), since `prune -a -f`
# deletes every unused image on the host
if [ "$PRUNE_CONFIRM" = "yes" ]; then
  echo "Pruning Docker system (auto-confirmed)..."
  docker system prune -a -f
fi

echo "Building OpenClaw Docker image '$IMAGE_NAME'..."
docker build -t "$IMAGE_NAME" -f Dockerfile .

echo "Backing up image '$IMAGE_NAME' to '$BACKUP_PATH'..."
mkdir -p "$(dirname "$BACKUP_PATH")"
docker save "$IMAGE_NAME" | gzip > "$BACKUP_PATH"

echo "Done! Image saved to '$BACKUP_PATH'."
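One bash idiom the script leans on, in case it looks cryptic: `${N:-default}` parameter expansion, which makes every positional argument optional. A minimal sketch:

```shell
#!/bin/sh
# ${1:-default} substitutes the default when the argument is unset or empty
demo() {
  image="${1:-openclaw:latest}"
  backup="${2:-/mnt/scratch/openclaw.tar.gz}"
  echo "$image -> $backup"
}
demo            # prints "openclaw:latest -> /mnt/scratch/openclaw.tar.gz"
demo myimg:dev  # prints "myimg:dev -> /mnt/scratch/openclaw.tar.gz"
```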

To load it back into Docker later:

gunzip -c /mnt/scratch/openclaw.tar.gz | docker load
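Before shipping the tarball to another machine, `gzip -t` is a cheap integrity check. Shown here against a throwaway archive just to illustrate the idea; the same check applies to the real tarball before `docker load`:

```shell
# gzip -t verifies an archive without decompressing it to disk
printf 'hello' > /tmp/demo.txt
gzip -c /tmp/demo.txt > /tmp/demo.gz
if gzip -t /tmp/demo.gz; then
  echo "archive OK"
fi
```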

3) Gotchas: models (Azure AI Foundry) and LiteLLM

I wanted to use Azure OpenAI / AI Foundry hosted models. In my case, OpenClaw didn’t support that target directly, so I introduced a LiteLLM container as a proxy layer.

That meant:

  • Running LiteLLM alongside OpenClaw (e.g., in docker-compose).
  • Pointing OpenClaw’s model endpoint at LiteLLM.
  • Extra .env configuration.

One non-obvious hiccup: LiteLLM config didn’t behave like “standard” YAML env-var expansion in my first attempt, so it took some trial-and-error to get the env wiring correct.
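Concretely: LiteLLM does not do shell-style `${VAR}` interpolation in its YAML. It has its own `os.environ/NAME` indirection, which the proxy resolves from the environment at load time:

```yaml
# Wrong (my first attempt): shell-style interpolation is passed through literally
api_key: "${AZURE_OPENAI_API_KEY}"

# Right: LiteLLM's own env indirection, resolved by the proxy at startup
api_key: "os.environ/AZURE_OPENAI_API_KEY"
```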

4) Discord setup: “guilds”, bots, and channel IDs

Discord setup is doable, but it is a bit tedious the first time:

  • Create a Discord server (aka “guild”).
  • Create a Discord application.
  • Create a bot inside the application.
  • Invite the bot to your server and grant permissions.
  • Use the channel ID for configuration (the docs sometimes talk in friendly names, but IDs are what you often need).
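On the channel-ID point: IDs are Discord "snowflakes", i.e. long decimal numbers (enable Developer Mode in Discord's advanced settings, then right-click a channel and Copy Channel ID). A throwaway sanity check before pasting one into config; the 17–20 digit range here is my assumption based on current snowflake sizes:

```shell
# Returns success only for strings that look like Discord snowflake IDs
is_snowflake() {
  case "$1" in
    ''|*[!0-9]*) return 1 ;;                       # empty or contains non-digits
    *) [ "${#1}" -ge 17 ] && [ "${#1}" -le 20 ] ;;
  esac
}
is_snowflake "123456789012345678" && echo "looks like a channel ID"
is_snowflake "help" || echo "'help' is a channel name, not an ID"
```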

Docs I used: https://docs.openclaw.ai/channels/discord

5) Skills: “blocked” doesn’t mean broken

The last thing that surprised me: on first boot, a bunch of skills show as blocked. That’s not necessarily a bug – often it’s simply missing binaries in the container. After installing tools (and then enabling the skill(s) in openclaw.json), they started working as expected.
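A quick way to tell whether a "blocked" skill is just missing a binary is to check from inside the container. The tool list below is the one my setup needed; yours will differ:

```shell
# Run inside the container (e.g. via `docker exec -it openclaw-gateway sh`)
for tool in gh ffmpeg dotnet; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```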

Next up

  • Write down my final LiteLLM config + environment wiring in a way that’s repeatable.
  • Document the exact OpenClaw config changes I made for skills and channels.

.env (example)

I keep the actual values in .env and only check in an .env.example. Here’s a simplified (redacted) example of the variables I ended up needing:

SERVER_TZ=America/Los_Angeles
DOCKER_DATA=/path/to/docker-data

# OpenClaw ⇄ LiteLLM / Azure OpenAI
OPENCLAW_AZURE_MODEL=gpt-5.2
OPENCLAW_AZURE_KEY=REDACTED
OPENCLAW_AZURE_ENDPOINT=https://YOUR-RESOURCE.openai.azure.com/
OPENCLAW_AZURE_DEPLOYMENT=YOUR-DEPLOYMENT
OPENCLAW_AZURE_API_VERSION=2024-xx-xx

# LiteLLM
LITELLM_MASTER_KEY=REDACTED
LITELLM_AZURE_MODEL=gpt-5.2
LITELLM_AZURE_DEPLOYMENT=YOUR-DEPLOYMENT

# (Optional) Claude keys if you use them
CLAUDE_AI_SESSION_KEY=REDACTED
CLAUDE_WEB_SESSION_KEY=REDACTED
CLAUDE_WEB_COOKIE=REDACTED

Sample configuration (redacted)

Below are the configs I ended up with. Anything secret (tokens/keys) has been redacted.

openclaw.json
{
  "meta": {
    "lastTouchedVersion": "2026.2.4",
    "lastTouchedAt": "2026-02-06T20:34:39.667Z"
  },
  "models": {
    "mode": "merge",
    "providers": {
      "litellm": {
        "baseUrl": "http://litellm:4000/v1",
        "apiKey": "REDACTED",
        "api": "openai-completions",
        "models": [
          {
            "id": "gpt-5.2",
            "name": "GPT-5.2 (LiteLLM/Azure)",
            "reasoning": false,
            "input": [
              "text",
              "image"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "litellm/gpt-5.2"
      },
      "models": {
        "litellm/${OPENCLAW_AZURE_MODEL}": {
          "alias": "GPT-5.2 (LiteLLM/Azure)"
        }
      },
      "workspace": "/home/node/.openclaw/workspace",
      "compaction": {
        "mode": "safeguard"
      },
      "maxConcurrent": 4,
      "subagents": {
        "maxConcurrent": 8
      }
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "channels": {
    "discord": {
      "enabled": true,
      "groupPolicy": "allowlist",
      "retry": {
        "attempts": 3,
        "minDelayMs": 500,
        "maxDelayMs": 30000,
        "jitter": 0.1
      },
      "dm": {
        "enabled": false,
        "policy": "pairing"
      },
      "guilds": {
        "YOUR_SERVER_ID": {
          "requireMention": true,
          "users": [
            "YOUR_USER_ID"
          ],
          "channels": {
            "CHANNEL_ID_NO_MENTION": {
              "allow": true,
              "requireMention": false
            },
            "CHANNEL_ID_REQUIRE_MENTION": {
              "allow": true,
              "requireMention": true
            },
            "help": {
              "allow": true,
              "requireMention": true
            }
          }
        }
      }
    }
  },
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "lan",
    "auth": {
      "mode": "token",
      "token": "REDACTED"
    }
  },
  "skills": {
    "allowBundled": [
      "gemini",
      "github",
      "weather"
    ],
    "load": {
      "extraDirs": [
        "~/Projects/agent-scripts/skills",
        "~/Projects/oss/some-skill-pack/skills"
      ],
      "watch": true,
      "watchDebounceMs": 250
    },
    "install": {
      "preferBrew": false,
      "nodeManager": "npm"
    },
    "entries": {
      "gemini": {
        "enabled": true
      },
      "github": {
        "enabled": true
      },
      "weather": {
        "enabled": true
      }
    }
  },
  "plugins": {
    "entries": {
      "discord": {
        "enabled": true
      }
    }
  }
}

litellm config.yml
model_list:
  - model_name: "os.environ/OPENCLAW_AZURE_MODEL"
    litellm_params:
      model: "os.environ/LITELLM_AZURE_DEPLOYMENT"
      base_model: "os.environ/LITELLM_AZURE_MODEL"
      api_base: "os.environ/AZURE_OPENAI_API_BASE"
      api_key: "os.environ/AZURE_OPENAI_API_KEY"
      api_version: "os.environ/AZURE_OPENAI_API_VERSION"

general_settings:
  master_key: "os.environ/LITELLM_MASTER_KEY"

litellm_settings:
  drop_params: true

docker-compose.yml excerpt
openclaw:
  image: openclaw:latest
  container_name: openclaw-gateway
  restart: unless-stopped
  env_file:
    - .env
  ports:
    - 18789:18789  # OpenClaw UI
  volumes:
    - ${DOCKER_DATA}/openclaw/config:/home/node/.openclaw
    - ${DOCKER_DATA}/openclaw/workspace:/home/node/.openclaw/workspace
  environment:
    - PUID=1000
    - PGID=1000
    - TZ=${SERVER_TZ}
    # Optional: forward Claude API keys to container
    - CLAUDE_AI_SESSION_KEY=${CLAUDE_AI_SESSION_KEY}
    - CLAUDE_WEB_SESSION_KEY=${CLAUDE_WEB_SESSION_KEY}
    - CLAUDE_WEB_COOKIE=${CLAUDE_WEB_COOKIE}
    # Azure OpenAI (namespaced for OpenClaw)
    - AZURE_OPENAI_KEY=${OPENCLAW_AZURE_KEY}
    - AZURE_OPENAI_ENDPOINT=${OPENCLAW_AZURE_ENDPOINT}
    - AZURE_OPENAI_DEPLOYMENT=${OPENCLAW_AZURE_DEPLOYMENT}
    - AZURE_OPENAI_API_VERSION=${OPENCLAW_AZURE_API_VERSION}
  depends_on:
    - litellm
  stdin_open: true
  tty: true

litellm:
  image: docker.litellm.ai/berriai/litellm:main-latest
  container_name: litellm-proxy
  restart: unless-stopped
  env_file:
    - .env
  ports:
    - "4000:4000"
  volumes:
    - ${DOCKER_DATA}/litellm/config.yml:/app/config.yml
  command:
    - "--config=/app/config.yml"
  environment:
    - PUID=1000
    - PGID=1000
    - TZ=${SERVER_TZ}
    - AZURE_OPENAI_API_KEY=${OPENCLAW_AZURE_KEY}
    - AZURE_OPENAI_API_BASE=${OPENCLAW_AZURE_ENDPOINT}
    - AZURE_OPENAI_API_VERSION=${OPENCLAW_AZURE_API_VERSION}
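One refinement I haven't wired in yet: `depends_on` only orders startup, it doesn't wait for LiteLLM to actually be ready. A healthcheck sketch, which assumes LiteLLM's `/health/liveliness` endpoint and that `curl` exists in the image (both worth verifying against your version):

```yaml
litellm:
  # ... as above ...
  healthcheck:
    test: ["CMD", "curl", "-fsS", "http://localhost:4000/health/liveliness"]
    interval: 30s
    timeout: 5s
    retries: 3

openclaw:
  # ... as above ...
  depends_on:
    litellm:
      condition: service_healthy
```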

Anyway, overall I’m impressed. Once the basics were wired up, it was surprisingly capable. And yes, it also helped me write this post… after I finally gave it application access. 🙂
