The `openclaw` runner connects to a locally running OpenClaw daemon that exposes an AI gateway on a configurable port (default `18789`). OpenClaw manages its own configuration entirely via its own setup wizard — RemoteAgent.CHAT does not configure OpenClaw settings.
Requirements
- OpenClaw installed and its daemon running before `remoteagent init`
Setting up OpenClaw first
Before running `remoteagent init`, you must set up OpenClaw independently:
`openclaw onboard --install-daemon` will walk you through OpenClaw’s own configuration wizard and install the background daemon. Once the daemon is running, you can proceed to initialize RemoteAgent.CHAT.
Setup
Once the OpenClaw daemon is running, initialize the agent with `remoteagent init` and select `openclaw` when prompted by the interactive wizard.
During `remoteagent init`, the wizard will:
- Ask for the gateway port (default `18789`)
- Check whether the OpenClaw daemon is reachable on that port
- If the daemon is not found: display setup instructions and offer to exit or continue anyway
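The reachability check can be approximated with a plain TCP probe. This is a sketch under the assumption that "reachable" means the port accepts connections; RemoteAgent.CHAT's actual wizard may issue an HTTP request instead:

```python
import socket


def daemon_reachable(port: int = 18789, host: str = "localhost",
                     timeout: float = 2.0) -> bool:
    """Return True if something is accepting TCP connections on the gateway port."""
    try:
        # create_connection resolves the host and attempts a full TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # connection refused, timed out, or host unreachable
        return False
```

A wizard built on this helper would print the setup instructions above whenever `daemon_reachable(port)` returns `False`.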
How it works
The runner sends POST requests to `http://localhost:{port}/v1/chat/completions` using the OpenAI API format. The port is configurable and saved in `~/.remoteagent/agents/{agentId}.json`. The prompt is sent as a user message. Response chunks are streamed via server-sent events (SSE) and forwarded to Telegram as they arrive.
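The exact schema of the per-agent config file is not documented here; assuming only the port matters to this runner, a hypothetical file (both keys are illustrative, not confirmed) might look like:

```json
{
  "runner": "openclaw",
  "port": 18789
}
```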
Example request body sent by the runner:
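Assuming the standard OpenAI chat-completions shape with streaming enabled (the exact fields the runner sends are an assumption beyond what is described above), the body would resemble:

```json
{
  "model": "default",
  "messages": [
    { "role": "user", "content": "Summarize today's standup notes" }
  ],
  "stream": true
}
```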
The `model` field defaults to `"default"` — OpenClaw is responsible for routing that to the appropriate backend model.
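OpenAI-style SSE streams deliver JSON chunks on `data:` lines and end with a `data: [DONE]` sentinel. Extracting the text fragments to forward can be sketched like this (a simplified parser assuming that chunk shape, not the runner's actual code):

```python
import json
from typing import Iterable, Iterator


def iter_deltas(sse_lines: Iterable[str]) -> Iterator[str]:
    """Yield content fragments from OpenAI-style SSE 'data:' lines."""
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and non-data fields
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]
```

Each yielded fragment could then be appended to the in-progress Telegram message as it arrives.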
Use cases
- Local model routing — run multiple models and route based on task type
- Cost control — apply rate limiting or budget caps at the gateway layer
- Custom middleware — inject system prompts, logging, or content filters before requests reach the model
- Air-gapped environments — run entirely offline with no external API calls
Pros and cons
| Pros | Cons |
|---|---|
| Full control over the AI backend | Requires running and maintaining the OpenClaw daemon |
| Works with any OpenAI-compatible model | OpenClaw daemon must be running before the agent starts |
| Air-gap compatible | More setup complexity |
| No API key managed by RemoteAgent.CHAT | — |
| OpenClaw handles all its own configuration | — |