# Model Providers
OpenClaw supports multiple LLM providers. Users authenticate with their chosen provider and configure a default model using the format `provider/model`.
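As an illustrative sketch, a default-model entry in this format might look like the following (the key name and file syntax here are assumptions, not confirmed by this page; see the full Model Providers docs for the actual schema):

```
# "venice" is the provider, "llama-3.3-70b" the model
model = "venice/llama-3.3-70b"
```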
## Featured Provider
Venice AI is highlighted as the recommended privacy-first option, with `venice/llama-3.3-70b` as the default model and `venice/claude-opus-45` noted as "the strongest" choice.
## Setup Steps
Configuration requires two actions: authenticating with a provider (typically through `openclaw onboard`) and setting the default model in the configuration file using the specified format.
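A minimal sketch of these two steps (`openclaw onboard` comes from this page; the config file path and key name shown in the comment are illustrative assumptions):

```shell
# 1. Authenticate with your chosen provider (interactive onboarding)
openclaw onboard

# 2. Set the default model in the configuration file, e.g.
#    (hypothetical key — consult the provider docs for the real schema):
#    model = "venice/llama-3.3-70b"
```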
## Available Providers
The platform integrates with major services including OpenAI, Anthropic, Qwen, and OpenRouter, alongside specialized options like Cloudflare AI Gateway, Moonshot AI, and Amazon Bedrock. Local model support is available through Ollama.
## Additional Services
Deepgram provides audio transcription capabilities. In addition, a community tool enables Claude subscribers to expose their account as an OpenAI-compatible endpoint.
## Documentation Access
The complete provider catalog, including advanced configuration details, is available on the main Model Providers concepts page.