
{
"title": "MiniMax",
"content": "MiniMax is an AI company that builds the **M2/M2.1** model family. The current\ncoding-focused release is **MiniMax M2.1** (December 23, 2025), built for\nreal-world complex tasks.\n\nSource: [MiniMax M2.1 release note](https://www.minimax.io/news/minimax-m21)\n\n## Model overview (M2.1)\n\nMiniMax highlights these improvements in M2.1:\n\n* Stronger **multi-language coding** (Rust, Java, Go, C++, Kotlin, Objective-C, TS/JS).\n* Better **web/app development** and aesthetic output quality (including native mobile).\n* Improved **composite instruction** handling for office-style workflows, building on\n interleaved thinking and integrated constraint execution.\n* **More concise responses** with lower token usage and faster iteration loops.\n* Stronger **tool/agent framework** compatibility and context management (Claude Code,\n Droid/Factory AI, Cline, Kilo Code, Roo Code, BlackBox).\n* Higher-quality **dialogue and technical writing** outputs.\n\n## MiniMax M2.1 vs MiniMax M2.1 Lightning\n\n* **Speed:** Lightning is the “fast” variant in MiniMax's pricing docs.\n* **Cost:** Pricing shows the same input cost, but Lightning has higher output cost.\n* **Coding plan routing:** The Lightning back-end isn't directly available on the MiniMax\n coding plan. 
MiniMax auto-routes most requests to Lightning, but falls back to the\n regular M2.1 back-end during traffic spikes.\n\n## Choose a setup\n\n### MiniMax OAuth (Coding Plan) — recommended\n\n**Best for:** quick setup with MiniMax Coding Plan via OAuth, no API key required.\n\nEnable the bundled OAuth plugin and authenticate:\n\nYou will be prompted to select an endpoint:\n\n* **Global** - International users (`api.minimax.io`)\n* **CN** - Users in China (`api.minimaxi.com`)\n\nSee [MiniMax OAuth plugin README](https://github.com/openclaw/openclaw/tree/main/extensions/minimax-portal-auth) for details.\n\n### MiniMax M2.1 (API key)\n\n**Best for:** hosted MiniMax with Anthropic-compatible API.\n\nConfigure via CLI:\n\n* Run `openclaw configure`\n* Select **Model/auth**\n* Choose **MiniMax M2.1**\n\n### MiniMax M2.1 as fallback (Opus primary)\n\n**Best for:** keep Opus 4.5 as primary, fail over to MiniMax M2.1.\n\n### Optional: Local via LM Studio (manual)\n\n**Best for:** local inference with LM Studio.\nWe have seen strong results with MiniMax M2.1 on powerful hardware (e.g. a\ndesktop/server) using LM Studio's local server.\n\nConfigure manually via `openclaw.json`:\n\n## Configure via `openclaw configure`\n\nUse the interactive config wizard to set MiniMax without editing JSON:\n\n1. Run `openclaw configure`.\n2. Select **Model/auth**.\n3. Choose **MiniMax M2.1**.\n4. 
Pick your default model when prompted.\n\n## Configuration options\n\n* `models.providers.minimax.baseUrl`: prefer `https://api.minimax.io/anthropic` (Anthropic-compatible); `https://api.minimax.io/v1` is optional for OpenAI-compatible payloads.\n* `models.providers.minimax.api`: prefer `anthropic-messages`; `openai-completions` is optional for OpenAI-compatible payloads.\n* `models.providers.minimax.apiKey`: MiniMax API key (`MINIMAX_API_KEY`).\n* `models.providers.minimax.models`: define `id`, `name`, `reasoning`, `contextWindow`, `maxTokens`, `cost`.\n* `agents.defaults.models`: alias models you want in the allowlist.\n* `models.mode`: keep `merge` if you want to add MiniMax alongside built-ins.\n\n## Notes\n\n* Model refs are `minimax/<model>`.\n* Coding Plan usage API: `https://api.minimaxi.com/v1/api/openplatform/coding_plan/remains` (requires a coding plan key).\n* Update pricing values in `models.json` if you need exact cost tracking.\n* Referral link for MiniMax Coding Plan (10% off): [https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb\\&source=link](https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb\\&source=link)\n* See [/concepts/model-providers](/concepts/model-providers) for provider rules.\n* Use `openclaw models list` and `openclaw models set minimax/MiniMax-M2.1` to switch.\n\n## Troubleshooting\n\n### “Unknown model: minimax/MiniMax-M2.1”\n\nThis usually means the **MiniMax provider isn't configured** (no provider entry\nand no MiniMax auth profile/env key found). A fix for this detection is in\n**2026.1.12** (unreleased at the time of writing). 
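If you add the `models.providers.minimax` block by hand, a minimal `openclaw.json` sketch might look like this (field names are taken from the configuration options above; the `apiKey` placeholder and the single model entry are illustrative assumptions, not a complete config):\n\n```json\n{\n  \"models\": {\n    \"mode\": \"merge\",\n    \"providers\": {\n      \"minimax\": {\n        \"baseUrl\": \"https://api.minimax.io/anthropic\",\n        \"api\": \"anthropic-messages\",\n        \"apiKey\": \"<your MiniMax API key>\",\n        \"models\": [{ \"id\": \"MiniMax-M2.1\", \"name\": \"MiniMax M2.1\" }]\n      }\n    }\n  }\n}\n```\n\n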
Fix by:\n\n* Upgrading to **2026.1.12** (or running from source `main`), then restarting the gateway, or\n* Running `openclaw configure` and selecting **MiniMax M2.1**, or\n* Adding the `models.providers.minimax` block manually, or\n* Setting `MINIMAX_API_KEY` (or a MiniMax auth profile) so the provider can be injected.\n\nModel ids are **case-sensitive**:\n\n* `minimax/MiniMax-M2.1`\n* `minimax/MiniMax-M2.1-lightning`",
"code_samples": [
{
"code": "You will be prompted to select an endpoint:\n\n* **Global** - International users (`api.minimax.io`)\n* **CN** - Users in China (`api.minimaxi.com`)\n\nSee [MiniMax OAuth plugin README](https://github.com/openclaw/openclaw/tree/main/extensions/minimax-portal-auth) for details.\n\n### MiniMax M2.1 (API key)\n\n**Best for:** hosted MiniMax with Anthropic-compatible API.\n\nConfigure via CLI:\n\n* Run `openclaw configure`\n* Select **Model/auth**\n* Choose **MiniMax M2.1**",
"language": "unknown"
},
{
"code": "### MiniMax M2.1 as fallback (Opus primary)\n\n**Best for:** keep Opus 4.5 as primary, fail over to MiniMax M2.1.",
"language": "unknown"
},
{
"code": "### Optional: Local via LM Studio (manual)\n\n**Best for:** local inference with LM Studio.\nWe have seen strong results with MiniMax M2.1 on powerful hardware (e.g. a\ndesktop/server) using LM Studio's local server.\n\nConfigure manually via `openclaw.json`:",
"language": "unknown"
},
{
"code": "## Configure via `openclaw configure`\n\nUse the interactive config wizard to set MiniMax without editing JSON:\n\n1. Run `openclaw configure`.\n2. Select **Model/auth**.\n3. Choose **MiniMax M2.1**.\n4. Pick your default model when prompted.\n\n## Configuration options\n\n* `models.providers.minimax.baseUrl`: prefer `https://api.minimax.io/anthropic` (Anthropic-compatible); `https://api.minimax.io/v1` is optional for OpenAI-compatible payloads.\n* `models.providers.minimax.api`: prefer `anthropic-messages`; `openai-completions` is optional for OpenAI-compatible payloads.\n* `models.providers.minimax.apiKey`: MiniMax API key (`MINIMAX_API_KEY`).\n* `models.providers.minimax.models`: define `id`, `name`, `reasoning`, `contextWindow`, `maxTokens`, `cost`.\n* `agents.defaults.models`: alias models you want in the allowlist.\n* `models.mode`: keep `merge` if you want to add MiniMax alongside built-ins.\n\n## Notes\n\n* Model refs are `minimax/<model>`.\n* Coding Plan usage API: `https://api.minimaxi.com/v1/api/openplatform/coding_plan/remains` (requires a coding plan key).\n* Update pricing values in `models.json` if you need exact cost tracking.\n* Referral link for MiniMax Coding Plan (10% off): [https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb\\&source=link](https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb\\&source=link)\n* See [/concepts/model-providers](/concepts/model-providers) for provider rules.\n* Use `openclaw models list` and `openclaw models set minimax/MiniMax-M2.1` to switch.\n\n## Troubleshooting\n\n### “Unknown model: minimax/MiniMax-M2.1”\n\nThis usually means the **MiniMax provider isn't configured** (no provider entry\nand no MiniMax auth profile/env key found). A fix for this detection is in\n**2026.1.12** (unreleased at the time of writing). 
Fix by:\n\n* Upgrading to **2026.1.12** (or running from source `main`), then restarting the gateway, or\n* Running `openclaw configure` and selecting **MiniMax M2.1**, or\n* Adding the `models.providers.minimax` block manually, or\n* Setting `MINIMAX_API_KEY` (or a MiniMax auth profile) so the provider can be injected.\n\nModel ids are **case-sensitive**:\n\n* `minimax/MiniMax-M2.1`\n* `minimax/MiniMax-M2.1-lightning`\n\nThen recheck with:",
"language": "unknown"
}
],
"headings": [
{
"level": "h2",
"text": "Model overview (M2.1)",
"id": "model-overview-(m2.1)"
},
{
"level": "h2",
"text": "MiniMax M2.1 vs MiniMax M2.1 Lightning",
"id": "minimax-m2.1-vs-minimax-m2.1-lightning"
},
{
"level": "h2",
"text": "Choose a setup",
"id": "choose-a-setup"
},
{
"level": "h3",
"text": "MiniMax OAuth (Coding Plan) — recommended",
"id": "minimax-oauth-(coding-plan)-—-recommended"
},
{
"level": "h3",
"text": "MiniMax M2.1 (API key)",
"id": "minimax-m2.1-(api-key)"
},
{
"level": "h3",
"text": "MiniMax M2.1 as fallback (Opus primary)",
"id": "minimax-m2.1-as-fallback-(opus-primary)"
},
{
"level": "h3",
"text": "Optional: Local via LM Studio (manual)",
"id": "optional:-local-via-lm-studio-(manual)"
},
{
"level": "h2",
"text": "Configure via `openclaw configure`",
"id": "configure-via-`openclaw-configure`"
},
{
"level": "h2",
"text": "Configuration options",
"id": "configuration-options"
},
{
"level": "h2",
"text": "Notes",
"id": "notes"
},
{
"level": "h2",
"text": "Troubleshooting",
"id": "troubleshooting"
},
{
"level": "h3",
"text": "“Unknown model: minimax/MiniMax-M2.1”",
"id": "“unknown-model:-minimax/minimax-m2.1”"
}
],
"url": "llms-txt#minimax",
"links": []
}