Codex CLI / Codex App
The Codex client supports two wire protocols: responses and chat. First write the configuration into ~/.codex/config.toml (Windows: %USERPROFILE%\.codex\config.toml), then set XAI_API_KEY in your terminal and run the matching launch command. The responses example below is based on a working config.toml template; local-project [projects."..."] sections are omitted.
Order of use: copy the TOML for Option A or Option B into the config file, then copy the launch command for your system.
# Option A (responses)
model_provider = "xai"
model = "gpt-5.4"
model_reasoning_effort = "xhigh"
plan_mode_reasoning_effort = "xhigh"
model_reasoning_summary = "none"
model_verbosity = "medium"
model_context_window = 1050000
model_auto_compact_token_limit = 945000
tool_output_token_limit = 6000
approval_policy = "never"
sandbox_mode = "danger-full-access"
[model_providers.xai]
name = "xai"
base_url = ""
wire_api = "responses"
requires_openai_auth = false
env_key = "XAI_API_KEY"
# Option B (chat)
approval_policy = "never"
sandbox_mode = "danger-full-access"
[model_providers.xai]
name = "xai"
base_url = ""
env_key = "XAI_API_KEY"
wire_api = "chat"
requires_openai_auth = false
[profiles.minimax]
model = "MiniMax-M2.5"
model_provider = "xai"
Linux / macOS (bash):
export XAI_API_KEY="sk-Xvs..."
# Option A (responses)
codex
# Option B (chat)
codex --profile minimax
Windows (cmd):
set XAI_API_KEY=sk-Xvs...
:: Option A (responses)
codex
:: Option B (chat)
codex --profile minimax
Windows (PowerShell):
$env:XAI_API_KEY="sk-Xvs..."
# Option A (responses)
codex
# Option B (chat)
codex --profile minimax
Verify with: codex (responses) or codex --profile minimax (chat)
Claude Code (gpt-5.4)
Claude Code connects primarily through environment variables. The examples below map Claude's default models to gpt-5.4; the gateway address goes in ANTHROPIC_BASE_URL.
Order of use: copy the environment variables for your system, then run the launch command claude.
Linux / macOS (bash):
export XAI_API_KEY="sk-Xvs..."
export ANTHROPIC_AUTH_TOKEN="$XAI_API_KEY"
export ANTHROPIC_BASE_URL=""
# Optional: custom default-model mapping for Claude (works without it)
export ANTHROPIC_DEFAULT_OPUS_MODEL="gpt-5.4"
export ANTHROPIC_DEFAULT_SONNET_MODEL="gpt-5.4"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="gpt-5.4"
Windows (cmd):
set XAI_API_KEY=sk-Xvs...
set ANTHROPIC_AUTH_TOKEN=%XAI_API_KEY%
set ANTHROPIC_BASE_URL=
:: Optional: custom default-model mapping for Claude (works without it)
set ANTHROPIC_DEFAULT_OPUS_MODEL=gpt-5.4
set ANTHROPIC_DEFAULT_SONNET_MODEL=gpt-5.4
set ANTHROPIC_DEFAULT_HAIKU_MODEL=gpt-5.4
Windows (PowerShell):
$env:XAI_API_KEY="sk-Xvs..."
$env:ANTHROPIC_AUTH_TOKEN=$env:XAI_API_KEY
$env:ANTHROPIC_BASE_URL=""
# Optional: custom default-model mapping for Claude (works without it)
$env:ANTHROPIC_DEFAULT_OPUS_MODEL="gpt-5.4"
$env:ANTHROPIC_DEFAULT_SONNET_MODEL="gpt-5.4"
$env:ANTHROPIC_DEFAULT_HAIKU_MODEL="gpt-5.4"
Verify with: claude
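A small preflight check before running claude can catch a missing variable early. This is an illustrative sketch of our own, not part of Claude Code itself:

```python
import os

# The three variables the setup above requires.
REQUIRED = ["XAI_API_KEY", "ANTHROPIC_AUTH_TOKEN", "ANTHROPIC_BASE_URL"]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Example with a partial environment: only the API key is set.
print(missing_vars({"XAI_API_KEY": "sk-Xvs..."}))
# ['ANTHROPIC_AUTH_TOKEN', 'ANTHROPIC_BASE_URL']
```

Calling missing_vars() with no argument checks the real environment of the current shell.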
OpenCode (Responses: gpt-5.4 / Chat: MiniMax-M2.5)
For OpenCode, use the global config file ~/.config/opencode/opencode.jsonc (Windows: %USERPROFILE%\.config\opencode\opencode.jsonc). Write the JSONC for Option A or Option B into the file, then set XAI_API_KEY for your system and run the verification commands.
Order of use: pick the API option (A = Responses, B = Chat) and write it to the config file, then copy the terminal commands for your system.
Option A (Responses):
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.4",
  "small_model": "openai/gpt-5.4",
  "provider": {
    "openai": {
      "options": {
        "baseURL": "",
        "apiKey": "{env:XAI_API_KEY}"
      }
    }
  }
}
Option B (Chat):
{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai-chat/MiniMax-M2.5",
  "small_model": "xai-chat/MiniMax-M2.5",
  "provider": {
    "xai-chat": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "MiniMax-M2.5": {}
      }
    }
  }
}
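Both configs reference the key as {env:XAI_API_KEY}, which OpenCode resolves from the environment when it loads the file. How OpenCode implements this internally is not documented here; the substitution can be sketched as:

```python
import re

def expand_env_placeholders(text, env):
    """Replace {env:NAME} markers with env[NAME]; unknown names become empty."""
    return re.sub(
        r"\{env:([A-Za-z_][A-Za-z0-9_]*)\}",
        lambda m: env.get(m.group(1), ""),
        text,
    )

config = '{"apiKey": "{env:XAI_API_KEY}"}'
print(expand_env_placeholders(config, {"XAI_API_KEY": "sk-Xvs..."}))
# {"apiKey": "sk-Xvs..."}
```

The point of the placeholder is that the key never needs to be written into the config file itself.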
Linux / macOS (bash):
export XAI_API_KEY="sk-Xvs..."
opencode debug config
opencode run "Hello"
Windows (cmd):
set XAI_API_KEY=sk-Xvs...
opencode debug config
opencode run "Hello"
Windows (PowerShell):
$env:XAI_API_KEY="sk-Xvs..."
opencode debug config
opencode run "Hello"
Optional: call the gateway directly with curl to check both wire protocols.
Responses:
curl /responses \
  -H "Authorization: Bearer ${XAI_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.4",
    "input": "Explain the role of a microservice gateway in one sentence"
  }'
Chat Completions:
curl /chat/completions \
  -H "Authorization: Bearer ${XAI_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "MiniMax-M2.5",
    "messages": [{"role": "user", "content": "Explain the role of a microservice gateway in one sentence"}]
  }'
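The two curl calls differ only in endpoint and payload shape: the Responses API takes a flat input string, while Chat Completions takes a messages array. A sketch of the two request bodies (the helper names are ours, not part of any SDK):

```python
import json

def responses_payload(model, prompt):
    # Body for POST <base>/responses
    return {"model": model, "input": prompt}

def chat_payload(model, prompt):
    # Body for POST <base>/chat/completions
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

prompt = "Explain the role of a microservice gateway in one sentence"
print(json.dumps(responses_payload("gpt-5.4", prompt)))
print(json.dumps(chat_payload("MiniMax-M2.5", prompt)))
```

This shape difference is exactly what the wire_api ("responses" vs "chat") and api ("openai-responses" vs "openai-completions") switches in the configs above select between.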
Verify with: opencode debug config (checks the config) and opencode run "Hello" (checks a call)
OpenClaw
OpenClaw can talk to the OpenAI API and the Claude API, and can be extended with the OpenAI Responses API. XAI Router supports the OpenAI API and Claude API protocols by default; prefer Responses, i.e. api = "openai-responses".
Config file path: ~/.openclaw/openclaw.json on Linux / macOS, %USERPROFILE%\.openclaw\openclaw.json on Windows.
Order of use: write the JSON below into the config file, set XAI_API_KEY for your system, then run the verification command.
Option A (openai-responses, recommended):
{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/gpt-5.4" }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-responses",
        "models": [{ "id": "gpt-5.4", "name": "gpt-5.4" }]
      }
    }
  }
}
Option B (anthropic-messages):
{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/MiniMax-M2.5" }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "",
        "apiKey": "${XAI_API_KEY}",
        "api": "anthropic-messages",
        "models": [{ "id": "MiniMax-M2.5", "name": "MiniMax-M2.5" }]
      }
    }
  }
}
Option C (openai-completions):
{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/MiniMax-M2.5" }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-completions",
        "models": [{ "id": "MiniMax-M2.5", "name": "MiniMax-M2.5" }]
      }
    }
  }
}
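The ${XAI_API_KEY} placeholder in apiKey follows shell-style ${VAR} syntax. Assuming OpenClaw substitutes it from the environment at load time (an assumption; check OpenClaw's own docs), Python's string.Template shows the equivalent expansion:

```python
from string import Template

# Fragment of the config above; ${XAI_API_KEY} is a shell-style placeholder.
raw = '"apiKey": "${XAI_API_KEY}"'

# safe_substitute leaves unknown placeholders intact instead of raising.
expanded = Template(raw).safe_substitute({"XAI_API_KEY": "sk-Xvs..."})
print(expanded)  # "apiKey": "sk-Xvs..."
```

As with OpenCode, the upshot is that the key lives only in the environment, never in the JSON file.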
Linux / macOS (bash):
export XAI_API_KEY="sk-Xvs..."
Windows (cmd):
set XAI_API_KEY=sk-Xvs...
Windows (PowerShell):
$env:XAI_API_KEY="sk-Xvs..."
Verify with: openclaw models status