docs: instructions for OpenCode
@@ -172,6 +172,31 @@ Create or edit `~/.qwen/settings.json`:

Replace the placeholder API key with keys for the providers you use (Gemini, OpenAI, OpenRouter, etc.).

**For OpenCode CLI:**

Edit `~/.config/opencode/opencode.json`:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "zen": {
      "type": "local",
      "command": [
        "/path/to/zen-mcp-server/.zen_venv/bin/python",
        "/path/to/zen-mcp-server/server.py"
      ],
      "cwd": "/path/to/zen-mcp-server",
      "enabled": true,
      "environment": {
        "GEMINI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Add any other API keys you rely on (`OPENAI_API_KEY`, `OPENROUTER_API_KEY`, etc.).

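If you manage your dotfiles with a script, the same `zen` entry can be generated and round-tripped through JSON to catch structural mistakes before OpenCode reads it. A minimal Python sketch, using only the placeholder path and key from the example above (substitute your real values):

```python
import json

# Placeholder install path from the example above -- replace with your real path.
ZEN_DIR = "/path/to/zen-mcp-server"

config = {
    "$schema": "https://opencode.ai/config.json",
    "mcp": {
        "zen": {
            "type": "local",
            # Launch the server with the interpreter inside its own venv.
            "command": [f"{ZEN_DIR}/.zen_venv/bin/python", f"{ZEN_DIR}/server.py"],
            "cwd": ZEN_DIR,
            "enabled": True,
            # Placeholder key -- add any other provider keys you use here.
            "environment": {"GEMINI_API_KEY": "your_api_key_here"},
        }
    },
}

# Serialize; writing `text` to ~/.config/opencode/opencode.json is left to you.
text = json.dumps(config, indent=2)
print(text)
```

The `$schema` line lets editors with JSON-schema support flag typos in key names as you edit the file by hand later.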
#### IDE Clients (Cursor & VS Code)

Zen works in GUI IDEs that speak MCP. The configuration mirrors the CLI examples above: point the client at the `uvx` launcher and set any required environment variables.

@@ -293,6 +318,11 @@ CUSTOM_MODEL_NAME=llama3.2 # Default model name

2. Run `qwen mcp list --scope user` and confirm `zen` shows `CONNECTED`.
3. Try: `"/mcp"` to inspect available tools or `"Use zen to analyze this repo"`.

### For OpenCode CLI:

1. Restart OpenCode (or run `OpenCode: Reload Config`).
2. Open **Settings › Tools › MCP** and confirm `zen` is enabled.
3. Start a new chat and try: `"Use zen to list available models"`.

### For Codex CLI:

1. Restart Codex CLI if it is running.
2. Open a new conversation.
