feat!: OpenRouter models are now read from conf/openrouter_models.json, while Custom / Self-hosted models are read from conf/custom_models.json

feat: Azure OpenAI / Azure AI Foundry support. Models should be defined in conf/azure_models.json (or a custom path). See .env.example or the README for the relevant environment variables. https://github.com/BeehiveInnovations/zen-mcp-server/issues/265

feat: OpenRouter / Custom Models / Azure can each use a custom config path (see .env.example)

refactor: the model registry class is now abstract; the OpenRouter / Custom Provider / Azure OpenAI registries subclass it

refactor!: the `is_custom` property has been removed from model_capabilities.py (and therefore from custom_models.json), since each provider's models are now read from a separate configuration file
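A rough sketch of the registry split described above. The class names and the abstract property are illustrative assumptions, not the server's actual API; they only show the shape of the change (one abstract base, one subclass per provider, each owning its own config file):

```python
from abc import ABC, abstractmethod
from pathlib import Path


class ModelRegistryBase(ABC):
    """Hypothetical abstract registry; each provider owns its model config file."""

    @property
    @abstractmethod
    def default_config(self) -> Path:
        """JSON file this provider's models are read from."""


class OpenRouterModelRegistry(ModelRegistryBase):
    @property
    def default_config(self) -> Path:
        return Path("conf/openrouter_models.json")


class CustomModelRegistry(ModelRegistryBase):
    @property
    def default_config(self) -> Path:
        return Path("conf/custom_models.json")


class AzureModelRegistry(ModelRegistryBase):
    @property
    def default_config(self) -> Path:
        return Path("conf/azure_models.json")
```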
@@ -158,6 +158,8 @@ XAI_ALLOWED_MODELS=grok,grok-3-fast

```env
# Override default location of custom_models.json
CUSTOM_MODELS_CONFIG_PATH=/path/to/your/custom_models.json
# Override default location of openrouter_models.json
OPENROUTER_MODELS_CONFIG_PATH=/path/to/your/openrouter_models.json
```

**Conversation Settings:**
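The override variables added above might be consumed along these lines. This is a minimal sketch, assuming the server simply falls back to the bundled defaults when the variables are unset; `load_model_config` is a hypothetical helper for illustration, not part of the actual codebase:

```python
import json
import os
from pathlib import Path


def load_model_config(env_var: str, default_path: str) -> dict:
    """Read a provider's model config, honouring an env-var path override."""
    path = Path(os.getenv(env_var) or default_path)
    with path.open(encoding="utf-8") as fh:
        return json.load(fh)


# Falls back to the bundled conf/*.json files when the override
# variables are not set in the environment.
openrouter_models = load_model_config(
    "OPENROUTER_MODELS_CONFIG_PATH", "conf/openrouter_models.json"
)
custom_models = load_model_config(
    "CUSTOM_MODELS_CONFIG_PATH", "conf/custom_models.json"
)
```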
@@ -244,4 +246,4 @@ LOG_LEVEL=INFO

- **[Advanced Usage Guide](advanced-usage.md)** - Advanced model usage patterns, thinking modes, and power user workflows
- **[Context Revival Guide](context-revival.md)** - Conversation persistence and context revival across sessions
- **[AI-to-AI Collaboration Guide](ai-collaboration.md)** - Multi-model coordination and conversation threading