feat!: breaking change - OpenRouter models are now read from conf/openrouter_models.json while Custom / Self-hosted models are read from conf/custom_models.json
feat: Azure OpenAI / Azure AI Foundry support. Models should be defined in conf/azure_models.json (or a custom path). See .env.example or the README for the environment variables. https://github.com/BeehiveInnovations/zen-mcp-server/issues/265

feat: OpenRouter / Custom Models / Azure can each use a custom config path now (see .env.example)

refactor: model registry class made abstract; OpenRouter / Custom Provider / Azure OpenAI now subclass it

refactor!: breaking change: the `is_custom` property has been removed from model_capabilities.py (and thus from custom_models.json), since models are now read from separate configuration files
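For orientation, a minimal sketch of what an entry in conf/azure_models.json might look like under the new layout. The field names below (`model_name`, `deployment`, `aliases`, `context_window`) are assumptions for illustration only, not the confirmed schema; .env.example and the README are the authoritative reference.

```json
{
  "models": [
    {
      "model_name": "gpt-4o",
      "deployment": "my-gpt4o-deployment",
      "aliases": ["4o"],
      "context_window": 128000
    }
  ]
}
```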
@@ -91,8 +91,8 @@ OPENAI_ALLOWED_MODELS=o3,o4-mini
 
 **Important Notes:**
 
 - Restrictions apply to all usage including auto mode
-- `OPENROUTER_ALLOWED_MODELS` only affects OpenRouter models accessed via custom provider (where `is_custom: false` in custom_models.json)
-- Custom local models (`is_custom: true`) are not affected by any restrictions
+- `OPENROUTER_ALLOWED_MODELS` only affects models defined in `conf/openrouter_models.json`
+- Custom local models (from `conf/custom_models.json`) are not affected by OpenRouter restrictions
 
 ## Thinking Modes
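To illustrate the restriction behavior described in the hunk above, a hedged .env sketch. `OPENAI_ALLOWED_MODELS` and `OPENROUTER_ALLOWED_MODELS` appear in the diff itself; the commented config-path variable name is an assumption based on the commit description, so check .env.example for the real name.

```
# Restrictions apply to all usage, including auto mode
OPENAI_ALLOWED_MODELS=o3,o4-mini

# Only limits models defined in conf/openrouter_models.json;
# custom local models from conf/custom_models.json are unaffected
OPENROUTER_ALLOWED_MODELS=anthropic/claude-sonnet-4

# Custom config path (variable name is an assumption; see .env.example)
# OPENROUTER_MODELS_CONFIG_PATH=/path/to/openrouter_models.json
```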