feat!: breaking change - OpenRouter models are now read from conf/openrouter_models.json, while Custom / Self-hosted models are read from conf/custom_models.json

feat: Azure OpenAI / Azure AI Foundry support. Models should be defined in conf/azure_models.json (or a custom path). See .env.example for the environment variables, or see the README. https://github.com/BeehiveInnovations/zen-mcp-server/issues/265

feat: OpenRouter / Custom Models / Azure can now each use a custom config path (see .env.example)

refactor: the model registry class is now abstract; the OpenRouter / Custom Provider / Azure OpenAI registries subclass it

refactor: breaking change: the `is_custom` property has been removed from model_capabilities.py (and thus from custom_models.json), since each provider's models are now read from a separate configuration file
@@ -181,7 +181,7 @@ class TestModelEnumeration:
         # Configure environment with OpenRouter access only
         self._setup_environment({"OPENROUTER_API_KEY": "test-openrouter-key"})
 
-        # Create a temporary custom model config with a free variant
+        # Create a temporary OpenRouter model config with a free variant
         custom_config = {
             "models": [
                 {
@@ -199,9 +199,9 @@ class TestModelEnumeration:
             ]
         }
 
-        config_path = tmp_path / "custom_models.json"
+        config_path = tmp_path / "openrouter_models.json"
         config_path.write_text(json.dumps(custom_config), encoding="utf-8")
-        monkeypatch.setenv("CUSTOM_MODELS_CONFIG_PATH", str(config_path))
+        monkeypatch.setenv("OPENROUTER_MODELS_CONFIG_PATH", str(config_path))
 
         # Reset cached registries so the temporary config is loaded
        from tools.shared.base_tool import BaseTool
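The pattern the test exercises can be sketched standalone: write a temporary model config and point the registry at it via the `OPENROUTER_MODELS_CONFIG_PATH` environment variable introduced by this commit. This is a minimal sketch; the `"model_name"` field in the JSON is an assumed illustrative schema, not the project's full capability format.

```python
# Sketch: redirect the OpenRouter model registry to a temporary config file
# via OPENROUTER_MODELS_CONFIG_PATH, mirroring the test change above.
# The "models" entry schema here is illustrative (assumed), not exhaustive.
import json
import os
import tempfile
from pathlib import Path


def write_temp_openrouter_config(tmp_dir: str) -> Path:
    """Write a minimal openrouter_models.json into tmp_dir and return its path."""
    config = {
        "models": [
            {
                # Hypothetical free-variant entry, as in the test above
                "model_name": "test/model:free",
            }
        ]
    }
    config_path = Path(tmp_dir) / "openrouter_models.json"
    config_path.write_text(json.dumps(config), encoding="utf-8")
    return config_path


with tempfile.TemporaryDirectory() as tmp_dir:
    path = write_temp_openrouter_config(tmp_dir)
    # Point the registry at the temporary file (in a pytest test you would
    # use monkeypatch.setenv so the change is undone automatically)
    os.environ["OPENROUTER_MODELS_CONFIG_PATH"] = str(path)
    loaded = json.loads(path.read_text(encoding="utf-8"))
```

In the actual test, `tmp_path` and `monkeypatch` fixtures replace the manual tempdir and `os.environ` handling, so cleanup is automatic.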