feat!: breaking change: OpenRouter models are now read from conf/openrouter_models.json, while Custom / Self-hosted models are read from conf/custom_models.json

feat: Azure OpenAI / Azure AI Foundry support. Models should be defined in conf/azure_models.json (or a custom path). See .env.example or the README for the relevant environment variables. https://github.com/BeehiveInnovations/zen-mcp-server/issues/265

feat: OpenRouter, Custom Models, and Azure can now each use a custom config path (see .env.example)

refactor: the model registry class is now abstract; OpenRouter, Custom Provider, and Azure OpenAI each subclass it

refactor!: breaking change: the `is_custom` property has been removed from model_capabilities.py (and thus from custom_models.json), since each provider's models are now read from a separate configuration file
@@ -366,8 +366,8 @@ class TestCustomProviderOpenRouterRestrictions:
         assert not provider.validate_model_name("sonnet")
         assert not provider.validate_model_name("haiku")

-        # Should still validate custom models (is_custom=true) regardless of restrictions
-        assert provider.validate_model_name("local-llama")  # This has is_custom=true
+        # Should still validate custom models defined in conf/custom_models.json
+        assert provider.validate_model_name("local-llama")

     @patch.dict(os.environ, {"OPENROUTER_ALLOWED_MODELS": "opus", "OPENROUTER_API_KEY": "test-key"})
     def test_custom_provider_openrouter_capabilities_restrictions(self):
@@ -389,7 +389,7 @@ class TestCustomProviderOpenRouterRestrictions:
         with pytest.raises(ValueError):
             provider.get_capabilities("haiku")

-        # Should still work for custom models (is_custom=true)
+        # Should still work for custom models
         capabilities = provider.get_capabilities("local-llama")
         assert capabilities.provider == ProviderType.CUSTOM
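The behavior the updated tests exercise can be sketched as follows. This is a hypothetical stand-in for the provider class, not the project's implementation: `OPENROUTER_ALLOWED_MODELS` appears in the tests above, but the class name, constructor, and return shape here are invented for illustration. The key point matches the diff: restrictions gate OpenRouter models only, while models from the custom registry always validate.

```python
import os


class RestrictedProvider:
    """Sketch: OPENROUTER_ALLOWED_MODELS restricts OpenRouter aliases,
    but never models loaded from conf/custom_models.json."""

    def __init__(self, openrouter_models, custom_models):
        self.openrouter_models = set(openrouter_models)
        self.custom_models = set(custom_models)

    def _allowed(self):
        # Comma-separated allow-list; empty means "no restriction".
        raw = os.environ.get("OPENROUTER_ALLOWED_MODELS", "")
        return {m.strip() for m in raw.split(",") if m.strip()}

    def validate_model_name(self, name):
        if name in self.custom_models:
            return True  # custom models bypass OpenRouter restrictions
        if name not in self.openrouter_models:
            return False
        allowed = self._allowed()
        return not allowed or name in allowed

    def get_capabilities(self, name):
        if not self.validate_model_name(name):
            raise ValueError(f"Model '{name}' is not allowed")
        return {"model_name": name}  # placeholder for a capabilities object
```

With `OPENROUTER_ALLOWED_MODELS=opus`, this sketch rejects "sonnet" and "haiku", accepts "opus", and still accepts "local-llama" because it comes from the custom registry, mirroring the assertions in the diff.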