refactor: move registries into a separate module and clean up code

fix: refactor the dial provider to follow the same pattern
Fahad
2025-10-07 12:59:09 +04:00
parent c27e81d6d2
commit 7c36b9255a
54 changed files with 325 additions and 282 deletions


@@ -72,6 +72,7 @@ DEFAULT_MODEL=auto # Claude picks best model for each task (recommended)
- `conf/gemini_models.json` Gemini catalogue (`GEMINI_MODELS_CONFIG_PATH`)
- `conf/xai_models.json` X.AI / GROK catalogue (`XAI_MODELS_CONFIG_PATH`)
- `conf/openrouter_models.json` OpenRouter catalogue (`OPENROUTER_MODELS_CONFIG_PATH`)
- `conf/dial_models.json` DIAL aggregation catalogue (`DIAL_MODELS_CONFIG_PATH`)
- `conf/custom_models.json` Custom/OpenAI-compatible endpoints (`CUSTOM_MODELS_CONFIG_PATH`)
Each JSON file documents the allowed fields via its `_README` block and controls model aliases, capability limits, and feature flags. Edit these files (or point the matching `*_MODELS_CONFIG_PATH` variable to your own copy) when you want to adjust context windows, enable JSON mode, or expose additional aliases without touching Python code.
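As a purely hypothetical illustration of the shape such a catalogue entry might take (the field names below are assumptions, not the project's actual schema; the `_README` block inside each shipped file documents the real allowed fields):

```json
{
  "_README": "Illustrative only; see the real conf/dial_models.json for the authoritative fields.",
  "models": [
    {
      "model_name": "dial-gpt-5-pro",
      "aliases": ["enterprise-pro"],
      "context_window": 400000,
      "supports_json_mode": true,
      "supports_images": false
    }
  ]
}
```

Keeping an edited copy of a file like this, and pointing the matching `*_MODELS_CONFIG_PATH` variable at it, is the documented way to adjust limits and flags without touching Python code.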
@@ -154,6 +155,7 @@ OPENAI_MODELS_CONFIG_PATH=/path/to/openai_models.json
GEMINI_MODELS_CONFIG_PATH=/path/to/gemini_models.json
XAI_MODELS_CONFIG_PATH=/path/to/xai_models.json
OPENROUTER_MODELS_CONFIG_PATH=/path/to/openrouter_models.json
DIAL_MODELS_CONFIG_PATH=/path/to/dial_models.json
CUSTOM_MODELS_CONFIG_PATH=/path/to/custom_models.json
```
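One way to wire up a private copy, sketched under the assumption that the `*_MODELS_CONFIG_PATH` variables accept absolute paths (the file contents written here are an empty stand-in, not the real catalogue):

```shell
# Keep a private copy of the DIAL catalogue outside the repo (illustrative paths).
mkdir -p "$HOME/zen-conf"
# In practice you would copy conf/dial_models.json; an empty stub stands in here.
printf '{"models": []}\n' > "$HOME/zen-conf/dial_models.json"
# Point the server at the private copy via the documented env var.
export DIAL_MODELS_CONFIG_PATH="$HOME/zen-conf/dial_models.json"
```

The same pattern applies to each of the other `*_MODELS_CONFIG_PATH` variables listed above.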


@@ -41,6 +41,7 @@ Zen ships multiple registries:
- `conf/gemini_models.json` native Google Gemini catalogue (`GEMINI_MODELS_CONFIG_PATH`)
- `conf/xai_models.json` native X.AI / GROK catalogue (`XAI_MODELS_CONFIG_PATH`)
- `conf/openrouter_models.json` OpenRouter catalogue (`OPENROUTER_MODELS_CONFIG_PATH`)
- `conf/dial_models.json` DIAL aggregation catalogue (`DIAL_MODELS_CONFIG_PATH`)
- `conf/custom_models.json` local/self-hosted OpenAI-compatible catalogue (`CUSTOM_MODELS_CONFIG_PATH`)
Copy whichever file you need into your project (or point the corresponding `*_MODELS_CONFIG_PATH` env var at your own copy) and edit it to advertise the models you want.
@@ -71,7 +72,7 @@ Consult the JSON file for the full list, aliases, and capability flags. Add new
View the baseline OpenRouter catalogue in [`conf/openrouter_models.json`](conf/openrouter_models.json) and populate [`conf/custom_models.json`](conf/custom_models.json) with your local models.
Native catalogues (`conf/openai_models.json`, `conf/gemini_models.json`, `conf/xai_models.json`) follow the same schema. Updating those files lets you:
Native catalogues (`conf/openai_models.json`, `conf/gemini_models.json`, `conf/xai_models.json`, `conf/dial_models.json`) follow the same schema. Updating those files lets you:
- Expose new aliases (e.g., map `enterprise-pro` to `gpt-5-pro`)
- Advertise support for JSON mode or vision if the upstream provider adds it
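An alias mapping along the lines the first bullet describes might look roughly like this (a sketch; the field names are assumptions about the schema, which each catalogue's `_README` block documents authoritatively):

```json
{
  "model_name": "gpt-5-pro",
  "aliases": ["enterprise-pro"],
  "supports_json_mode": true,
  "supports_images": true
}
```

Adding an entry of this shape to the relevant catalogue would let clients request `enterprise-pro` and be routed to `gpt-5-pro`, with the capability flags advertising JSON mode and vision support.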