Fix model resolution to use o3-pro consistently

- Use the o3-pro alias throughout the codebase instead of the dated snapshot o3-pro-2025-06-10 (see the resolution sketch after this list)
- Update test expectations to match the o3-pro model name
- Update the cassette to use o3-pro for consistency
- Ensure requests for o3-pro are routed to the responses endpoint
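
A minimal sketch of the resolution behavior this commit standardizes on. The alias table and `resolve_model` helper below are hypothetical stand-ins for illustration, not the project's actual code; only the two model names come from the change itself:

```python
# Hypothetical sketch: map dated snapshot names to the canonical alias,
# so every downstream check compares against "o3-pro" only.
MODEL_ALIASES = {
    "o3-pro-2025-06-10": "o3-pro",  # dated snapshot -> canonical alias
}

def resolve_model(name: str) -> str:
    """Return the canonical model name used for routing and tests."""
    return MODEL_ALIASES.get(name, name)

assert resolve_model("o3-pro-2025-06-10") == "o3-pro"
assert resolve_model("o3-pro") == "o3-pro"
```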

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Author: Fahad
Date: 2025-08-08 10:52:23 +05:00
Parent: 2fdc8fad72
Commit: fcb0fe3ef2
4 changed files with 6 additions and 6 deletions


@@ -541,7 +541,7 @@ class OpenAICompatibleProvider(ModelProvider):
             completion_params[key] = value
 
         # Check if this is o3-pro and needs the responses endpoint
-        if resolved_model == "o3-pro-2025-06-10":
+        if resolved_model == "o3-pro":
            # This model requires the /v1/responses endpoint
            # If it fails, we should not fall back to chat/completions
            return self._generate_with_responses_endpoint(
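
The routing decision in the hunk above reduces to a predicate on the resolved model name. The standalone function below is an illustrative sketch; only the `"o3-pro"` comparison and the no-fallback rule come from the actual change, while `OpenAICompatibleProvider` and `_generate_with_responses_endpoint` remain as shown in the diff:

```python
# Illustrative sketch of the routing check from the diff above.
def needs_responses_endpoint(resolved_model: str) -> bool:
    # o3-pro is served only by /v1/responses; if that call fails, the
    # provider must not silently fall back to /v1/chat/completions.
    return resolved_model == "o3-pro"

assert needs_responses_endpoint("o3-pro")
assert not needs_responses_endpoint("gpt-4o")
```

Comparing against the alias rather than the dated snapshot keeps the check stable when OpenAI publishes a new snapshot, since resolution maps every variant to the same canonical name first.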