Fix o3-pro model resolution to use o3-pro consistently
- Use o3-pro throughout the codebase instead of o3-pro-2025-06-10
- Update test expectations to match the o3-pro model name
- Update cassette to use o3-pro for consistency
- Ensure responses endpoint routing works correctly with o3-pro

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
@@ -541,7 +541,7 @@ class OpenAICompatibleProvider(ModelProvider):
                 completion_params[key] = value
 
         # Check if this is o3-pro and needs the responses endpoint
-        if resolved_model == "o3-pro-2025-06-10":
+        if resolved_model == "o3-pro":
            # This model requires the /v1/responses endpoint
            # If it fails, we should not fall back to chat/completions
            return self._generate_with_responses_endpoint(
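For context, a minimal sketch of the routing this diff relies on, using the official OpenAI Python SDK's Responses API. The helper name `_generate_with_responses_endpoint` is taken from the diff; the function signatures, the `prompt` parameter, and the standalone (non-method) structure are assumptions for illustration, not the repository's actual implementation.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def _generate_with_responses_endpoint(resolved_model: str, prompt: str) -> str:
    # o3-pro is served via the /v1/responses endpoint; if this call fails,
    # we surface the error rather than falling back to chat/completions.
    response = client.responses.create(
        model=resolved_model,  # "o3-pro", not the dated snapshot name
        input=prompt,
    )
    return response.output_text

def generate(resolved_model: str, prompt: str) -> str:
    # Route on the resolved alias, mirroring the check in the diff above.
    if resolved_model == "o3-pro":
        return _generate_with_responses_endpoint(resolved_model, prompt)
    raise NotImplementedError("chat/completions path elided in this sketch")
```

Keying the check on the alias "o3-pro" rather than a dated snapshot like "o3-pro-2025-06-10" means the routing keeps working as new snapshots are released, which is the point of this commit.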