feat: Add comprehensive GPT-5 series model support

- Add GPT-5, GPT-5-mini, and GPT-5-nano models to unified configuration
- Implement proper thinking mode support via dynamic capability checking
- Add OpenAI provider model enumeration methods for registry integration
- Update tests to cover all GPT-5 models and their aliases
- Fix critical bug where thinking mode was hardcoded instead of using model capabilities
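The capability-check fix in the last bullet can be sketched roughly as follows. Only `supports_thinking_mode` is a real method name (it appears in the tests below); the `CAPABILITIES` table, `ALIASES` map, and flag key are illustrative assumptions, not the repository's actual identifiers:

```python
# Illustrative sketch: thinking-mode support driven by a capability table
# instead of a hardcoded model-name check. Names are assumptions.
CAPABILITIES = {
    "gpt-5": {"supports_extended_thinking": True},
    "gpt-5-mini": {"supports_extended_thinking": True},
    "gpt-5-nano": {"supports_extended_thinking": True},
    "o3": {"supports_extended_thinking": False},
}
ALIASES = {"gpt5": "gpt-5", "mini": "gpt-5-mini", "nano": "gpt-5-nano"}

def supports_thinking_mode(model_name: str) -> bool:
    """Resolve aliases, then consult the model's capability entry.

    Unknown models fall through to False rather than raising.
    """
    canonical = ALIASES.get(model_name, model_name)
    caps = CAPABILITIES.get(canonical)
    return bool(caps and caps["supports_extended_thinking"])
```

The point of the fix is that adding a new reasoning model only requires a new capability entry, not another edit to the check itself.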

Breaking Changes:
- None (backward compatible)

New Models Available:
- gpt-5 (400K context, 128K output, reasoning support)
- gpt-5-mini (400K context, 128K output, efficient variant)
- gpt-5-nano (400K context, fastest/cheapest variant)
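The unified configuration entries for these models might look like the sketch below. The field names (`context_window`, `max_output_tokens`, `supports_extended_thinking`) are assumptions for illustration, not the repository's actual schema, and gpt-5-nano's output limit is assumed since the summary above does not state it:

```python
# Illustrative sketch of unified model configuration entries.
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelCapabilities:
    context_window: int                       # max input tokens
    max_output_tokens: int                    # max generated tokens
    supports_extended_thinking: bool = False  # reasoning-token support

GPT5_MODELS = {
    "gpt-5": ModelCapabilities(400_000, 128_000, True),
    "gpt-5-mini": ModelCapabilities(400_000, 128_000, True),
    # Output limit for nano is an assumption; only "fastest/cheapest" is stated.
    "gpt-5-nano": ModelCapabilities(400_000, 128_000, True),
}
```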

Aliases:
- gpt5, gpt5-mini, gpt5mini, gpt5-nano, gpt5nano, nano
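Alias handling of the kind exercised by the tests below could be implemented along these lines (the map and helper name are hypothetical; the actual resolution logic lives in the provider):

```python
# Hypothetical alias-resolution helper; real method names may differ.
ALIASES = {
    "gpt5": "gpt-5",
    "gpt5-mini": "gpt-5-mini",
    "gpt5mini": "gpt-5-mini",
    "mini": "gpt-5-mini",
    "gpt5-nano": "gpt-5-nano",
    "gpt5nano": "gpt-5-nano",
    "nano": "gpt-5-nano",
}

def resolve_model_name(name: str) -> str:
    """Map an alias to its canonical model name; pass through unknown names."""
    return ALIASES.get(name, name)
```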

All models support:
- Extended thinking mode (reasoning tokens)
- Vision capabilities
- JSON mode
- Function calling

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Author: David Knedlik
Date:   2025-08-21 14:27:00 -05:00
parent 12542054a2
commit 4930824052
3 changed files with 76 additions and 10 deletions


@@ -253,14 +253,21 @@ class TestOpenAIProvider:
         assert call_kwargs["model"] == "o3-mini"  # Should be unchanged
 
     def test_supports_thinking_mode(self):
-        """Test thinking mode support."""
+        """Test thinking mode support based on model capabilities."""
         provider = OpenAIModelProvider("test-key")
 
-        # GPT-5 models support thinking mode (reasoning tokens)
+        # GPT-5 models support thinking mode (reasoning tokens) - all variants
         assert provider.supports_thinking_mode("gpt-5") is True
         assert provider.supports_thinking_mode("gpt-5-mini") is True
-        assert provider.supports_thinking_mode("gpt5") is True  # Test with alias
-        assert provider.supports_thinking_mode("gpt5mini") is True  # Test with alias
+        assert provider.supports_thinking_mode("gpt-5-nano") is True  # Now included
+
+        # Test GPT-5 aliases
+        assert provider.supports_thinking_mode("gpt5") is True
+        assert provider.supports_thinking_mode("gpt5-mini") is True
+        assert provider.supports_thinking_mode("gpt5mini") is True
+        assert provider.supports_thinking_mode("gpt5-nano") is True
+        assert provider.supports_thinking_mode("gpt5nano") is True
+        assert provider.supports_thinking_mode("nano") is True  # New alias for gpt-5-nano
 
         # O3/O4 models don't support thinking mode
         assert provider.supports_thinking_mode("o3") is False
@@ -270,6 +277,9 @@ class TestOpenAIProvider:
         assert (
             provider.supports_thinking_mode("mini") is True
         )  # "mini" now resolves to gpt-5-mini which supports thinking
+
+        # Test invalid model name
+        assert provider.supports_thinking_mode("invalid-model") is False
     @patch("providers.openai_compatible.OpenAI")
     def test_o3_pro_routes_to_responses_endpoint(self, mock_openai_class):
         """Test that o3-pro model routes to the /v1/responses endpoint (mock test)."""