feat!: Full code can now be generated by an external model and shared with the calling AI tool (Claude Code, Codex, etc.)!

Model definitions now support a new `allow_code_generation` flag, intended only for higher-reasoning models such as GPT-5-Pro and Gemini 2.5 Pro.

When `true`, the `chat` tool can request that the external model generate a full implementation, update, or set of instructions, and then share that output with the calling agent.

This effectively lets us use more powerful models such as GPT-5-Pro to generate code or entire implementations for us (models that are otherwise API-only or available via the $200 Pro plan in the ChatGPT app).
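A minimal sketch of how the new flag might be consumed. Only `allow_code_generation` comes from this commit; the definition fields and the helper function below are hypothetical illustrations, not the repository's actual schema:

```python
# Hypothetical model definition entry; every field except
# `allow_code_generation` is illustrative, not from the actual config schema.
GPT5_PRO_DEFINITION = {
    "model_name": "gpt-5-pro",
    "allow_code_generation": True,  # new flag: permit full implementations
}


def can_generate_code(definition: dict) -> bool:
    """Gate code-generation requests on the model definition.

    Defaults to False so models without the flag keep the old behavior.
    """
    return bool(definition.get("allow_code_generation", False))
```

With a gate like this, only explicitly opted-in high-reasoning models are ever asked to produce complete implementations.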
Fahad
2025-10-07 18:49:13 +04:00
parent 04f7ce5b03
commit ece8a5ebed
29 changed files with 1008 additions and 122 deletions


@@ -404,11 +404,15 @@ class SimpleTool(BaseTool):
         # Get the provider from model context (clean OOP - no re-fetching)
         provider = self._model_context.provider
+        capabilities = self._model_context.capabilities

         # Get system prompt for this tool
         base_system_prompt = self.get_system_prompt()
+        capability_augmented_prompt = self._augment_system_prompt_with_capabilities(
+            base_system_prompt, capabilities
+        )
         language_instruction = self.get_language_instruction()
-        system_prompt = language_instruction + base_system_prompt
+        system_prompt = language_instruction + capability_augmented_prompt

         # Generate AI response using the provider
         logger.info(f"Sending request to {provider.get_provider_type().value} API for {self.get_name()}")
@@ -423,7 +427,6 @@ class SimpleTool(BaseTool):
         logger.debug(f"Prompt length: {len(prompt)} characters (~{estimated_tokens:,} tokens)")

         # Resolve model capabilities for feature gating
-        capabilities = self._model_context.capabilities
         supports_thinking = capabilities.supports_extended_thinking

         # Generate content with provider abstraction
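The diff above fetches capabilities once and folds them into the system prompt before the language instruction is prepended. A minimal sketch of that pattern, assuming a capability-gated prompt suffix (the class, method body, and suffix text here are illustrative, not the repository's actual code):

```python
class ModelCapabilities:
    """Illustrative stand-in for the capabilities object used in the diff."""

    def __init__(self, supports_extended_thinking: bool = False,
                 allow_code_generation: bool = False):
        self.supports_extended_thinking = supports_extended_thinking
        self.allow_code_generation = allow_code_generation


def augment_system_prompt_with_capabilities(base_prompt: str,
                                            capabilities: ModelCapabilities) -> str:
    """Append capability-specific instructions to the base system prompt.

    If the model is not opted in, the prompt is returned unchanged, so
    existing models see no behavioral difference.
    """
    if capabilities.allow_code_generation:
        return (base_prompt
                + "\n\nYou may return a complete implementation for the "
                  "calling agent to apply.")
    return base_prompt


# Mirrors the assembly order in the diff: language instruction first,
# then the capability-augmented prompt.
def build_system_prompt(language_instruction: str, base_prompt: str,
                        capabilities: ModelCapabilities) -> str:
    return language_instruction + augment_system_prompt_with_capabilities(
        base_prompt, capabilities)
```

The key design point the diff encodes: augmentation happens once, centrally, instead of each call site re-fetching capabilities (hence the removal in the second hunk).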