feat!: Full code can now be generated by an external model and shared with the AI tool (Claude Code / Codex, etc.)!

Model definitions now support a new `allow_code_generation` flag, intended only for higher-reasoning models such as GPT-5-Pro and Gemini 2.5 Pro.
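
For illustration, a minimal sketch of what a model definition carrying the flag might look like. Only `allow_code_generation` comes from this change; the class name and the other fields are assumptions.

```python
from dataclasses import dataclass


@dataclass
class ModelCapabilities:
    # Illustrative container; only `allow_code_generation` is the new flag
    # introduced here, the remaining fields are assumptions.
    model_name: str
    friendly_name: str
    allow_code_generation: bool = False  # keep off except for high-reasoning models


gpt5_pro = ModelCapabilities(
    model_name="gpt-5-pro",
    friendly_name="GPT-5 Pro",
    allow_code_generation=True,
)
```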

When `true`, the `chat` tool can request that the external model generate a full implementation, an update, or step-by-step instructions, and then share the result with the calling agent.

This effectively lets us leverage more powerful models such as GPT-5-Pro (otherwise available only via the API or the $200 Pro plan in the ChatGPT app) to generate code or entire implementations for us.
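
A hypothetical caller-side sketch of the intended flow: ask for a complete implementation only when the selected model advertises the capability. The `send_prompt` helper stands in for whatever actually calls the external model and is an assumption, not part of this commit.

```python
def chat_with_model(model_capabilities, send_prompt, task: str) -> str:
    """Request either a full implementation or guidance, depending on the flag."""
    if getattr(model_capabilities, "allow_code_generation", False):
        # High-reasoning model: ask for ready-to-apply code.
        prompt = (
            f"Produce a complete, ready-to-apply implementation for: {task}. "
            "Return full files or diffs the calling agent can apply directly."
        )
    else:
        # Capability not enabled: keep the request to guidance only.
        prompt = f"Provide high-level guidance (no full code) for: {task}"
    return send_prompt(prompt)
```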
Author: Fahad
Date: 2025-10-07 18:49:13 +04:00
parent 04f7ce5b03
commit ece8a5ebed
29 changed files with 1008 additions and 122 deletions


@@ -1480,8 +1480,11 @@ class BaseWorkflowMixin(ABC):
         # Get system prompt for this tool with localization support
         base_system_prompt = self.get_system_prompt()
+        capability_augmented_prompt = self._augment_system_prompt_with_capabilities(
+            base_system_prompt, getattr(self._model_context, "capabilities", None)
+        )
         language_instruction = self.get_language_instruction()
-        system_prompt = language_instruction + base_system_prompt
+        system_prompt = language_instruction + capability_augmented_prompt
 
         # Check if tool wants system prompt embedded in main prompt
         if self.should_embed_system_prompt():
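
The hunk above shows only the call site. Below is a minimal sketch of what the helper might do, assuming it simply appends code-generation guidance to the system prompt when the model's capabilities allow it; the body and exact wording are assumptions, not the implementation in this commit.

```python
def _augment_system_prompt_with_capabilities(self, base_system_prompt: str, capabilities) -> str:
    # Sketch only: gate on the new flag and otherwise return the prompt unchanged.
    if capabilities is None or not getattr(capabilities, "allow_code_generation", False):
        return base_system_prompt
    return (
        base_system_prompt
        + "\n\nCODE GENERATION: you may return complete, ready-to-apply code "
        "(full files or diffs) for the calling agent to use."
    )
```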