feat!: Full code can now be generated by an external model and shared with the AI tool (Claude Code / Codex, etc.)!
Model definitions now support a new `allow_code_generation` flag, intended only for higher-reasoning models such as GPT-5-Pro and Gemini 2.5 Pro. When `true`, the `chat` tool can ask the external model to generate a full implementation, update, or set of instructions, and then share that output with the calling agent. This effectively lets us use more powerful models such as GPT-5-Pro (which are otherwise API-only or limited to the $200 Pro plan inside the ChatGPT app) to produce code, or entire implementations, for us.
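As a rough illustration only (not the project's actual schema: every name below except `allow_code_generation`, and the request keys already shown in the test diff, is an assumption), a capability entry and a `chat`-style request might look like this:

```python
# Minimal sketch, assuming dict-shaped model capabilities; real field names may differ.
directory = "/path/to/project"  # placeholder path

model_capabilities = {
    "gpt-5-pro": {"allow_code_generation": True},   # higher-reasoning model: may return full implementations
    "flash":     {"allow_code_generation": False},  # regular model: chat guidance only
}

# A chat-tool request shaped like the test arguments in the diff below.
request_args = {
    "prompt": "Generate the full implementation for the refactor we discussed",
    "files": [directory],
    "model": "gpt-5-pro",            # should resolve to a capability with allow_code_generation=True
    "working_directory": directory,
}
# result = await tool.execute(request_args)   # as exercised by the tests below
```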
@@ -91,6 +91,7 @@ def helper_function():
    "prompt": "Analyze this codebase structure",
    "files": [directory],  # Directory path, not individual files
    "model": "flash",
    "working_directory": directory,
}

# Execute the tool

@@ -168,6 +169,7 @@ def helper_function():
    "files": [directory],  # Same directory again
    "model": "flash",
    "continuation_id": thread_id,
    "working_directory": directory,
}

# Mock to capture file filtering behavior

@@ -299,6 +301,7 @@ def helper_function():
    "prompt": "Analyze this code",
    "files": [directory],
    "model": "flash",
    "working_directory": directory,
}

result = await tool.execute(request_args)