fix: resolve temperature handling issues for O3/custom models (#245)
- Fix consensus tool hardcoded temperature=0.2 bypassing model capabilities
- Add intelligent temperature inference for unknown custom models
- Support multi-model collaboration (O3, Gemini, Claude, Mistral, DeepSeek)
- Only OpenAI O-series and DeepSeek reasoner models reject temperature
- Most reasoning models (Gemini Pro, Claude, Mistral) DO support temperature
- Comprehensive logging for temperature decisions and user guidance

Resolves: https://github.com/BeehiveInnovations/zen-mcp-server/issues/245
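A minimal Python sketch of the behavior described in the commit message: the hardcoded temperature=0.2 is replaced by a capability check, with name-based inference for unknown custom models. All identifiers here (supports_temperature, build_request_params, TEMPERATURE_UNSUPPORTED_PATTERNS) are illustrative assumptions, not the actual zen-mcp-server API.

    import logging
    import re

    logger = logging.getLogger(__name__)

    # Model families assumed to reject an explicit temperature parameter:
    # OpenAI O-series (o1, o3, o4-mini, ...) and DeepSeek reasoner models.
    TEMPERATURE_UNSUPPORTED_PATTERNS = [
        r"^o[134](-.*)?$",
        r"deepseek-reasoner",
    ]

    def supports_temperature(model_name: str) -> bool:
        """Infer whether a model accepts a temperature parameter.

        Unknown custom models fall through to the default: most reasoning
        models (Gemini Pro, Claude, Mistral) do support temperature.
        """
        name = model_name.lower()
        for pattern in TEMPERATURE_UNSUPPORTED_PATTERNS:
            if re.search(pattern, name):
                logger.info("Model %s: temperature unsupported, omitting it", model_name)
                return False
        logger.info("Model %s: temperature supported, passing it through", model_name)
        return True

    def build_request_params(model_name: str, temperature: float = 0.2) -> dict:
        """Build request kwargs, dropping temperature when the model rejects it
        rather than hardcoding temperature=0.2 for every model."""
        params = {"model": model_name}
        if supports_temperature(model_name):
            params["temperature"] = temperature
        return params

    # Example: O3 gets no temperature key; Gemini keeps the requested value.
    print(build_request_params("o3"))
    print(build_request_params("gemini-2.5-pro", temperature=0.2))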
.gitignore (vendored): 1 addition
@@ -187,3 +187,4 @@ logs/
/worktrees/
test_simulation_files/
.mcp.json