docs: updated docs
@@ -452,9 +452,9 @@ Web search is enabled by default, allowing models to request Claude perform sear
 The server uses carefully crafted system prompts to give each tool specialized expertise:
 
 ### Prompt Architecture
-- **Centralized Prompts**: All system prompts are defined in `prompts/tool_prompts.py`
+- **Centralized Prompts**: Each tool's system prompt lives in `systemprompts/` (for example, `systemprompts/chat_prompt.py`)
 - **Tool Integration**: Each tool inherits from `BaseTool` and implements `get_system_prompt()`
-- **Prompt Flow**: `User Request → Tool Selection → System Prompt + Context → Gemini Response`
+- **Prompt Flow**: `User Request → Tool Selection → System Prompt + Context → Model Response`
 
 ### Specialized Expertise
 Each tool has a unique system prompt that defines its role and approach:
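The prompt-flow bullet above can be sketched end to end. This is an illustrative sketch only: `BaseTool` and `get_system_prompt()` are named in the docs, but the stand-in classes, `ChatTool`, and `build_model_request()` are invented here for demonstration.

```python
# Illustrative sketch of: User Request -> Tool Selection -> System Prompt + Context -> Model Response.
# Only BaseTool and get_system_prompt() come from the docs; the rest is a simplified stand-in.

class BaseTool:
    def get_system_prompt(self) -> str:
        raise NotImplementedError

class ChatTool(BaseTool):
    def get_system_prompt(self) -> str:
        # In the real server this text would live in a file under systemprompts/
        return "You are a senior engineering collaborator."

def build_model_request(tool: BaseTool, user_request: str, context: str = "") -> dict:
    # Combine the tool's system prompt with any conversation context
    return {
        "system": tool.get_system_prompt(),
        "user": f"{context}\n\n{user_request}" if context else user_request,
    }

request = build_model_request(ChatTool(), "Review this function", "def add(a, b): return a + b")
```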
@@ -465,6 +465,6 @@ Each tool has a unique system prompt that defines its role and approach:
 
 ### Customization
 To modify tool behavior, you can:
-1. Edit prompts in `prompts/tool_prompts.py` for global changes
+1. Edit the prompt file in `systemprompts/` (and export it via `systemprompts/__init__.py`) for global changes
 2. Override `get_system_prompt()` in a tool class for tool-specific changes
 3. Use the `temperature` parameter to adjust response style (0.2 for focused, 0.7 for creative)
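Customization options 2 and 3 can be sketched together. This is a hedged sketch, not the server's actual code: `BaseTool` and `get_system_prompt()` are named in the docs, while the `default_temperature` attribute and `CodeReviewTool` are assumptions for illustration.

```python
# Hypothetical sketch: override get_system_prompt() in a tool subclass and
# pin a focused temperature. BaseTool here is a minimal stand-in.

class BaseTool:
    default_temperature = 0.5

    def get_system_prompt(self) -> str:
        return "You are a helpful assistant."

class CodeReviewTool(BaseTool):
    default_temperature = 0.2  # 0.2 for focused responses, per the list above

    def get_system_prompt(self) -> str:
        # Tool-specific change: extend rather than replace the base prompt
        return "You are a meticulous code reviewer.\n" + super().get_system_prompt()

tool = CodeReviewTool()
prompt = tool.get_system_prompt()
```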
@@ -53,10 +53,14 @@ def get_language_instruction(self) -> str:
 
         Returns:
             str: Language instruction to prepend to prompt, or empty string if no locale set
         """
-        from config import LOCALE
-        if not LOCALE or not LOCALE.strip():
+        import os
+
+        locale = os.getenv("LOCALE", "").strip()
+
+        if not locale:
             return ""
-        return f"Always respond in {LOCALE.strip()}.\n\n"
+
+        return f"Always respond in {locale}.\n\n"
 ```
 
 ### Integration in Tool Execution
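The new environment-driven behavior can be exercised in isolation. The sketch below restates the same logic as a free function so the two edge cases (locale set, locale blank) are easy to see; it is an illustration, not the server's method.

```python
import os

def get_language_instruction() -> str:
    # Same logic as the method in the hunk above: read LOCALE from the
    # environment and return "" when it is unset or whitespace-only.
    locale = os.getenv("LOCALE", "").strip()
    if not locale:
        return ""
    return f"Always respond in {locale}.\n\n"

os.environ["LOCALE"] = "fr-FR"
with_locale = get_language_instruction()

os.environ["LOCALE"] = "   "  # whitespace-only counts as unset
blank_locale = get_language_instruction()
```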
@@ -80,7 +84,7 @@ system_prompt = language_instruction + base_system_prompt
 ```
 2. Restart the MCP server:
 ```bash
-python server.py
+./run-server.sh
 ```
 3. Use any tool – responses will be in the specified language.
 
@@ -153,11 +157,14 @@ To customize the language instruction, modify the `get_language_instruction()` m
 
 ```python
 def get_language_instruction(self) -> str:
-    from config import LOCALE
-    if not LOCALE or not LOCALE.strip():
+    import os
+
+    locale = os.getenv("LOCALE", "").strip()
+
+    if not locale:
         return ""
     # Custom instruction
-    return f"Always respond in {LOCALE.strip()} and use a professional tone.\n\n"
+    return f"Always respond in {locale} and use a professional tone.\n\n"
 ```
 
 ### Per-Tool Customization
@@ -167,10 +174,13 @@ You can also override the method in specific tools for custom behavior:
 ```python
 class MyCustomTool(SimpleTool):
     def get_language_instruction(self) -> str:
-        from config import LOCALE
-        if LOCALE == "fr-FR":
+        import os
+
+        locale = os.getenv("LOCALE", "").strip()
+
+        if locale == "fr-FR":
             return "Respond in French with precise technical vocabulary.\n\n"
-        elif LOCALE == "zh-CN":
+        elif locale == "zh-CN":
             return "请用中文回答,使用专业术语。\n\n"
         else:
             return super().get_language_instruction()
@@ -94,8 +94,8 @@ Then it shared the changes and relevant code with Gemini 2.5 Pro - the following
 
 🚨 [HIGH PRIORITY] Critical Issue Found
 
-Inconsistent Provider Handling in tools/base.py
-- Location: tools/base.py line ~1833 in _create_continuation_offer_response method
+Inconsistent Provider Handling in tools/shared/base_tool.py
+- Location: tools/shared/base_tool.py inside _create_continuation_offer_response
 - Problem: The defensive provider handling pattern was applied in 2 locations but missed a 3rd location
 - Impact: Risk of AttributeError: 'str' object has no attribute 'get_provider_type' in continuation scenarios
 - Fix Required: Apply the same defensive pattern to the missed location
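The "defensive provider handling pattern" the review refers to is not shown in this diff. Below is a plausible sketch of such a pattern, assuming the provider may arrive either as a provider object or as a bare model-name string; the registry, its contents, and `resolve_provider_type()` are invented for illustration, and only `get_provider_type()` is named in the review.

```python
# Hypothetical sketch of a defensive provider-handling pattern. All names
# except get_provider_type() are assumptions, not the project's real API.

class GeminiProvider:
    def get_provider_type(self) -> str:
        return "google"

PROVIDER_REGISTRY = {"gemini-2.5-pro": GeminiProvider()}

def resolve_provider_type(provider) -> str:
    # Defensive step: in continuation scenarios `provider` may arrive as a
    # bare model-name string, so calling get_provider_type() on it directly
    # would raise AttributeError: 'str' object has no attribute 'get_provider_type'.
    if isinstance(provider, str):
        provider = PROVIDER_REGISTRY.get(provider)
    if provider is None:
        return "unknown"
    return provider.get_provider_type()

by_name = resolve_provider_type("gemini-2.5-pro")
by_object = resolve_provider_type(GeminiProvider())
missing = resolve_provider_type("not-registered")
```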