docs: updated docs

Fahad
2025-10-02 23:33:40 +04:00
parent 9100febdf9
commit b4e50901ba
3 changed files with 25 additions and 15 deletions

View File

@@ -452,9 +452,9 @@ Web search is enabled by default, allowing models to request Claude perform sear
The server uses carefully crafted system prompts to give each tool specialized expertise:
### Prompt Architecture
-- **Centralized Prompts**: All system prompts are defined in `prompts/tool_prompts.py`
+- **Centralized Prompts**: Each tool's system prompt lives in `systemprompts/` (for example, `systemprompts/chat_prompt.py`)
- **Tool Integration**: Each tool inherits from `BaseTool` and implements `get_system_prompt()`
-- **Prompt Flow**: `User Request → Tool Selection → System Prompt + Context → Gemini Response`
+- **Prompt Flow**: `User Request → Tool Selection → System Prompt + Context → Model Response`
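The inheritance pattern above can be sketched as follows; the class names and prompt text are illustrative stand-ins, not the repository's actual code:

```python
# Minimal sketch of the prompt architecture: each tool subclass supplies
# its own system prompt, and the flow combines it with the user request.
# Names and prompt strings here are invented for illustration.

class BaseTool:
    def get_system_prompt(self) -> str:
        raise NotImplementedError

class ChatTool(BaseTool):
    def get_system_prompt(self) -> str:
        # In the real server this string would be imported from systemprompts/
        return "You are a collaborative thinking partner."

def build_model_request(tool: BaseTool, user_request: str) -> str:
    # User Request -> Tool Selection -> System Prompt + Context -> Model Response
    return f"{tool.get_system_prompt()}\n\n{user_request}"
```

This keeps prompt content out of the dispatch logic: swapping tools swaps expertise without touching the request-building code.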
### Specialized Expertise
Each tool has a unique system prompt that defines its role and approach:
@@ -465,6 +465,6 @@ Each tool has a unique system prompt that defines its role and approach:
### Customization
To modify tool behavior, you can:
-1. Edit prompts in `prompts/tool_prompts.py` for global changes
+1. Edit the prompt file in `systemprompts/` (and export it via `systemprompts/__init__.py`) for global changes
2. Override `get_system_prompt()` in a tool class for tool-specific changes
3. Use the `temperature` parameter to adjust response style (0.2 for focused, 0.7 for creative)
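Option 2 above might look like the following hypothetical sketch; the tool name, prompt text, and `base_prompt` helper are invented for illustration:

```python
# Illustrative sketch of a tool-specific override: one tool extends its
# prompt without touching the shared prompt files. All names are invented.

class CodeReviewTool:
    temperature = 0.2  # option 3: a lower value keeps responses focused

    def base_prompt(self) -> str:
        # Stand-in for the shared prompt a real tool would inherit
        return "You are an expert code reviewer."

    def get_system_prompt(self) -> str:
        # Tool-specific extension appended to the shared prompt
        return self.base_prompt() + "\nAlways cite exact file paths."
```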

View File

@@ -53,10 +53,14 @@ def get_language_instruction(self) -> str:
     Returns:
         str: Language instruction to prepend to prompt, or empty string if no locale set
     """
-    from config import LOCALE
-    if not LOCALE or not LOCALE.strip():
+    import os
+    locale = os.getenv("LOCALE", "").strip()
+    if not locale:
         return ""
-    return f"Always respond in {LOCALE.strip()}.\n\n"
+    return f"Always respond in {locale}.\n\n"
```
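As a quick check of the behavior shown in the diff, here is a standalone module-level rewrite of the method (for experimentation only; the real implementation is a method on the tool class):

```python
import os

def get_language_instruction() -> str:
    # Standalone version of the method above, for illustration.
    locale = os.getenv("LOCALE", "").strip()
    if not locale:
        return ""
    return f"Always respond in {locale}.\n\n"

os.environ["LOCALE"] = "fr-FR"
assert get_language_instruction() == "Always respond in fr-FR.\n\n"

os.environ["LOCALE"] = "   "  # whitespace-only is treated as unset
assert get_language_instruction() == ""
```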
### Integration in Tool Execution
@@ -80,7 +84,7 @@ system_prompt = language_instruction + base_system_prompt
```
2. Restart the MCP server:
```bash
-python server.py
+./run-server.sh
```
3. Use any tool; responses will be in the specified language.
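The steps above reduce to a simple concatenation; this hedged end-to-end sketch mirrors the `language_instruction + base_system_prompt` snippet shown earlier, with an invented prompt string:

```python
import os

# End-to-end sketch: the language instruction is prepended to the tool's
# base system prompt. The prompt text is invented for illustration.
os.environ["LOCALE"] = "de-DE"

locale = os.getenv("LOCALE", "").strip()
language_instruction = f"Always respond in {locale}.\n\n" if locale else ""
base_system_prompt = "You are a helpful assistant."
system_prompt = language_instruction + base_system_prompt

assert system_prompt.startswith("Always respond in de-DE.")
```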
@@ -153,11 +157,14 @@ To customize the language instruction, modify the `get_language_instruction()` m
```python
 def get_language_instruction(self) -> str:
-    from config import LOCALE
-    if not LOCALE or not LOCALE.strip():
+    import os
+    locale = os.getenv("LOCALE", "").strip()
+    if not locale:
         return ""
     # Custom instruction
-    return f"Always respond in {LOCALE.strip()} and use a professional tone.\n\n"
+    return f"Always respond in {locale} and use a professional tone.\n\n"
```
### Per-Tool Customization
@@ -167,10 +174,13 @@ You can also override the method in specific tools for custom behavior:
```python
 class MyCustomTool(SimpleTool):
     def get_language_instruction(self) -> str:
-        from config import LOCALE
-        if LOCALE == "fr-FR":
+        import os
+        locale = os.getenv("LOCALE", "").strip()
+        if locale == "fr-FR":
             return "Respond in French with precise technical vocabulary.\n\n"
-        elif LOCALE == "zh-CN":
+        elif locale == "zh-CN":
             return "请用中文回答,使用专业术语。\n\n"
         else:
             return super().get_language_instruction()

View File

@@ -94,8 +94,8 @@ Then it shared the changes and relevant code with Gemini 2.5 Pro - the following
🚨 [HIGH PRIORITY] Critical Issue Found
-Inconsistent Provider Handling in tools/base.py
-- Location: tools/base.py line ~1833 in _create_continuation_offer_response method
+Inconsistent Provider Handling in tools/shared/base_tool.py
+- Location: tools/shared/base_tool.py inside _create_continuation_offer_response
- Problem: The defensive provider handling pattern was applied in 2 locations but missed a 3rd location
- Impact: Risk of AttributeError: 'str' object has no attribute 'get_provider_type' in continuation scenarios
- Fix Required: Apply the same defensive pattern to the missed location
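The review names the defensive pattern without showing it; since the actual helper is not in this excerpt, the following is a hypothetical sketch of what "defensive provider handling" typically means here (all names invented):

```python
# Hypothetical sketch of the defensive pattern the review describes: a model
# entry may arrive as a provider object or as a bare string, so normalize
# before calling get_provider_type(). Names are invented for illustration.

class FakeProvider:
    def __init__(self, provider_type: str):
        self._provider_type = provider_type

    def get_provider_type(self) -> str:
        return self._provider_type

def resolve_provider_type(model) -> str:
    # Without this check, a bare string would raise
    # AttributeError: 'str' object has no attribute 'get_provider_type'
    if isinstance(model, str):
        return model  # treat the string itself as the provider type name
    return model.get_provider_type()
```

Applying the same guard at every call site (including the one the review flags) removes the continuation-scenario crash.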