feat: Add LOCALE variable support for responses with UTF-8 JSON encoding
Description: This feature adds support for UTF-8 encoding in JSON responses, allowing proper handling of special characters and emojis.

- Implement unit tests for UTF-8 encoding across model providers, including Gemini, OpenAI, and OpenAI Compatible.
- Validate UTF-8 support in token counting, content generation, and error handling.
- Introduce tests for JSON serialization, ensuring proper handling of French characters and emojis (see the sketch after this list).
- Create tests for language instruction generation based on locale settings.
- Validate UTF-8 handling in workflow tools, including AnalyzeTool, CodereviewTool, and DebugIssueTool.
- Ensure that all tests check for correct UTF-8 character preservation and proper JSON formatting.
- Add integration tests to verify the interaction between locale settings and model responses.
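As a rough illustration of the UTF-8 behaviour these tests exercise, the sketch below serializes a response containing French characters and emojis without ASCII-escaping them. The `build_response` helper and its payload keys are hypothetical, used only to show the `ensure_ascii=False` pattern; they are not taken from this commit.

```python
import json


def build_response(content: str, locale: str = "") -> str:
    """Hypothetical helper: serialize a tool response as UTF-8 JSON.

    ensure_ascii=False keeps characters such as "é" and emojis as real
    UTF-8 code points instead of \\uXXXX escape sequences.
    """
    payload = {"status": "success", "locale": locale, "content": content}
    return json.dumps(payload, ensure_ascii=False)


if __name__ == "__main__":
    text = "Révision terminée ✅ aucun problème détecté 🎉"
    encoded = build_response(text, locale="fr-FR")
    print(encoded)  # accents and emojis appear verbatim in the JSON string
    assert json.loads(encoded)["content"] == text  # round-trips unchanged
```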
@@ -136,6 +136,15 @@ def _calculate_mcp_prompt_limit() -> int:
 
 MCP_PROMPT_SIZE_LIMIT = _calculate_mcp_prompt_limit()
 
+# Language/Locale Configuration
+# LOCALE: Language/locale specification for AI responses
+# When set, all AI tools will respond in the specified language while
+# maintaining their analytical capabilities
+# Examples: "fr-FR", "en-US", "zh-CN", "zh-TW", "ja-JP", "ko-KR", "es-ES",
+# "de-DE", "it-IT", "pt-PT"
+# Leave empty for default language (English)
+LOCALE = os.getenv("LOCALE", "")
+
 # Threading configuration
 # Simple in-memory conversation threading for stateless MCP environment
 # Conversations persist only during the Claude session
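For context, here is a minimal sketch of how a tool might turn the LOCALE setting above into a language instruction for its prompt. The `get_language_instruction` helper and the exact wording of the instruction are assumptions for illustration, not necessarily what the repository implements.

```python
import os

# Mirrors the config above: an empty string means the default (English) behaviour.
LOCALE = os.getenv("LOCALE", "")


def get_language_instruction(locale: str = LOCALE) -> str:
    """Hypothetical helper: build a prompt fragment from the locale setting."""
    if not locale.strip():
        return ""  # no instruction; the model responds in English by default
    return f"Always respond in the language specified by the locale '{locale.strip()}'.\n"


# Example: with LOCALE="fr-FR" the instruction asks the model to answer in French.
print(repr(get_language_instruction("fr-FR")))
print(repr(get_language_instruction("")))  # empty locale adds no instruction
```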