Migration from Docker to Standalone Python Server (#73)
* Migration from docker to standalone server
  - Migration handling
  - Fixed tests
  - Use simpler in-memory storage
  - Support for concurrent logging to disk
  - Simplified direct connections to localhost

* Migration from docker / redis to standalone script
  - Updated tests
  - Updated run script
  - Fixed requirements
  - Use dotenv
  - Ask if user would like to install MCP in Claude Desktop once
  - Updated docs

* More cleanup and references to docker removed

* Cleanup

* Comments

* Fixed tests

* Fix GitHub Actions workflow for standalone Python architecture
  - Install requirements-dev.txt for pytest and testing dependencies
  - Remove Docker setup from simulation tests (now standalone)
  - Simplify linting job to use requirements-dev.txt
  - Update simulation tests to run directly without Docker

  Fixes unit test failures in CI due to missing pytest dependency.

  🤖 Generated with [Claude Code](https://claude.ai/code)
  Co-Authored-By: Claude <noreply@anthropic.com>

* Remove simulation tests from GitHub Actions
  - Removed simulation-tests job that makes real API calls
  - Keep only unit tests (mocked, no API costs) and linting
  - Simulation tests should be run manually with real API keys
  - Reduces CI costs and complexity

  GitHub Actions now only runs:
  - Unit tests (569 tests, all mocked)
  - Code quality checks (ruff, black)

  🤖 Generated with [Claude Code](https://claude.ai/code)
  Co-Authored-By: Claude <noreply@anthropic.com>

* Fixed tests

* Fixed tests

---------

Co-authored-by: Claude <noreply@anthropic.com>
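The "simpler in-memory storage" that replaces Redis for conversation threading could look roughly like the sketch below. This is a hypothetical illustration, not the commit's actual code: the class name, method names, and the lazy-expiry strategy are assumptions; only the ideas (thread-safe in-memory map, conversations expiring after a timeout measured in hours) come from the commit message and `.env.example`.

```python
import threading
import time


class InMemoryConversationStore:
    """Hypothetical sketch of a thread-safe in-memory store standing in
    for Redis. Entries lazily expire after `timeout_hours`, mirroring the
    conversation-timeout setting described in .env.example."""

    def __init__(self, timeout_hours: float = 3.0):
        self._timeout_seconds = timeout_hours * 3600
        self._threads: dict[str, tuple[float, object]] = {}
        self._lock = threading.Lock()  # guard concurrent tool calls

    def set(self, thread_id: str, value: object) -> None:
        with self._lock:
            self._threads[thread_id] = (time.monotonic(), value)

    def get(self, thread_id: str):
        with self._lock:
            entry = self._threads.get(thread_id)
            if entry is None:
                return None
            stored_at, value = entry
            if time.monotonic() - stored_at > self._timeout_seconds:
                del self._threads[thread_id]  # evict expired thread lazily
                return None
            return value
```

Unlike Redis, this state dies with the process, which is acceptable once the server runs as a single standalone script rather than a docker-compose stack.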
Committed by GitHub
parent 9d72545ecd
commit 4151c3c3a5
.env.example | 14
@@ -1,11 +1,6 @@
 # Zen MCP Server Environment Configuration
 # Copy this file to .env and fill in your values
 
-# Required: Workspace root directory for file access
-# This should be the HOST path that contains all files Claude might reference
-# Defaults to $HOME for direct usage, auto-configured for Docker
-WORKSPACE_ROOT=/Users/your-username
-
 # API Keys - At least one is required
 #
 # IMPORTANT: Use EITHER OpenRouter OR native APIs (Gemini/OpenAI), not both!
@@ -27,10 +22,7 @@ XAI_API_KEY=your_xai_api_key_here
 OPENROUTER_API_KEY=your_openrouter_api_key_here
 
 # Option 3: Use custom API endpoints for local models (Ollama, vLLM, LM Studio, etc.)
-# IMPORTANT: Since this server ALWAYS runs in Docker, you MUST use host.docker.internal instead of localhost
-# ❌ WRONG: http://localhost:11434/v1 (Docker containers cannot reach localhost)
-# ✅ CORRECT: http://host.docker.internal:11434/v1 (Docker can reach host services)
-# CUSTOM_API_URL=http://host.docker.internal:11434/v1  # Ollama example (NOT localhost!)
+# CUSTOM_API_URL=http://localhost:11434/v1  # Ollama example
 # CUSTOM_API_KEY=                                      # Empty for Ollama (no auth needed)
 # CUSTOM_MODEL_NAME=llama3.2                           # Default model name
 
@@ -95,9 +87,7 @@ DEFAULT_THINKING_MODE_THINKDEEP=high
 # Override the default location of custom_models.json
 # CUSTOM_MODELS_CONFIG_PATH=/path/to/your/custom_models.json
 
-# Optional: Redis configuration (auto-configured for Docker)
-# The Redis URL for conversation threading - typically managed by docker-compose
-# REDIS_URL=redis://redis:6379/0
+# Note: Redis is no longer used - conversations are stored in memory
 
 # Optional: Conversation timeout (hours)
 # How long AI-to-AI conversation threads persist before expiring
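The commit message mentions switching to dotenv for reading this `.env` file. The snippet below sketches what that loading step does using only the standard library, so the parsing rules are visible; in the real project this would be python-dotenv's `load_dotenv()`, and the helper name here is a hypothetical stand-in.

```python
import os


def load_env_file(path: str = ".env") -> None:
    """Minimal stand-in for python-dotenv's load_dotenv(): read KEY=VALUE
    lines, skip blanks and # comments, and set variables only if they are
    not already present in the environment."""
    try:
        with open(path) as fh:
            for raw in fh:
                line = raw.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # a missing .env is fine; defaults below still apply


load_env_file()  # no-op when .env is absent

# With the standalone server, a plain localhost URL works directly;
# the host.docker.internal indirection removed in this diff is no
# longer needed.
api_url = os.getenv("CUSTOM_API_URL", "http://localhost:11434/v1")
model_name = os.getenv("CUSTOM_MODEL_NAME", "llama3.2")
```

Because `setdefault` is used, values already exported in the shell win over the file, which matches the usual dotenv convention of not overriding existing environment variables.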