Lovdata Chat Interface
A web-based chat interface that allows users to interact with Large Language Models (LLMs) equipped with Norwegian legal research tools from the Lovdata MCP server.
Overview
This project creates a chat interface where users can:
- Choose from multiple LLM providers (OpenAI, Anthropic, Google Gemini)
- Have conversations enhanced with Norwegian legal document search capabilities
- Access laws, regulations, and legal provisions through AI-powered semantic search
- Receive properly cited legal information with cross-references
Architecture
Backend (FastAPI)
- LLM Provider Layer: Abstract interface supporting multiple LLM providers with tool calling
- MCP Integration: Client connection to lovdata-ai MCP server
- Skill System: Norwegian legal research guidance and best practices
- Chat Management: Conversation history, streaming responses, session management
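The provider layer described above could be sketched as an abstract interface. This is a minimal illustration, not the project's actual API: the names `LLMProvider`, `stream_chat`, and `ChatChunk` are assumptions, and `EchoProvider` exists only to show how a concrete provider would plug in.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Iterator

@dataclass
class ToolCall:
    """A tool invocation requested by the model."""
    name: str
    arguments: dict

@dataclass
class ChatChunk:
    """One streamed piece of a model response."""
    text: str = ""
    tool_calls: list[ToolCall] = field(default_factory=list)

class LLMProvider(ABC):
    """Abstract base each concrete provider (OpenAI, Anthropic,
    Gemini) would implement. Method and class names are illustrative."""

    @abstractmethod
    def stream_chat(self, messages: list[dict], tools: list[dict]) -> Iterator[ChatChunk]:
        """Yield response chunks, including any tool calls."""

class EchoProvider(LLMProvider):
    """Trivial stand-in provider used here only to exercise the interface."""
    def stream_chat(self, messages, tools):
        for word in messages[-1]["content"].split():
            yield ChatChunk(text=word + " ")
```

Provider switching then reduces to swapping which `LLMProvider` instance the chat manager holds.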
Frontend (Next.js)
- Chat Interface: Real-time messaging with streaming responses
- Model Selector: Dropdown to choose LLM provider and model
- Tool Visualization: Display when legal tools are being used
- Citation Rendering: Properly formatted legal references and cross-references
External Dependencies
- Lovdata MCP Server: Provides 15+ tools for Norwegian legal research
- PostgreSQL Database: Vector embeddings for semantic search
- LLM APIs: OpenAI, Anthropic, Google Gemini (with API keys)
Supported LLM Providers
| Provider | Models | Tool Support | Notes |
|---|---|---|---|
| OpenAI | GPT-4, GPT-4o | ✅ Native | Requires API key |
| Anthropic | Claude-3.5-Sonnet | ✅ Native | Requires API key |
| Google | Gemini-1.5-Pro | ✅ Function calling | Requires API key |
| Local | Ollama models | ⚠️ Limited | Self-hosted option |
MCP Tools Available
The interface integrates all tools from the lovdata-ai MCP server:
Law Document Tools
- get_law: Retrieve specific laws by ID or title
- list_laws: Browse laws with filtering and pagination
- get_law_content: Get HTML content of laws
- get_law_text: Get plain text content
Search Tools
- search_laws_fulltext: Full-text search in laws
- search_laws_semantic: Semantic search using vector embeddings
- search_provisions_fulltext: Full-text search in provisions
- search_provisions_semantic: Semantic search in provisions
Provision Tools
- get_provision: Get individual legal provisions
- list_provisions: List all provisions in a law
- get_provisions_batch: Bulk retrieval for RAG applications
Reference Tools
- get_cross_references: Find references from/to provisions
- resolve_reference: Parse legal reference strings (e.g., "lov/2014-06-20-42/§8")
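To illustrate the reference format that resolve_reference consumes, here is a sketch of a parser for strings like "lov/2014-06-20-42/§8". The actual resolve_reference tool lives in the MCP server and may behave differently; the field names in the returned dict are assumptions.

```python
import re
from typing import Optional

def parse_lovdata_reference(ref: str) -> Optional[dict]:
    """Parse a Lovdata-style reference such as 'lov/2014-06-20-42/§8'
    into its components. Illustrative only: the real resolve_reference
    tool on the MCP server defines the authoritative behavior.
    """
    match = re.fullmatch(r"(lov|forskrift)/(\d{4}-\d{2}-\d{2}-\d+)(?:/§(\S+))?", ref)
    if not match:
        return None
    doc_type, law_id, provision = match.groups()
    return {
        "type": doc_type,        # 'lov' (law) or 'forskrift' (regulation)
        "law_id": law_id,        # date-and-number identifier
        "provision": provision,  # section number, or None if absent
    }
```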
Skills Integration
The system loads Norwegian legal research skills that ensure:
- Proper citation standards (Lovdata URL formatting)
- Appropriate legal terminology usage
- Clear distinction between information and legal advice
- Systematic amendment tracking
- Cross-reference analysis
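One plausible way to load such skills and apply them to the system prompt is sketched below. The on-disk layout (a directory of Markdown skill files) and both function names are assumptions for illustration, not the project's actual skill format.

```python
from pathlib import Path

def load_skills(skill_dir: str) -> str:
    """Concatenate skill files (assumed to be Markdown guidance documents,
    e.g. citation rules) into one system-prompt section, in stable order."""
    parts = []
    for path in sorted(Path(skill_dir).glob("*.md")):
        parts.append(f"## Skill: {path.stem}\n{path.read_text(encoding='utf-8').strip()}")
    return "\n\n".join(parts)

def build_system_prompt(base_prompt: str, skills: str) -> str:
    """Attach the loaded skills to the base system prompt."""
    return f"{base_prompt}\n\n{skills}" if skills else base_prompt
```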
Implementation Plan
Phase 1: Core Infrastructure
- Project Structure Setup
  - Create backend (FastAPI) and frontend (Next.js) directories
  - Set up Python virtual environment and Node.js dependencies
  - Configure development tooling (linting, testing, formatting)
- LLM Provider Abstraction
  - Create abstract base class for LLM providers
  - Implement OpenAI, Anthropic, and Google Gemini clients
  - Add tool calling support and response streaming
  - Implement provider switching logic
- MCP Server Integration
  - Build MCP client to connect to lovdata-ai server
  - Create tool registry and execution pipeline
  - Add error handling and retry logic
  - Implement tool result formatting for LLM consumption
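The retry logic planned for the tool execution pipeline could look like the sketch below: retry transient failures with exponential backoff, then re-raise. The `execute` callable stands in for the real MCP client call, whose API this sketch does not assume.

```python
import time

def call_tool_with_retry(execute, name, arguments, retries=3, base_delay=0.5):
    """Call an MCP tool via `execute(name, arguments)`, retrying with
    exponential backoff on transient connection failures. `execute` is
    a hypothetical wrapper around the MCP client; the real API may differ.
    """
    for attempt in range(retries):
        try:
            return execute(name, arguments)
        except ConnectionError:
            if attempt == retries - 1:
                raise  # exhausted: surface the error to the chat layer
            time.sleep(base_delay * (2 ** attempt))
```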
Phase 2: Chat Functionality
- Backend API Development
  - Create chat session management endpoints
  - Implement conversation history storage
  - Add streaming response support
  - Build health check and monitoring endpoints
- Skill System Implementation
  - Create skill loading and parsing system
  - Implement skill application to LLM prompts
  - Add skill validation and error handling
  - Create skill management API endpoints
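The session management and history storage planned above might start from an in-memory store like this sketch; a real deployment would persist sessions (e.g. in a database). All class and method names are illustrative assumptions.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class ChatSession:
    session_id: str
    messages: list[dict] = field(default_factory=list)

class SessionStore:
    """In-memory conversation store behind the chat endpoints.
    Illustrative only; production would back this with persistent storage."""

    def __init__(self):
        self._sessions: dict[str, ChatSession] = {}

    def create(self) -> ChatSession:
        session = ChatSession(session_id=uuid.uuid4().hex)
        self._sessions[session.session_id] = session
        return session

    def append(self, session_id: str, role: str, content: str) -> None:
        self._sessions[session_id].messages.append({"role": role, "content": content})

    def history(self, session_id: str, limit=None) -> list[dict]:
        """Return the conversation, optionally only the last `limit`
        messages (supports the history-pagination plan)."""
        messages = self._sessions[session_id].messages
        return messages[-limit:] if limit else messages
```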
Phase 3: Frontend Development
- Chat Interface
  - Build responsive chat UI with message history
  - Implement real-time message streaming
  - Add message formatting for legal citations
  - Create conversation management (new chat, clear history)
- Model Selection UI
  - Create LLM provider and model selector
  - Add API key management (secure storage)
  - Implement model switching during conversations
  - Add model capability indicators
- Tool Usage Visualization
  - Display when MCP tools are being used
  - Show tool execution results in chat
  - Add legal citation formatting
  - Create expandable tool result views
Phase 4: Deployment & Production
- Containerization
  - Create Dockerfiles for backend and frontend
  - Set up Docker Compose for development
  - Configure production Docker Compose
  - Add environment variable management
- Deployment Configuration
  - Set up CI/CD pipeline (GitHub Actions)
  - Configure cloud deployment (Railway/Render)
  - Add reverse proxy configuration
  - Implement SSL certificate management
- Monitoring & Error Handling
  - Add comprehensive logging
  - Implement error tracking and reporting
  - Create health check endpoints
  - Add rate limiting and abuse protection
- Documentation
  - Create setup and deployment guides
  - Document API endpoints
  - Add user documentation
  - Create troubleshooting guides
Development Setup
Prerequisites
- Python 3.12+
- Node.js 18+
- Docker and Docker Compose
- API keys for desired LLM providers
Local Development
# Clone and setup
git clone <repository>
cd lovdata-chat
# Backend setup
cd backend
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
# Frontend setup
cd ../frontend
npm install
# Start development servers
docker-compose -f docker-compose.dev.yml up
Environment Variables
# Backend
LOVDATA_MCP_URL=http://localhost:8001
OPENAI_API_KEY=your_key_here
ANTHROPIC_API_KEY=your_key_here
GOOGLE_API_KEY=your_key_here
# Frontend
NEXT_PUBLIC_API_URL=http://localhost:8000
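The backend could validate these variables at startup roughly as follows. The policy shown (the MCP URL is required, and at least one provider key must be set) is an assumption for illustration; `load_config` is a hypothetical helper, not existing project code.

```python
import os

REQUIRED = ["LOVDATA_MCP_URL"]
PROVIDER_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GOOGLE_API_KEY"]

def load_config(env=None) -> dict:
    """Read backend configuration from the environment, failing fast on
    missing settings. The exact policy here is an assumption."""
    env = os.environ if env is None else env
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")
    providers = {k: env[k] for k in PROVIDER_KEYS if env.get(k)}
    if not providers:
        raise RuntimeError("Configure at least one LLM provider API key")
    return {"mcp_url": env["LOVDATA_MCP_URL"], "provider_keys": providers}
```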
Deployment Options
Cloud Deployment (Recommended)
- Frontend: Vercel or Netlify
- Backend: Railway, Render, or Fly.io
- Database: Use existing lovdata-ai PostgreSQL instance
Self-Hosted Deployment
- Docker Compose: Full stack containerization
- Reverse Proxy: Nginx or Caddy
- SSL: Let's Encrypt automatic certificates
Security Considerations
- API keys stored securely (environment variables, secret management)
- Rate limiting on chat endpoints
- Input validation and sanitization
- CORS configuration for frontend-backend communication
- Audit logging for legal tool usage
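The rate limiting mentioned above could use a standard token-bucket scheme, sketched here per-client; in practice one bucket would be kept per user or IP. This is one possible implementation, not a prescribed design.

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter, one way to implement the
    planned rate limiting on chat endpoints."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        """Refill based on elapsed time, then spend one token if available."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```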
Performance Optimization
- Response streaming for real-time chat experience
- MCP tool result caching
- Conversation history pagination
- Lazy loading of legal document content
- CDN for static frontend assets
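The MCP tool result caching listed above could start from a small TTL cache like this sketch, so repeated lookups of the same law text within a conversation skip the round trip to the server. The class name and interface are illustrative assumptions.

```python
import time

class TTLCache:
    """Small time-based cache for MCP tool results (e.g. fetched law
    texts). Entries expire after `ttl_seconds` and are refetched."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (stored_at, value)

    def get_or_fetch(self, key, fetch):
        """Return the cached value for `key`, or call `fetch()` and cache it."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and now - entry[0] < self.ttl:
            return entry[1]
        value = fetch()
        self._store[key] = (now, value)
        return value
```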
Future Enhancements
- User authentication and conversation persistence
- Advanced citation management and export
- Integration with legal research workflows
- Multi-language support beyond Norwegian
- Advanced analytics and usage tracking
Contributing
- Follow the implementation plan phases
- Ensure comprehensive testing for LLM integrations
- Document API changes and new features
- Maintain security best practices for API key handling
Status: Planning phase complete. Ready for implementation.
Next Steps: Begin with Phase 1 - Project Structure Setup