# MCP Backend OpenRouter

A high-performance chatbot platform connecting MCP servers with LLM APIs for intelligent tool execution.
## 🚀 Quick Start

Connect: `ws://localhost:8000/ws/chat`
## 📡 WebSocket API

### Send Messages

### Receive Responses

### Message Types

| Type | Purpose | Payload |
| --- | --- | --- |
|  | Send user input |  |
|  | Start new session |  |
|  | AI response |  |
|  | Tool status |  |
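The exact message type names and payload schemas are not preserved in this README, so the sketch below uses hypothetical `user_message` and `assistant_response` types purely for illustration. The serialization helper is standard library only; the connection itself needs the third-party `websockets` package and a running server:

```python
import asyncio
import json

# NOTE: the "type" values used here are hypothetical — the real type names
# are defined by the server's Pydantic models, not shown in this README.
def build_message(msg_type: str, payload: dict) -> str:
    """Serialize a chat message for the ws://localhost:8000/ws/chat endpoint."""
    return json.dumps({"type": msg_type, **payload})

async def chat(prompt: str) -> None:
    # Requires the third-party `websockets` package (pip install websockets).
    import websockets  # imported lazily so the helper above stays stdlib-only

    async with websockets.connect("ws://localhost:8000/ws/chat") as ws:
        await ws.send(build_message("user_message", {"content": prompt}))
        async for raw in ws:
            event = json.loads(raw)
            if event.get("type") == "assistant_response":
                print(event.get("content", ""), end="", flush=True)

if __name__ == "__main__":
    asyncio.run(chat("Hello!"))
```

Check the actual type and field names against the server's message models before relying on this shape.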
## ⚙️ Configuration

### Essential Settings (`src/runtime_config.yaml`)
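The README does not reproduce the file's contents, so the keys below are illustrative assumptions inferred from the features listed (provider selection, streaming), not the actual schema:

```yaml
# src/runtime_config.yaml — illustrative sketch only; key names are assumptions
llm:
  provider: openrouter        # one of: openrouter, openai, groq
  model: openai/gpt-4o-mini
  streaming: true
```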
### MCP Servers (`servers_config.json`)
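The repository's actual file is not shown here; the sketch below follows the `mcpServers` convention common to MCP client configs, with a hypothetical filesystem server entry — confirm the shape against the project's sample file:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```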
## 🔧 Performance Tuning

### Streaming Optimization

### HTTP/2 Support
## 🛠️ Development

### Code Standards

- Use `uv` for package management
- Pydantic for data validation
- Type hints required
- Fail-fast error handling
### Available Scripts

### Code Formatting
## 📁 Project Structure

## 🔑 Environment Variables
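The README names the supported providers but not the exact variable names; the names below are assumptions following each provider's usual convention — verify them against the project's configuration-loading code:

```shell
# Variable names are assumptions based on provider conventions — check the
# project's configuration loading before relying on them.
export OPENROUTER_API_KEY="sk-or-..."
export OPENAI_API_KEY="sk-..."
export GROQ_API_KEY="gsk_..."
```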
## 🚨 Troubleshooting

### Common Issues
| Problem | Solution |
| --- | --- |
| Configuration not updating | Check file permissions on `src/runtime_config.yaml` |
| WebSocket connection fails | Verify server is running and port is correct |
| MCP server errors | Check `servers_config.json` and server availability |
| LLM API issues | Verify API keys and model configuration |
### Debug Mode
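The README doesn't show how debug mode is enabled; a generic stdlib approach, assuming the server uses Python's `logging` module (the `websockets` logger name is likewise an assumption):

```python
import logging

# Turn on verbose logging for the whole process. The logger names used by the
# server itself are not documented here, so the root logger is configured.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
    force=True,  # replace any handlers installed earlier in the process
)
logging.getLogger("websockets").setLevel(logging.INFO)  # tame noisy libraries
```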
### Component Testing
## ✅ Features

- **Full MCP Protocol** - Tools, prompts, resources
- **High Performance** - SQLite with WAL mode, optimized indexes
- **Real-time Streaming** - WebSocket with delta persistence
- **Multi-Provider** - OpenRouter (100+ models), OpenAI, Groq
- **Type Safe** - Pydantic validation throughout
- **Dynamic Configuration** - Runtime changes without restart
- **Auto-Persistence** - Automatic conversation storage
## 📚 Quick Reference

| Command | Purpose |
| --- | --- |
|  | Start the platform |
|  | Reset to default config |
| Edit `src/runtime_config.yaml` | Change settings (auto-reload) |
| Edit `servers_config.json` | Configure MCP servers |
## 🆘 Support

- Check logs for detailed error messages
- Verify configuration syntax with a YAML validator
- Test individual components in isolation
- Monitor WebSocket connections and database size

**Requirements:** Python 3.13+, `uv` package manager