MCP Platform

by jck411

High-performance Model Context Protocol (MCP) client with WebSocket API, SQLite storage, and support for OpenRouter, OpenAI, and Groq.

🚀 Quick Start

```bash
# Install and run
uv sync
export OPENROUTER_API_KEY="your_key_here"  # or OPENAI_API_KEY, GROQ_API_KEY
./run.sh
```

Connect: `ws://localhost:8000/ws/chat`

📡 WebSocket API

Send:

```json
{
  "action": "chat",
  "request_id": "unique-id",
  "payload": { "text": "Hello", "streaming": true }
}
```

Receive:

```json
{
  "request_id": "unique-id",
  "status": "chunk",
  "chunk": { "type": "text", "data": "Response...", "metadata": {} }
}
```

⚙️ Configuration

LLM Provider (src/config.yaml)

```yaml
llm:
  active: "openrouter"  # or "openai", "groq"
  providers:
    openrouter:
      model: "openai/gpt-4o-mini"
    openai:
      model: "gpt-4o-mini"
    groq:
      model: "llama-3.3-70b-versatile"
```

MCP Servers (src/servers_config.json)

```json
{
  "mcpServers": {
    "my_server": {
      "enabled": true,
      "command": "python",
      "args": ["path/to/server.py"]
    }
  }
}
```
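A server launched from this config is just a subprocess speaking JSON-RPC over stdio. As a rough illustration only (not the project's actual server, and omitting most of the MCP handshake), a minimal request handler might look like:

```python
import json
import sys

def handle(request: dict) -> dict:
    """Answer a tiny subset of MCP-style JSON-RPC methods (illustrative)."""
    method = request.get("method")
    if method == "initialize":
        result = {"serverInfo": {"name": "my_server", "version": "0.1"}}
    elif method == "tools/list":
        # A single hypothetical tool, for demonstration
        result = {"tools": [{"name": "echo", "description": "Echo back text"}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

def serve(stream_in=sys.stdin, stream_out=sys.stdout):
    """Read newline-delimited JSON-RPC requests and write responses."""
    for line in stream_in:
        line = line.strip()
        if not line:
            continue
        stream_out.write(json.dumps(handle(json.loads(line))) + "\n")
        stream_out.flush()
```

In practice you would build a real server with an MCP SDK rather than hand-rolling the protocol; the sketch just shows what the client is managing under the hood.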

📁 Key Files

  • src/config.yaml - LLM providers and settings
  • src/servers_config.json - MCP server configurations
  • chat_history.db - SQLite database for conversation history
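The WAL mode mentioned under Features is a one-line SQLite pragma. A sketch of how a history database like `chat_history.db` might be initialized follows; the table schema and index here are hypothetical, since the README does not document the real one:

```python
import os
import sqlite3
import tempfile

# Hypothetical initialization of a conversation-history database.
# A temp directory is used so the sketch is self-contained.
db_path = os.path.join(tempfile.mkdtemp(), "chat_history.db")
conn = sqlite3.connect(db_path)
conn.execute("PRAGMA journal_mode=WAL")  # readers don't block the writer
conn.execute("""
    CREATE TABLE IF NOT EXISTS messages (
        id INTEGER PRIMARY KEY,
        request_id TEXT NOT NULL,
        role TEXT NOT NULL,
        content TEXT NOT NULL
    )
""")
# An index on request_id keeps per-conversation lookups fast
conn.execute("CREATE INDEX IF NOT EXISTS idx_messages_request ON messages (request_id)")
conn.commit()
```

WAL suits this workload because streamed deltas are persisted while other connections read history concurrently.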

✅ Features

  • Full MCP Protocol - Tools, prompts, resources
  • High Performance - SQLite with WAL mode, optimized indexes
  • Real-time Streaming - WebSocket with delta persistence
  • Multi-Provider - OpenRouter (100+ models), OpenAI, Groq
  • Type Safe - Pydantic validation throughout

Requirements: Python 3.13+; a request_id is required in every WebSocket message.

