
Meta-Dynamic MCP Server

by umin-ai

A single Model Context Protocol (MCP) proxy that aggregates multiple remote MCP endpoints (via HTTP-stream or SSE) and exposes them through one unified SSE interface.
Ideal for driving a single LLM client (e.g. Claude) while mixing in any number of specialized MCP servers (math, finance, etc.).


🔄 Why Meta-Dynamic vs Direct MCP Configuration

Traditionally, you would list each MCP server directly in your LLM client's mcpServers config. While straightforward, that approach has drawbacks:

  • Tight coupling: Every time you add or remove an MCP endpoint, you must update the client config and restart the LLM process.

  • Multiple connections: The client has to manage separate HTTP/SSE transports for each server, increasing complexity.

  • No shared logic: Common patterns like namespacing, error handling, or retries must be re-implemented in every client.

Meta-Dynamic centralizes these concerns in one proxy:

  • Single endpoint: Your LLM client only talks to http://localhost:8080/sse, regardless of how many backends you add.

  • Dynamic remotes: Remotes are configured in one place (your proxy), decoupled from the LLM; add or remove backends without touching the client.

  • Unified logic: Namespacing, tool/resource aggregation, error handling, and transport selection live in a single codebase, reducing duplication.
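
With the proxy in place, the client's MCP configuration shrinks to a single entry. As an illustrative sketch only (the exact schema varies by client, and the `url`/`transport` keys here are assumptions, not taken from this repo):

```json
{
  "mcpServers": {
    "meta-dynamic": {
      "url": "http://localhost:8080/sse",
      "transport": "sse"
    }
  }
}
```

Swapping backends then becomes a proxy-side change; the client entry above never needs to be edited.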



🔧 Prerequisites

  • Node.js ≥ v16

  • npm (or Yarn)

  • A set of running MCP servers you want to proxy (e.g. a FastMCP math server on http://localhost:8083/mcp, CoinGecko's SSE-based MCP, etc.)


πŸ—οΈ Project Structure

```
meta-dynamic-server/
├── package.json               # scripts & dependencies
├── tsconfig.json              # TypeScript compiler options
├── .gitignore                 # Node & dist ignores
├── README.md                  # this document
└── src/
    ├── index.ts               # bootstrap entrypoint
    └── meta-dynamic-server.ts # core proxy implementation
```

🚀 Installation & Development

  1. Clone & install

    ```
    git clone <repo-url> meta-dynamic-server
    cd meta-dynamic-server
    npm install
    ```

  2. Run in watch mode

    ```
    npm run dev    # uses ts-node-dev to reload on changes
    ```

  3. Build & run

    ```
    npm run build  # compiles to `dist/`
    npm start      # runs compiled `dist/index.js`
    ```

βš™οΈ Configuration: Adding Remotes

Edit src/index.ts to define the list of MCP servers you wish to proxy.
Each remote needs:

  • name: unique alias (used to namespace URIs & tool names)

  • url: full endpoint URL (HTTP-stream endpoints point to /mcp, SSE to the /sse path)

  • transport: either httpStream or sse

```ts
import { MetaDynamicServer } from "./meta-dynamic-server";

const remotes = [
  { name: "math", url: "http://localhost:8083/mcp", transport: "httpStream" },
  { name: "coingecko", url: "https://mcp.api.coingecko.com/sse", transport: "sse" },
  // add more MCP endpoints here
];

new MetaDynamicServer(remotes).start(8080);
```

Note: The proxy exposes an SSE stream on port 8080 by default: http://localhost:8080/sse


📜 How It Works

  1. Remote Initialization: connects to each MCP server using the specified transport.

  2. Request Handlers:

    • resources/list, resources/read → fan-out & namespace by alias

    • tools/list, tools/call → aggregate & route tool invocations

  3. SSE Endpoint: exposes a single SSE stream (/sse) and message POST path (/messages) for any MCP-capable LLM client.
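
The fan-out and routing described above can be sketched in plain TypeScript. This is a simplified illustration, not the repo's actual implementation: the `Remote` interface, its `listTools`/`callTool` methods, and the `alias/tool` naming scheme are assumptions chosen to make the logic self-contained and testable.

```typescript
// Sketch of namespaced tool aggregation and call routing (illustrative only).
// A real proxy would speak MCP over httpStream/SSE; here each remote is a
// plain in-memory object so the aggregation logic can be shown on its own.

type Tool = { name: string };

interface Remote {
  name: string;                              // unique alias, used as namespace prefix
  listTools(): Tool[];
  callTool(name: string, args: unknown): string;
}

// tools/list: fan out to every remote and prefix each tool with its alias.
function aggregateTools(remotes: Remote[]): Tool[] {
  return remotes.flatMap((r) =>
    r.listTools().map((t) => ({ name: `${r.name}/${t.name}` }))
  );
}

// tools/call: split "alias/tool" and route the call to the owning remote.
function routeCall(remotes: Remote[], qualified: string, args: unknown): string {
  const [alias, ...rest] = qualified.split("/");
  const remote = remotes.find((r) => r.name === alias);
  if (!remote) throw new Error(`unknown remote: ${alias}`);
  return remote.callTool(rest.join("/"), args);
}

// Demo with a fake "math" remote.
const math: Remote = {
  name: "math",
  listTools: () => [{ name: "add" }],
  callTool: (name, args) =>
    name === "add" ? String((args as number[]).reduce((a, b) => a + b, 0)) : "?",
};

console.log(aggregateTools([math]).map((t) => t.name)); // [ 'math/add' ]
console.log(routeCall([math], "math/add", [2, 3]));     // 5
```

Prefixing with the remote's alias is what prevents name collisions when two backends expose a tool with the same name, and the same prefix is what lets the proxy route the call back to the right backend.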


🧪 Testing

You can verify connectivity with curl or your LLM's built-in MCP client.
Example with curl to list resources:

```sh
# 1. open an SSE stream:
curl -N http://localhost:8080/sse

# 2. in another shell, send a JSON-RPC request over POST:
curl -X POST http://localhost:8080/messages \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"resources/list"}'
```

🚧 Contributing

  1. Fork the repo

  2. Create a feature branch

  3. Submit a PR with tests/documentation


📄 License

Released under the MIT License. See LICENSE for details.
