An open standard server implementation that enables AI assistants to access APIs and services directly through the Model Context Protocol, built on Cloudflare Workers for scalability.
A modern AI service proxy built with Cloudflare Workers and the Hono framework, supporting multiple AI providers including Anthropic Claude and OpenAI.
Configure environment variables in `.env`, then start the development server:
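The exact setup steps are not shown in this listing; the following is a minimal sketch assuming a standard Wrangler-based workflow and an `.env.example` template (the script and file names are assumptions):

```bash
# Minimal local setup sketch (script and file names are assumptions)
cp .env.example .env   # add provider keys such as ANTHROPIC_API_KEY / OPENAI_API_KEY
npm install            # install dependencies
npm run dev            # typically wraps `wrangler dev` for local development
```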
The server starts in development mode with hot reload enabled.
Deploy to Cloudflare Workers:
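A hedged sketch of the deploy step, assuming the standard Wrangler CLI is used (the `deploy` npm script is an assumption):

```bash
npx wrangler deploy    # publish the Worker to Cloudflare
# or, if package.json defines a script for it:
npm run deploy
```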
GET /health
GET /api/provider
POST /api/mcp
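To illustrate how these routes might be called once deployed, here is a sketch using a placeholder Worker URL; the POST payload shape is an assumption, not taken from this repository:

```bash
# Health check
curl https://<your-worker>.workers.dev/health

# Inspect the configured AI providers
curl https://<your-worker>.workers.dev/api/provider

# Proxy an MCP request to a provider (body shape is illustrative only)
curl -X POST https://<your-worker>.workers.dev/api/mcp \
  -H 'Content-Type: application/json' \
  -d '{"provider": "anthropic", "messages": [{"role": "user", "content": "Hello"}]}'
```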
MIT
remote-capable server
The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.
A modern AI service proxy that interacts with multiple AI providers (Anthropic Claude, OpenAI) through a unified API, deployed globally on Cloudflare Workers.
We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/quang-pham-dev/my-mcp-server'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.