FastAPI MCP OpenAPI
Why use this?
- Zero-effort LLM/MCP integration: Expose your FastAPI endpoints to AI agents, Cursor, GitHub Copilot, and other dev tools with a single line of code.
- Seamless API doc connectivity: Tools like Cursor and Copilot can instantly discover and use your API docs for autocompletion, endpoint introspection, and more during app development.
- Full OpenAPI support: Get detailed, resolved OpenAPI docs for every endpoint—no more guessing what your API does.
- Streamable, modern protocol: Implements the latest MCP Streamable HTTP transport for real-time, agent-friendly workflows.
- Security by default: CORS, origin validation, and session management out of the box.
- Production-ready: Used in real-world AI agent stacks and developer tools.
A FastAPI library that provides Model Context Protocol (MCP) tools for endpoint introspection and OpenAPI documentation. This library allows AI agents to discover and understand your FastAPI endpoints through MCP.
Features
- Endpoint Discovery: Lists all available FastAPI endpoints with metadata
- OpenAPI Documentation: Provides detailed OpenAPI schema for specific endpoints with fully resolved inline schemas
- Clean Output: Removes unnecessary references and fields for minimal context usage
- MCP Streamable HTTP Transport: Full compatibility with the latest MCP protocol (2025-03-26)
- Easy Integration: Simple mounting system similar to fastapi-mcp
- Security: Built-in CORS protection and origin validation
- Focused Tool Set: Only provides tools capability - resources and prompts endpoints are disabled
🚀 Try it in 5 minutes
Create a file called `main.py`:
Run it:
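Assuming the app lives in `main.py` and `uvicorn` is installed:

```shell
uvicorn main:app --reload --port 8000
```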
Your MCP server is now available at http://localhost:8000/mcp and provides two tools:
- listEndpoints: Get all available endpoints (excluding MCP endpoints)
- getEndpointDocs: Get detailed OpenAPI documentation for a specific endpoint
Caution
All API endpoints must be fully typed with Pydantic models or FastAPI's native types to ensure proper OpenAPI generation and MCP compatibility.
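For instance, a response model declared with Pydantic gives FastAPI everything it needs to emit a complete OpenAPI schema (model and field names here are illustrative):

```python
from pydantic import BaseModel

class UserOut(BaseModel):
    """Fully typed response body, so the generated OpenAPI schema
    documents every field for MCP consumers."""
    id: int
    name: str

# In a FastAPI app you would attach it via:
#   @app.get("/users/{user_id}", response_model=UserOut)

example = UserOut(id=1, name="Ada")
```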
Configuration
Constructor Parameters
- `app`: The FastAPI application to introspect
- `mount_path`: Path where the MCP server will be mounted (default: `"/mcp"`)
- `server_name`: Name of the MCP server (default: `"fastapi-openapi-mcp"`)
- `server_version`: Version of the MCP server (default: `"0.1.0"`)
- `section_name`: Name of the documentation section for MCP endpoints (default: `"mcp"`)
- `list_endpoints_tool_name`: Name of the list-endpoints tool (default: `"listEndpoints"`)
- `get_endpoint_docs_tool_name`: Name of the get-endpoint-docs tool (default: `"getEndpointDocs"`)
Example with Custom Configuration
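One way to sketch the configuration (the keyword arguments mirror the parameter list above, but the class name is an assumption; verify both against the package docs):

```python
# Hypothetical constructor call -- the class name is assumed:
#
#   FastAPIMCPOpenAPI(app, **mcp_kwargs)

mcp_kwargs = dict(
    mount_path="/internal/mcp",              # default: "/mcp"
    server_name="orders-api-mcp",            # default: "fastapi-openapi-mcp"
    server_version="1.2.0",                  # default: "0.1.0"
    section_name="mcp",
    list_endpoints_tool_name="listEndpoints",
    get_endpoint_docs_tool_name="getEndpointDocs",
)
```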
MCP Tools
1. listEndpoints
Lists all available FastAPI endpoints with their metadata.
Input: No parameters required
Output: JSON array of endpoint information including:
- `path`: The endpoint path
- `methods`: Array of HTTP methods
- `name`: Endpoint name
- `summary`: Endpoint summary from the docstring
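The exact payload depends on your app, but a result could plausibly look like this (illustrative data only):

```python
import json

# Hypothetical listEndpoints result for a one-endpoint app.
response_text = """
[
  {
    "path": "/users/{user_id}",
    "methods": ["GET"],
    "name": "get_user",
    "summary": "Fetch a single user by id."
  }
]
"""

endpoints = json.loads(response_text)
```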
2. getEndpointDocs
Get detailed OpenAPI documentation for a specific endpoint.
Input:
- `endpoint_path` (required): The path of the endpoint (e.g., `"/users/{user_id}"`)
- `method` (optional): The HTTP method (default: `"GET"`)
Output: JSON object with detailed OpenAPI information including:
- `path`: The endpoint path
- `method`: The HTTP method
- `operation`: OpenAPI operation details with fully resolved schemas
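Over MCP, invoking a tool is a standard JSON-RPC `tools/call` request. A sketch of the message an MCP client would send, with the tool and argument names taken from the lists above:

```python
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "getEndpointDocs",
        "arguments": {
            "endpoint_path": "/users/{user_id}",
            "method": "GET",  # optional; defaults to "GET"
        },
    },
}

payload = json.dumps(request)
```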
Transport Support
This library implements the latest MCP Streamable HTTP transport (protocol version 2025-03-26) which:
- Uses a single HTTP endpoint for both requests and responses
- Supports both immediate JSON responses and Server-Sent Events (SSE) streaming
- Provides backward compatibility with older MCP clients
- Includes proper session management with unique session IDs
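As a sketch, a Streamable HTTP client opens a session with a single POST carrying a JSON-RPC `initialize` request, advertising both response styles via the `Accept` header (header and field names follow the MCP specification; the values here are illustrative):

```python
import json

headers = {
    "Content-Type": "application/json",
    # The client must accept both an immediate JSON response and an
    # SSE stream on the same endpoint.
    "Accept": "application/json, text/event-stream",
}

initialize = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.0.1"},
    },
}

body = json.dumps(initialize)
```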
Security
The library includes built-in security features:
- Origin Header Validation: Prevents DNS rebinding attacks
- CORS Configuration: Configured for localhost development by default
- Session Management: Proper MCP session handling with unique IDs
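A minimal sketch of the origin check (the allowlist values are illustrative; the library's actual defaults may differ):

```python
ALLOWED_ORIGINS = {
    "http://localhost:8000",
    "http://127.0.0.1:8000",
}

def is_allowed_origin(origin):
    """Return True only for origins on the allowlist.

    Rejecting unknown Origin headers is what blocks DNS-rebinding
    attacks: a malicious page served from another origin cannot make
    the browser talk to your local MCP server.
    """
    return origin in ALLOWED_ORIGINS
```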
For production use, make sure to:
- Configure appropriate CORS origins
- Implement proper authentication if needed
- Bind to localhost (127.0.0.1) for local instances
Integration with AI Agents
This library is designed to work with AI agents and MCP clients like:
- Claude Desktop (via mcp-remote)
- VS Code extensions with MCP support
- Custom MCP clients
Example client configuration for VS Code Copilot:
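An MCP server entry along these lines can be added to the client's configuration (the exact schema varies by client and version, so treat the keys as an assumption and check your client's MCP documentation):

```json
{
  "servers": {
    "fastapi-openapi": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
```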
Development
Setting up the development environment
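A typical setup, assuming a standard pip-based workflow (the `dev` extras name is a guess):

```shell
git clone <repository-url>
cd fastapi-mcp-openapi
python -m venv .venv && source .venv/bin/activate
pip install -e ".[dev]"
```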
Running tests
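Assuming the project uses pytest:

```shell
pytest
```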
Code Quality
This project uses several tools to maintain code quality:
Building the package
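With the standard `build` frontend:

```shell
python -m build   # produces the sdist and wheel under dist/
```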
Publishing
See PUBLISHING.md for detailed instructions on setting up automated PyPI publishing with GitHub Actions.
Using locally built package
To use the locally built package in another FastAPI project, you can either install it from the wheel file or in editable mode.
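For example (the exact wheel filename will differ depending on the built version):

```shell
# From the built wheel:
pip install /path/to/fastapi-mcp-openapi/dist/<wheel-file>.whl

# Or in editable mode, pointing at the repository checkout:
pip install -e /path/to/fastapi-mcp-openapi
```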
Example Applications
This repository includes two example applications in the `examples/` folder demonstrating the library's features:
Simple Example (`examples/simple_example.py`)
A minimal example showing basic integration:
This creates a basic FastAPI app with a few endpoints and shows the MCP integration info on startup.
Complete Example (`examples/advanced_example.py`)
A comprehensive example with multiple endpoint types, Pydantic models, and full CRUD operations:
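Either example can be launched directly, assuming each script starts its own server when run (check the scripts themselves for the exact invocation):

```shell
python examples/simple_example.py
python examples/advanced_example.py
```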
After running either application, its interactive API docs are available at http://localhost:8000/docs.
Use the MCP Inspector to explore the MCP endpoints and tools: connect to the MCP server at http://localhost:8000/mcp with transport type Streamable HTTP.
License
MIT License
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.