# Streamable HTTP MCP Server with Azure OpenAI GPT-4o
This project implements a streamable HTTP MCP (Model Context Protocol) server using FastAPI and integrates it with Azure OpenAI GPT-4o for intelligent tool usage.
## 🚀 Features
- MCP Server: Streamable HTTP server with SSE support
- Azure OpenAI Integration: GPT-4o with tool calling capabilities
- Simple Tools: Calculator, Weather (mock), and Time tools
- Docker Setup: Easy deployment with docker-compose
- Real-time Communication: Server-Sent Events (SSE) for streaming responses
## 📋 Prerequisites
- Docker and Docker Compose
- Azure OpenAI account with GPT-4o deployment
- Python 3.11+ (for local development)
## 🛠️ Quick Setup

1. Clone and set up the project
2. Configure your Azure OpenAI credentials
3. Start the MCP server
4. Run the GPT-4o client
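Sketched as shell commands; the repository URL, `.env.example`, and the client script name are placeholders, not taken from this README:

```shell
# 1. Clone and set up (URL is a placeholder)
git clone <your-repo-url>
cd <your-repo-directory>

# 2. Configure Azure OpenAI credentials (see Security Notes)
cp .env.example .env    # then fill in endpoint, API key, and deployment name

# 3. Start the MCP server (listens on port 8000)
docker-compose up -d

# 4. Run the GPT-4o client (script name is a placeholder)
python client.py
```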
## 📖 Usage Examples

### Basic Tool Usage
The client automatically demonstrates various tool interactions:
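For example, prompts like these (hypothetical examples matching the calculator, weather, and time tools) would each trigger a tool call:

```
"What is (15 + 27) * 3?"
"What's the weather like in Tokyo?"
"What time is it right now?"
```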
### Complex Multi-tool Usage
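A single request can chain several tool calls in one turn; a hypothetical example:

```
"Get the weather in Paris and London, calculate the difference between
their temperatures, and tell me the current time."
```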
## 🔧 Available Tools
- Calculator: Evaluate mathematical expressions
- Weather: Get mock weather data for any location
- Time: Get current timestamp
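As an illustration of what the calculator tool does, here is a sketch of a safe expression evaluator; it is not the project's actual implementation, just one common way to evaluate arithmetic without `eval()`:

```python
import ast
import operator

# Whitelisted operators: anything outside this table is rejected,
# which is what makes this safer than calling eval() on user input.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_calculate(expression: str) -> float:
    """Evaluate a basic arithmetic expression by walking its AST."""
    def _eval(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval").body)

print(safe_calculate("(15 + 27) * 3"))  # → 126
```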
## 🏗️ Architecture
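At a high level, the pieces described in this README fit together roughly like this (a sketch, not a generated diagram):

```
Azure OpenAI (GPT-4o)
        ^
        | chat completions + tool calls
        v
  GPT-4o client
        ^
        | HTTP POST /sse (JSON-RPC, streamed via Server-Sent Events)
        v
FastAPI MCP server (port 8000)
  ├── Calculator tool
  ├── Weather tool (mock)
  └── Time tool
```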
## 🌐 API Endpoints

- `POST /sse`: Main MCP communication endpoint
- `GET /health`: Health check
- `GET /tools`: List available tools
## 📝 Manual Testing
Test the MCP server directly:
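For example, using the endpoints listed above; the JSON-RPC body is a sketch of a standard MCP `tools/list` request:

```shell
# Health check
curl http://localhost:8000/health

# List the registered tools
curl http://localhost:8000/tools

# Send an MCP JSON-RPC request to the SSE endpoint
curl -X POST http://localhost:8000/sse \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'
```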
## 🐳 Docker Commands
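Standard docker-compose commands that apply to this setup:

```shell
docker-compose up -d       # build and start the MCP server in the background
docker-compose logs -f     # follow server logs (useful for Troubleshooting)
docker-compose restart     # restart after code or .env changes
docker-compose down        # stop and remove the containers
```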
## 🧪 Development
### Local Development
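A minimal local run might look like this; the `requirements.txt` file and the `mcp_server:app` module/app names are assumptions, not confirmed by this README:

```shell
# Install dependencies (requirements.txt is assumed to exist)
pip install -r requirements.txt

# Run the FastAPI server locally on port 8000
uvicorn mcp_server:app --host 0.0.0.0 --port 8000
```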
### Adding New Tools
1. Create a new tool class in `mcp_server.py`
2. Add a tool definition to the `TOOLS` dictionary
3. Add a handler in `MCPHandler.handle_tools_call`

Example:
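A sketch of the three steps above, assuming `TOOLS` holds JSON-Schema-style tool definitions and the handler dispatches on tool name; the exact structures in `mcp_server.py` may differ, and the "reverse" tool is hypothetical:

```python
# 1. A new tool class (hypothetical "reverse text" tool)
class ReverseTool:
    name = "reverse"
    description = "Reverse a string"

    def run(self, text: str) -> str:
        return text[::-1]

# 2. A tool definition added to the TOOLS dictionary
#    (JSON-Schema style, mirroring how MCP advertises tools to clients)
TOOLS = {
    "reverse": {
        "name": "reverse",
        "description": "Reverse a string",
        "inputSchema": {
            "type": "object",
            "properties": {"text": {"type": "string"}},
            "required": ["text"],
        },
    },
}

# 3. Dispatch logic, as it might appear in MCPHandler.handle_tools_call
def handle_tools_call(tool_name: str, arguments: dict) -> str:
    if tool_name == "reverse":
        return ReverseTool().run(arguments["text"])
    raise ValueError(f"Unknown tool: {tool_name}")

print(handle_tools_call("reverse", {"text": "hello"}))  # → olleh
```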
## 🐛 Troubleshooting

### Common Issues
- Connection refused: Make sure the MCP server is running on port 8000
- Authentication errors: Check your Azure OpenAI credentials in `.env`
- Tool call failures: Check the MCP server logs for detailed error messages
### Debug Mode
Enable debug logging:
## 🔒 Security Notes

- Never commit your `.env` file with real credentials
- Use environment variables in production
- Consider adding authentication for production deployments
## 📄 License
MIT License - feel free to use and modify as needed.