# Fledge MCP Server

This is a Model Context Protocol (MCP) server that connects Fledge functionality to Cursor AI, allowing the AI to interact with Fledge instances via natural language commands.
## Prerequisites

- Fledge installed locally or accessible via API (default: http://localhost:8081)
- Cursor AI installed
- Python 3.8+
## Installation

1. Clone this repository.
2. Install the dependencies.
## Running the Server

1. Make sure Fledge is running.
2. Start the MCP server.
3. For secure operation with API key authentication, start the secure server (`secure_mcp_server.py`) instead.
4. Verify it's working by accessing the health endpoint.

You should receive "Fledge MCP Server is running" as the response.
## Connecting to Cursor

1. In Cursor, go to Settings > MCP Servers.
2. Add a new server.
3. Tools file: upload the included `tools.json` or point to its local path.
4. For the secure server, configure the "X-API-Key" header with the value from the `api_key.txt` file that is generated when the secure server starts.
5. Test it: open Cursor's Composer (Ctrl+I) and type "Check if Fledge API is reachable"; the AI should call the `validate_api_connection` tool.
## Available Tools

### Data Access and Management

- `get_sensor_data`: Fetch sensor data from Fledge, with optional filtering by time range and limit
- `list_sensors`: List all sensors available in Fledge
- `ingest_test_data`: Ingest test data into Fledge, with an optional batch count
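As a sketch of how a client might prepare a call to the first of these tools, here is a small payload builder. The endpoint shape and the parameter names `sensor_id`, `limit`, and `time_range` are assumptions inferred from the tool descriptions above; check `tools.json` for the real schema.

```python
import json

def build_get_sensor_data_request(sensor_id, limit=None, time_range=None):
    """Build the JSON payload for a hypothetical get_sensor_data call.

    Parameter names here are assumptions based on the tool description;
    the authoritative schema lives in tools.json.
    """
    params = {"sensor_id": sensor_id}
    if limit is not None:
        params["limit"] = limit
    if time_range is not None:
        params["time_range"] = time_range  # e.g. "1h", "24h"
    return json.dumps({"tool": "get_sensor_data", "parameters": params})

# Example: request the last 10 readings from sensor "temp1"
payload = build_get_sensor_data_request("temp1", limit=10)
```

Optional parameters are simply omitted from the payload rather than sent as nulls, which keeps the request compatible with schema validation on the server side.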
### Service Control

- `get_service_status`: Get the status of all Fledge services
- `start_stop_service`: Start or stop a Fledge service by type
- `update_config`: Update Fledge configuration parameters
### Frontend Code Generation

- `generate_ui_component`: Generate React components for Fledge data visualization
- `fetch_sample_frontend`: Get sample frontend templates for different frameworks
- `suggest_ui_improvements`: Get AI-powered suggestions for improving UI code
### Real-Time Data Streaming

- `subscribe_to_sensor`: Set up a subscription to sensor data updates
- `get_latest_reading`: Get the most recent reading from a specific sensor
### Debugging and Validation

- `validate_api_connection`: Check if the Fledge API is reachable
- `simulate_frontend_request`: Test API requests with different methods and payloads
### Documentation and Schema

- `get_api_schema`: Get information about available Fledge API endpoints
- `list_plugins`: List available Fledge plugins
### Advanced AI-Assisted Features

- `generate_mock_data`: Generate realistic mock sensor data for testing
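A minimal sketch of what mock sensor-data generation can look like, assuming a simple reading shape of sensor id, timestamp, and numeric value. The actual shape produced by `generate_mock_data` may differ; this only illustrates the idea of a random walk around a baseline.

```python
import random
from datetime import datetime, timedelta, timezone

def generate_mock_readings(sensor_id, count=10, base=20.0, jitter=2.0):
    """Generate `count` mock readings for a sensor, one minute apart,
    as a random walk around `base`. The reading shape is illustrative only."""
    now = datetime.now(timezone.utc)
    readings = []
    value = base
    for i in range(count):
        value += random.uniform(-jitter, jitter)
        readings.append({
            "sensor_id": sensor_id,
            "timestamp": (now - timedelta(minutes=count - i)).isoformat(),
            "reading": {"value": round(value, 2)},
        })
    return readings

sample = generate_mock_readings("temp1", count=5)
```

A random walk (each value offset from the previous one) tends to look more like real sensor drift than independent random samples, which is why it is used here.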
## Testing the API

You can test the server using the included test scripts.
## Security Options

The secure server (`secure_mcp_server.py`) adds API key authentication:

- On first run, it generates an API key stored in `api_key.txt`
- All requests must include this key in the X-API-Key header
- The health check endpoint remains accessible without authentication
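In client code, attaching the key might look like the following sketch, which builds a request carrying the X-API-Key header. Only the header name comes from this README; the URL is a placeholder.

```python
import urllib.request

def authed_request(url, api_key):
    """Build a urllib Request carrying the X-API-Key header expected by
    secure_mcp_server.py. Actually sending it is left to the caller."""
    req = urllib.request.Request(url)
    req.add_header("X-API-Key", api_key)
    return req

# URL is a placeholder; point it at your running server.
req = authed_request("http://localhost:8080/tools", "contents-of-api_key.txt")
```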
## Example API Requests
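The simplest request is the unauthenticated health check. A hedged sketch in Python, where the base URL and the `/health` path are assumptions to adjust for your deployment:

```python
import urllib.request
import urllib.error

def check_health(base_url="http://localhost:8080"):
    """Return True if the MCP server's health endpoint answers with 200.

    The default port and the /health path are assumptions for illustration;
    the README only promises a health endpoint, not its exact address.
    """
    try:
        with urllib.request.urlopen(base_url + "/health", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Returning `False` instead of raising keeps the check easy to use in startup scripts and monitoring loops.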
## Extending the Server

To add more tools:

1. Add the tool definition to `tools.json`
2. Implement the tool handler in `mcp_server.py` and `secure_mcp_server.py`
## Production Considerations

For production deployment:

- Use HTTPS
- Deploy behind a reverse proxy like Nginx
- Implement more robust authentication (JWT, OAuth)
- Add rate limiting
- Set up persistent data storage for subscriptions
## Deploying on Smithery.ai

The Fledge MCP Server can be deployed on Smithery.ai for enhanced scalability and availability. Follow these steps to deploy.

### Prerequisites

- Docker installed on your local machine
- A Smithery.ai account
- The Smithery CLI tool installed
### Build and Deploy

```shell
# Build the Docker image
docker build -t fledge-mcp .

# Deploy to Smithery.ai
smithery deploy
```

### Configuration

The `smithery.json` file contains the configuration for your deployment:

- WebSocket transport on port 8082
- Configurable Fledge API URL
- Tool definitions and parameters
- Timeout settings
### Environment Variables

Set the following environment variables in your Smithery.ai dashboard:

- `FLEDGE_API_URL`: Your Fledge API endpoint
- `API_KEY`: Your secure API key (if using secure mode)
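Inside the server, these variables would typically be read with sensible fallbacks. A sketch (the default URL mirrors the one in the `configSchema` later in this README; the exact variable handling in `mcp_server.py` may differ):

```python
import os

# Fall back to the local Fledge API when FLEDGE_API_URL is not set.
fledge_api_url = os.environ.get("FLEDGE_API_URL", "http://localhost:8081/fledge")

# API_KEY is optional: None means the server runs without secure mode.
api_key = os.environ.get("API_KEY")
```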
### Verification

After deployment, verify your server is running:

```shell
smithery status fledge-mcp
```

### Monitoring

Monitor your deployment through the Smithery.ai dashboard:

- Real-time logs
- Performance metrics
- Error tracking
- Resource usage
### Updating

To update your deployment:

```shell
# Build new image
docker build -t fledge-mcp .

# Deploy updates
smithery deploy --update
```
## JSON-RPC Protocol Support

The server implements the Model Context Protocol (MCP) using JSON-RPC 2.0 over WebSocket. The following methods are supported:

### initialize

Request:

```json
{ "jsonrpc": "2.0", "method": "initialize", "params": {}, "id": "1" }
```

Response:

```json
{
  "jsonrpc": "2.0",
  "result": {
    "serverInfo": {
      "name": "fledge-mcp",
      "version": "1.0.0",
      "description": "Fledge Model Context Protocol (MCP) Server",
      "vendor": "Fledge",
      "capabilities": {
        "tools": true,
        "streaming": true,
        "authentication": "api_key"
      }
    },
    "configSchema": {
      "type": "object",
      "properties": {
        "fledge_api_url": {
          "type": "string",
          "description": "Fledge API URL",
          "default": "http://localhost:8081/fledge"
        }
      }
    }
  },
  "id": "1"
}
```

### tools/list

Request:

```json
{ "jsonrpc": "2.0", "method": "tools/list", "params": {}, "id": "2" }
```

Response: returns the list of available tools and their parameters.

### tools/call

Request:

```json
{
  "jsonrpc": "2.0",
  "method": "tools/call",
  "params": {
    "name": "get_sensor_data",
    "parameters": { "sensor_id": "temp1", "limit": 10 }
  },
  "id": "3"
}
```
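The request envelopes above can be produced with a few lines of Python. This sketch assumes only the JSON-RPC 2.0 rules shown in this section (it is not code from `mcp_server.py`):

```python
import json
from itertools import count

_ids = count(1)  # auto-incrementing request ids, serialized as strings

def jsonrpc_request(method, params=None):
    """Build a JSON-RPC 2.0 request envelope as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params or {},
        "id": str(next(_ids)),
    })

# A tools/call request matching the example above.
msg = jsonrpc_request("tools/call", {
    "name": "get_sensor_data",
    "parameters": {"sensor_id": "temp1", "limit": 10},
})
```

Sending `msg` over the WebSocket connection (e.g. with a client library such as `websockets`) and matching the response `id` against the request `id` is all a minimal client needs.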
## Error Codes

The server follows standard JSON-RPC 2.0 error codes:

- `-32700`: Parse error
- `-32600`: Invalid Request
- `-32601`: Method not found
- `-32602`: Invalid params
- `-32000`: Server error
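On the server side, these codes map naturally onto a small helper for building error responses. A sketch following the JSON-RPC 2.0 error-object shape (not code taken from `mcp_server.py`):

```python
ERROR_MESSAGES = {
    -32700: "Parse error",
    -32600: "Invalid Request",
    -32601: "Method not found",
    -32602: "Invalid params",
    -32000: "Server error",
}

def jsonrpc_error(code, request_id=None, data=None):
    """Build a JSON-RPC 2.0 error response for one of the codes above."""
    error = {"code": code, "message": ERROR_MESSAGES.get(code, "Server error")}
    if data is not None:
        error["data"] = data  # optional extra detail for the client
    return {"jsonrpc": "2.0", "error": error, "id": request_id}

resp = jsonrpc_error(-32601, request_id="3")
```

Per the JSON-RPC 2.0 spec, the `id` is `null` when the request id could not be determined (e.g. after a parse error), which is why `request_id` defaults to `None` here.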