# fal.ai MCP Server
A Model Context Protocol (MCP) server for interacting with fal.ai models and services.
## Features

- List all available fal.ai models
- Search for specific models by keywords
- Get model schemas
- Generate content using any fal.ai model
- Support for both direct and queued model execution
- Queue management (status checking, getting results, cancelling requests)
- File upload to fal.ai CDN
## Requirements

- Python 3.10+
- fastmcp
- httpx
- aiofiles
- A fal.ai API key
## Installation
Clone this repository:
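
The repository URL and directory name below are placeholders for this project's actual values:

```bash
git clone <repository-url>
cd <repository-directory>
```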
Install the required packages:
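
For example, installing the dependencies listed under Requirements with pip (if the repository ships a requirements.txt, `pip install -r requirements.txt` works as well):

```bash
pip install fastmcp httpx aiofiles
```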
Set your fal.ai API key as an environment variable:
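
A minimal sketch, assuming the server reads the key from `FAL_KEY`, the variable name used by fal.ai's own clients (check the server source for the exact name it expects):

```bash
export FAL_KEY="your-fal-api-key"
```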
## Usage

### Running the Server
You can run the server in development mode with:
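
For example, with the fastmcp CLI's `dev` command; the entry-point filename `main.py` is an assumption, so substitute the repository's actual server script:

```bash
fastmcp dev main.py
```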
This will launch the MCP Inspector web interface where you can test the tools interactively.
### Installing in Claude Desktop
To use the server with Claude Desktop:
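
One way to do this is to use the fastmcp CLI's `install` command; the filename `main.py` is an assumption, and the exact syntax depends on your FastMCP version:

```bash
# Older FastMCP releases
fastmcp install main.py

# Newer FastMCP releases expect the target app as an argument
fastmcp install claude-desktop main.py
```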
This will make the server available to Claude in the Desktop app.
### Running Directly
You can also run the server directly:
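
A sketch assuming the entry point is `main.py` and that the script calls the server's `run()` method when executed directly; `fastmcp run main.py` is an alternative if it does not:

```bash
python main.py
```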
## API Reference

### Tools
- `models(page=None, total=None)` - List available models with optional pagination
- `search(keywords)` - Search for models by keywords
- `schema(model_id)` - Get OpenAPI schema for a specific model
- `generate(model, parameters, queue=False)` - Generate content using a model
- `result(url)` - Get result from a queued request
- `status(url)` - Check status of a queued request
- `cancel(url)` - Cancel a queued request
- `upload(path)` - Upload a file to fal.ai CDN
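
As an illustrative sketch of how these tools can be exercised programmatically, the snippet below uses the `Client` class shipped with FastMCP 2.x (an assumption about the installed version); the server filename `main.py`, the model id, and the prompt parameters are placeholders rather than values defined by this project:

```python
import asyncio

from fastmcp import Client


async def main() -> None:
    # Point the client at the server script; FastMCP infers a stdio transport
    # from the .py path. The filename is an assumption.
    async with Client("main.py") as client:
        # Discover the tools listed above.
        tools = await client.list_tools()
        print("tools:", [tool.name for tool in tools])

        # Call the `generate` tool. The model id and parameters are
        # illustrative placeholders, not defaults defined by this server.
        result = await client.call_tool(
            "generate",
            {
                "model": "fal-ai/flux/dev",
                "parameters": {"prompt": "a watercolor fox"},
                "queue": False,
            },
        )
        print(result)


asyncio.run(main())
```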