mcp-flowise

by matthewhand

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| FLOWISE_API_KEY | Yes | Your Flowise API Bearer token. | |
| FLOWISE_CHATFLOW_ID | No | Single Chatflow ID (optional). | |
| FLOWISE_API_ENDPOINT | Yes | Base URL for the Flowise API. | http://localhost:3000 |
| FLOWISE_ASSISTANT_ID | No | Single Assistant ID (optional). | |
| FLOWISE_CHATFLOW_BLACKLIST | No | Comma-separated list of denied chatflow IDs (optional). | |
| FLOWISE_CHATFLOW_WHITELIST | No | Comma-separated list of allowed chatflow IDs (optional). | |
| FLOWISE_CHATFLOW_DESCRIPTIONS | No | Comma-separated list of 'chatflow_id:Description' pairs for LowLevel mode. | |
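
A typical environment setup might look like the following. All values here are placeholders; substitute your own Flowise credentials and chatflow IDs.

```shell
# Required: authentication and endpoint (placeholder values).
export FLOWISE_API_KEY="your-flowise-bearer-token"
export FLOWISE_API_ENDPOINT="http://localhost:3000"

# Optional: restrict which chatflows the server exposes.
export FLOWISE_CHATFLOW_WHITELIST="chatflow-id-1,chatflow-id-2"
```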

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

list_chatflows

List all available chatflows from the Flowise API. Respects optional whitelisting or blacklisting when configured via FLOWISE_CHATFLOW_WHITELIST or FLOWISE_CHATFLOW_BLACKLIST.

Returns: str: a JSON-encoded string of the filtered chatflows.
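
The whitelist/blacklist behavior can be sketched as follows. This is an illustrative assumption about the filtering logic, not the server's actual code; the helper name and the chatflow dict shape are invented for the example.

```python
import os

def filter_chatflows(chatflows):
    """Filter chatflows using the optional FLOWISE_CHATFLOW_WHITELIST and
    FLOWISE_CHATFLOW_BLACKLIST environment variables (illustrative sketch)."""
    whitelist = {s.strip() for s in
                 os.environ.get("FLOWISE_CHATFLOW_WHITELIST", "").split(",") if s.strip()}
    blacklist = {s.strip() for s in
                 os.environ.get("FLOWISE_CHATFLOW_BLACKLIST", "").split(",") if s.strip()}
    result = []
    for flow in chatflows:  # each flow is assumed to be a dict with an "id" key
        if whitelist and flow["id"] not in whitelist:
            continue  # when a whitelist is set, only listed IDs pass
        if flow["id"] in blacklist:
            continue  # blacklisted IDs are always removed
        result.append(flow)
    return result
```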
create_prediction

Create a prediction by sending a question to a specific chatflow or assistant.

Args:
  chatflow_id (str, optional): The ID of the chatflow to use. Defaults to FLOWISE_CHATFLOW_ID.
  question (str): The question or prompt to send to the chatflow.

Returns: str: the raw JSON response from the Flowise API, or an error message if something goes wrong.
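
Under the hood, a prediction call is an HTTP POST to Flowise's prediction endpoint (`/api/v1/prediction/{chatflow_id}`) with a Bearer token and a JSON body containing the question. The sketch below builds such a request with only the standard library; the helper name is invented for illustration, and the commented-out send requires a running Flowise instance.

```python
import json
import urllib.request

def build_prediction_request(endpoint, api_key, chatflow_id, question):
    """Build the HTTP request for Flowise's prediction endpoint
    (illustrative sketch; not the server's actual code)."""
    url = f"{endpoint.rstrip('/')}/api/v1/prediction/{chatflow_id}"
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Sending the request (needs a reachable Flowise instance):
# with urllib.request.urlopen(build_prediction_request(
#         "http://localhost:3000", "MY_KEY", "my-chatflow-id",
#         "What is MCP?")) as resp:
#     print(resp.read().decode())
```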

MCP directory API

All information about MCP servers is available via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/matthewhand/mcp-flowise'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.