AutoGen MCP Server

Server Configuration

Describes the environment variables used to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| OPENAI_API_KEY | No | OpenAI API Key (optional, can also be set in config.json) | |
| AUTOGEN_MCP_CONFIG | Yes | Path to the configuration file | config.json |
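For orientation, here is a minimal sketch of launching the server over stdio with these variables set. The entry point used below (python -m autogen_mcp) is an assumption for illustration; use the launch command documented by the server itself.

```python
import os
import subprocess

# Minimal sketch: launch the AutoGen MCP server as a child process over stdio.
env = dict(os.environ)
env["AUTOGEN_MCP_CONFIG"] = "config.json"   # required: path to the configuration file
env.setdefault("OPENAI_API_KEY", "")        # optional: may instead be set in config.json

server = subprocess.Popen(
    ["python", "-m", "autogen_mcp"],        # hypothetical entry point
    env=env,
    stdin=subprocess.PIPE,                  # MCP traffic flows over stdin/stdout
    stdout=subprocess.PIPE,
)
```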

Schema

Prompts

Interactive templates invoked by user choice

| Name | Description |
| --- | --- |
| autogen-workflow | Create a sophisticated multi-agent AutoGen workflow |
| code-review | Set up agents for comprehensive collaborative code review |
| research-analysis | Create advanced research and analysis workflow |
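As a sketch of how a client invokes one of these prompts, the MCP prompts/get request below asks for the autogen-workflow template. The argument names used here (task, agent_count) are illustrative assumptions; the server defines the actual prompt arguments.

```python
import json

# Sketch of the JSON-RPC message an MCP client sends to fetch a prompt.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "prompts/get",
    "params": {
        "name": "autogen-workflow",
        # Hypothetical arguments; check the prompt's declared arguments.
        "arguments": {"task": "Summarize recent ML papers", "agent_count": "3"},
    },
}
print(json.dumps(request, indent=2))
```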

Resources

Contextual data attached and managed by the client

| Name | Description |
| --- | --- |
| Active Agents | List of currently active AutoGen agents |
| Workflow Templates | Available workflow templates |
| Chat History | Recent agent conversation history |
| Current Configuration | Current AutoGen configuration settings |
| Progress Status | Real-time progress of running tasks |
| Performance Metrics | Server performance statistics |
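A sketch of reading one of these resources with an MCP resources/read request. The URI shown (autogen://agents/active) is a guess for illustration; clients should use the URIs returned by resources/list.

```python
import json

# Sketch of a resources/read request for the "Active Agents" resource.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "autogen://agents/active"},  # hypothetical URI
}
print(json.dumps(request, indent=2))
```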

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| create_streaming_workflow | Create a workflow with real-time streaming |
| start_streaming_chat | Start a streaming chat session |
| create_agent | Create a new AutoGen agent |
| execute_workflow | Execute a workflow with streaming support |
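A sketch of calling one of these tools with an MCP tools/call request, here create_agent. The argument names (name, type, system_message) are assumptions about the tool's input schema; consult the schema returned by tools/list for the authoritative fields.

```python
import json

# Sketch of a tools/call request for the create_agent tool.
request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "create_agent",
        # Hypothetical arguments; the tool's input schema is authoritative.
        "arguments": {
            "name": "researcher",
            "type": "assistant",
            "system_message": "You research topics and summarize findings.",
        },
    },
}
print(json.dumps(request, indent=2))
```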

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DynamicEndpoints/Autogen_MCP'
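The same request in Python, using only the standard library:

```python
import json
from urllib.request import urlopen

# Fetch the directory entry for this server (equivalent to the curl command above).
url = "https://glama.ai/api/mcp/v1/servers/DynamicEndpoints/Autogen_MCP"
with urlopen(url) as resp:
    data = json.load(resp)
print(json.dumps(data, indent=2))
```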

If you have feedback or need assistance with the MCP directory API, please join our Discord server.