
Graphiti MCP Server

by dreamnear

Server Configuration

Describes the environment variables used to configure the server.

Name | Required | Description | Default
NEO4J_URI | Yes | URI for the Neo4j database | bolt://localhost:7687
MODEL_NAME | No | OpenAI model name to use for LLM operations | gpt-4.1-mini
NEO4J_USER | Yes | Neo4j username | neo4j
NEO4J_PASSWORD | Yes | Neo4j password | demodemo
OPENAI_API_KEY | No | OpenAI API key (required for LLM operations) | (none)
LLM_TEMPERATURE | No | Temperature for LLM responses (0.0-2.0) | 0.0
MCP_SERVER_HOST | No | Host to bind the server to | 127.0.0.1
MCP_SERVER_PORT | No | Port to bind the server to | 8000
OPENAI_BASE_URL | No | Optional base URL for OpenAI API | (none)
SEMAPHORE_LIMIT | No | Episode processing concurrency | 10
SMALL_MODEL_NAME | No | OpenAI model name to use for smaller LLM operations | gpt-4.1-nano
AZURE_OPENAI_ENDPOINT | No | Optional Azure OpenAI LLM endpoint URL | (none)
AZURE_OPENAI_API_VERSION | No | Optional Azure OpenAI LLM API version | (none)
AZURE_OPENAI_DEPLOYMENT_NAME | No | Optional Azure OpenAI LLM deployment name | (none)
AZURE_OPENAI_EMBEDDING_API_KEY | No | Optional Azure OpenAI Embedding deployment key | (none)
AZURE_OPENAI_EMBEDDING_ENDPOINT | No | Optional Azure OpenAI Embedding endpoint URL | (none)
AZURE_OPENAI_USE_MANAGED_IDENTITY | No | Optionally use Azure Managed Identities for authentication | (none)
AZURE_OPENAI_EMBEDDING_API_VERSION | No | Optional Azure OpenAI Embedding API version | (none)
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME | No | Optional Azure OpenAI embedding deployment name | (none)
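
For a quick local run, the required variables can be exported in the shell before starting the server. This is a minimal sketch only: the values shown are the defaults from the table above, the API key is a placeholder, and the launch command is an assumption rather than something documented on this page.

export NEO4J_URI="bolt://localhost:7687"   # default
export NEO4J_USER="neo4j"                  # default
export NEO4J_PASSWORD="demodemo"           # default; change for any real deployment
export OPENAI_API_KEY="sk-your-key"        # placeholder; needed for LLM operations
export MODEL_NAME="gpt-4.1-mini"           # optional, matches the default
export MCP_SERVER_HOST="127.0.0.1"         # optional bind host
export MCP_SERVER_PORT="8000"              # optional bind port

# Launch command is an assumption; check the Graphiti MCP Server README for the actual entry point:
# uv run graphiti_mcp_server.py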

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/dreamnear/graphiti-mcp'
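
If jq is installed, the same request can be piped through it to pretty-print the JSON response; the response fields themselves are not documented on this page, so this only formats whatever the endpoint returns.

curl -s 'https://glama.ai/api/mcp/v1/servers/dreamnear/graphiti-mcp' | jq .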

If you have feedback or need assistance with the MCP directory API, please join our Discord server.