
Graphiti MCP Server

by dreamnear


Chinese Version

This is a standalone Model Context Protocol (MCP) server implementation for Graphiti, designed to run as an independent service with enhanced features.

Source Repository

This project is based on the official Graphiti project. The original Graphiti framework provides the core functionality for building and querying temporally-aware knowledge graphs.

This standalone edition maintains compatibility with the original Graphiti while adding enhanced features and improved performance through FastMCP refactoring.

Key Differences from Official Graphiti MCP

This standalone edition differs from the official Graphiti MCP implementation in the following ways:

  1. Client-defined Group ID: Unlike the official version, this implementation lets clients define their own group_id for better data organization and isolation (see the sketch after this list).
  2. FastMCP Refactoring: The server has been refactored using FastMCP framework for improved performance and maintainability.
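For illustration, here is what a group-scoped tool call's argument payload might look like. This is a minimal sketch: the argument names follow the group-scoped tools listed under Available Tools, but the exact schema is an assumption, not a verified contract.

# Hypothetical argument payload for the add_memory tool, showing a
# client-defined group_id. Argument names are assumed from the tool
# descriptions in this README, not verified against the server schema.
episode_args = {
    "name": "Sprint Notes",
    "episode_body": "Alice moved the deploy to Friday.",
    "source": "text",
    "source_description": "team chat",
    "group_id": "team_alpha",  # namespace chosen by the client rather than pinned by the server
}

The same namespace can also be selected per connection via the group_id query parameter shown in the HTTP and SSE client configurations below.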

Features

The Graphiti MCP server exposes the following key high-level functions of Graphiti:

  • Episode Management: Add, retrieve, and delete episodes (text, messages, or JSON data)
  • Entity Management: Search and manage entity nodes and relationships in the knowledge graph
  • Search Capabilities: Search for facts (edges) and node summaries using semantic and hybrid search
  • Group Management: Organize and manage groups of related data with group_id filtering
  • Graph Maintenance: Clear the graph and rebuild indices

Quick Start

Installation

  1. Ensure you have Python 3.10 or higher installed.
  2. Install the package from source:
    git clone git@github.com:dreamnear/graphiti-mcp.git
    cd graphiti-mcp
    pip install -e .

Prerequisites

  1. A running Neo4j database, version 5.26 or later (a connectivity check follows this list)
  2. An OpenAI API key for LLM operations (optional, but required for entity extraction)
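Before moving on, it can help to confirm that Neo4j is actually reachable. A minimal sketch using the official neo4j Python driver; the URI and credentials are placeholders for your own values:

# Preflight check that Neo4j is up before launching the MCP server.
# Requires the driver: pip install neo4j
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "your_password_here"))
driver.verify_connectivity()  # raises an exception if the database is unreachable
print("Neo4j is reachable")
driver.close()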

Setup

  1. Copy the provided .env.example file to create a .env file:
    cp .env.example .env
  2. Edit the .env file to set your configuration:
    # Required Neo4j configuration
    NEO4J_URI=bolt://localhost:7687
    NEO4J_USER=neo4j
    NEO4J_PASSWORD=your_password_here

    # Optional OpenAI API key for LLM operations
    OPENAI_API_KEY=your_openai_api_key_here
    MODEL_NAME=gpt-4.1-mini
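To sanity-check the file before launching, here is a small sketch using python-dotenv (assumed to be installed separately; the variable names come from the Configuration list below):

# Hypothetical preflight: load .env and fail fast on missing required settings.
# Requires: pip install python-dotenv
import os

from dotenv import load_dotenv

load_dotenv()
for var in ("NEO4J_URI", "NEO4J_USER", "NEO4J_PASSWORD"):
    if not os.getenv(var):
        raise SystemExit(f"Missing required setting: {var}")
print("Required Neo4j settings present")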

Running the Server

Direct Execution

To run the Graphiti MCP server directly:

graphiti-mcp-server

Or with options:

graphiti-mcp-server --model gpt-4.1-mini --transport sse --group-id my_project

Using uv

If you prefer to use uv for package management:

# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies
uv sync

# Run the server
uv run graphiti-mcp-server

Docker Deployment

The Graphiti MCP server can be deployed using Docker:

docker build -t graphiti-mcp-server .
docker run -p 8000:8000 --env-file .env graphiti-mcp-server

Or using Docker Compose (includes Neo4j):

docker-compose up

Configuration

The server uses the following environment variables:

  • NEO4J_URI: URI for the Neo4j database (default: bolt://localhost:7687)
  • NEO4J_USER: Neo4j username (default: neo4j)
  • NEO4J_PASSWORD: Neo4j password (default: demodemo)
  • OPENAI_API_KEY: OpenAI API key (required for LLM operations)
  • OPENAI_BASE_URL: Optional base URL for OpenAI API
  • MODEL_NAME: OpenAI model name to use for LLM operations (default: gpt-4.1-mini)
  • SMALL_MODEL_NAME: OpenAI model name to use for smaller LLM operations (default: gpt-4.1-nano)
  • LLM_TEMPERATURE: Temperature for LLM responses (0.0-2.0, default: 0.0)
  • AZURE_OPENAI_ENDPOINT: Optional Azure OpenAI LLM endpoint URL
  • AZURE_OPENAI_DEPLOYMENT_NAME: Optional Azure OpenAI LLM deployment name
  • AZURE_OPENAI_API_VERSION: Optional Azure OpenAI LLM API version
  • AZURE_OPENAI_EMBEDDING_API_KEY: Optional Azure OpenAI Embedding deployment key
  • AZURE_OPENAI_EMBEDDING_ENDPOINT: Optional Azure OpenAI Embedding endpoint URL
  • AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME: Optional Azure OpenAI embedding deployment name
  • AZURE_OPENAI_EMBEDDING_API_VERSION: Optional Azure OpenAI Embedding API version
  • AZURE_OPENAI_USE_MANAGED_IDENTITY: Optional; use Azure Managed Identities for authentication
  • SEMAPHORE_LIMIT: Maximum number of episodes processed concurrently (default: 10; see the sketch after this list)
  • MCP_SERVER_HOST: Host to bind the server to (default: 127.0.0.1)
  • MCP_SERVER_PORT: Port to bind the server to (default: 8000)
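On the SEMAPHORE_LIMIT sketch referenced above: this README does not show the server's internals, but a concurrency cap like this conventionally maps to the standard asyncio semaphore pattern, roughly as follows (illustrative only, not the server's actual code):

# Illustrative asyncio pattern behind a concurrency cap such as SEMAPHORE_LIMIT.
import asyncio

semaphore = asyncio.Semaphore(10)  # SEMAPHORE_LIMIT

async def process_episode(episode: str) -> None:
    async with semaphore:  # at most 10 episodes in flight at once
        await asyncio.sleep(0.1)  # stand-in for real LLM and graph work
        print(f"processed {episode}")

async def main() -> None:
    await asyncio.gather(*(process_episode(f"ep-{i}") for i in range(25)))

asyncio.run(main())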

Available Arguments

  • --transport: Choose the transport method (stdio, http, or sse, default: stdio)
  • --model: Overrides the MODEL_NAME environment variable
  • --small-model: Overrides the SMALL_MODEL_NAME environment variable
  • --temperature: Overrides the LLM_TEMPERATURE environment variable
  • --group-id: Set a namespace for the graph (default: "default")
  • --destroy-graph: If set, destroys all Graphiti graphs on startup
  • --use-custom-entities: Enable entity extraction using the predefined ENTITY_TYPES
  • --host: Host to bind the MCP server to (default: 127.0.0.1)
  • --port: Port to bind the MCP server to (default: 8000)
  • --path: Path for transport endpoint (default: /mcp for HTTP, /sse for SSE)

Integrating with MCP Clients

STDIO Transport (for Claude Desktop, etc.)

{ "mcpServers": { "graphiti-memory": { "transport": "stdio", "command": "graphiti-mcp-server", "args": ["--transport", "stdio"], "env": { "NEO4J_URI": "bolt://localhost:7687", "NEO4J_USER": "neo4j", "NEO4J_PASSWORD": "your_password", "OPENAI_API_KEY": "your_api_key" } } } }

HTTP Transport (for general HTTP clients)

{ "mcpServers": { "graphiti-memory": { "type": "http", "url": "http://localhost:8000/mcp/?group_id=default" } } }

SSE Transport (for Cursor, etc.)

{ "mcpServers": { "graphiti-memory": { "transport": "sse", "url": "http://localhost:8000/sse?group_id=my_project" } } }

Available Tools

The Graphiti MCP server exposes the following tools:

  • add_memory: Add an episode to the knowledge graph (supports text, JSON, and message formats)
  • search_memory_nodes: Search the knowledge graph for relevant node summaries (see the example after this list)
  • search_memory_facts: Search the knowledge graph for relevant facts (edges between entities)
  • delete_entity_edge: Delete an entity edge from the knowledge graph
  • delete_episode: Delete an episode from the knowledge graph
  • get_entity_edge: Get an entity edge by its UUID
  • get_episodes: Get the most recent episodes for a specific group
  • clear_graph: Clear all data from the knowledge graph and rebuild indices
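For example, a node search scoped to a group might look like the following (written in the same call style as the JSON example below; the argument names are assumptions based on the tool descriptions, not a verified schema):

search_memory_nodes(
    query="products offered by Acme Technologies",
    group_ids=["my_project"],
    max_nodes=5
)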

Working with JSON Data

The Graphiti MCP server can process structured JSON data through the add_memory tool with source="json":

add_memory(
    name="Customer Profile",
    episode_body='{"company": {"name": "Acme Technologies"}, "products": [{"id": "P001", "name": "CloudSync"}, {"id": "P002", "name": "DataMiner"}]}',
    source="json",
    source_description="CRM data"
)

Requirements

  • Python 3.10 or higher
  • Neo4j database (version 5.26 or later required)
  • OpenAI API key (for LLM operations and embeddings)