MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2); a quick reachability check is sketched below
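
To confirm Ollama is running and has at least one model before wiring up the server, you can query Ollama's local HTTP API. A minimal sketch in Python, assuming the default endpoint http://localhost:11434:

import json
import urllib.request

# Ollama's GET /api/tags endpoint lists locally downloaded models.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = json.load(resp).get("models", [])

if models:
    print("Available models:", [m["name"] for m in models])
else:
    print("No models found - run `ollama pull llama2` first.")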

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{ "mcpServers": { "ollama": { "command": "uvx", "args": [ "mcp-ollama" ] } } }

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides three main tools (a client-side usage sketch follows the list):

  • list_models - List all downloaded Ollama models
  • show_model - Get detailed information about a specific model
  • ask_model - Ask a question to a specified model
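
As a rough illustration, the tools can be exercised from the MCP Python SDK over stdio. This is a sketch, not taken from this server's docs: it assumes `pip install mcp`, and the ask_model argument names (model, question) are assumptions; check the list_tools output for the real schema.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server the same way Claude Desktop does.
    params = StdioServerParameters(command="uvx", args=["mcp-ollama"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print("Tools:", [t.name for t in tools.tools])
            # Hypothetical argument names; verify against the tool schema.
            result = await session.call_tool(
                "ask_model",
                arguments={"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)

asyncio.run(main())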

License

MIT


Local-only server

The server can only run on the client's local machine because it depends on a local Ollama installation.


Related MCP Servers

  • An interactive chat interface that combines Ollama's LLM capabilities with PostgreSQL database access through the Model Context Protocol (MCP). Ask questions about your data in natural language and get AI-powered responses backed by real SQL queries. (TypeScript)
  • Enables seamless integration between Ollama's local LLM models and MCP-compatible applications, supporting model management and chat interactions. (TypeScript, AGPL 3.0)
  • An MCP server that queries multiple Ollama models and combines their responses, providing diverse AI perspectives on a single question for more comprehensive answers. (TypeScript, MIT License)
  • A server that enables seamless integration between local Ollama LLM instances and MCP-compatible applications, providing advanced task decomposition, evaluation, and workflow management capabilities. (Python)

View all related MCP servers

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'
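
The same record can be fetched from any HTTP client; for example, a minimal Python sketch (the response is JSON, but its schema is not documented on this page):

import json
import urllib.request

# Fetch this server's directory record from the Glama MCP API.
url = "https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama"
with urllib.request.urlopen(url) as resp:
    record = json.load(resp)

print(json.dumps(record, indent=2))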

If you have feedback or need assistance with the MCP directory API, please join our Discord server.