
MCP Ollama Server

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2)
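Before configuring the client, it can help to confirm that the Ollama daemon is reachable and that at least one model is pulled. A minimal sketch, assuming Ollama's default local endpoint on port 11434 and its standard `/api/tags` route (both names are Ollama's, not this server's):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def model_names(tags_response: dict) -> list:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in tags_response.get("models", [])]


def list_local_models(base_url: str = OLLAMA_URL) -> list:
    """Ask the running Ollama daemon which models it has pulled."""
    with urllib.request.urlopen(base_url + "/api/tags") as resp:
        return model_names(json.load(resp))
```

If `list_local_models()` raises a connection error, Ollama is not running; if it returns an empty list, no model has been pulled yet.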

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
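If the config file already has other entries under "mcpServers", the "ollama" entry has to be merged in rather than overwriting the file. A small sketch (the `add_ollama_server` helper is hypothetical, not part of this project):

```python
import json


def add_ollama_server(config: dict) -> dict:
    """Merge the 'ollama' server entry into an existing
    Claude Desktop config dict, preserving other servers."""
    servers = config.setdefault("mcpServers", {})
    servers["ollama"] = {"command": "uvx", "args": ["mcp-ollama"]}
    return config


# Example: starting from an empty config produces exactly the JSON above.
print(json.dumps(add_ollama_server({}), indent=2))
```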

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with the MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides the following tools:

  • list_models - list all downloaded Ollama models
  • show_model - get detailed information about a specific model
  • ask_model - ask a specified model a question
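Under the hood, a tool like ask_model can be implemented against Ollama's local REST API. A minimal sketch, assuming the standard non-streaming `/api/generate` endpoint (this is Ollama's documented API, but the function names here are illustrative, not this server's actual code):

```python
import json
import urllib.request


def build_generate_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_model(model: str, prompt: str,
              base_url: str = "http://localhost:11434") -> str:
    """Send a prompt to a local Ollama model and return its full answer."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        base_url + "/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

With `stream` set to `False`, Ollama returns one JSON object whose "response" field holds the complete generation, which keeps the tool's return value a plain string.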

License

MIT


Local-only server: it can only run on the client's local machine because it depends on local resources (a running Ollama instance).

The MCP Ollama server integrates Ollama models with MCP clients, letting users list models, get model details, and interact with models by asking questions.



MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.
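The curl call above has a straightforward Python equivalent; only the base URL and server id come from the source, and the helper names here are illustrative:

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1"


def server_info_url(server_id: str) -> str:
    """URL of a server's entry in the Glama MCP directory API."""
    return f"{API_BASE}/servers/{server_id}"


def fetch_server_info(server_id: str) -> dict:
    """Fetch a directory entry as a dict (same request as the curl above)."""
    with urllib.request.urlopen(server_info_url(server_id)) as resp:
        return json.load(resp)
```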