MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
Requirements
Python 3.10 or higher
Ollama installed and running ( https://ollama.com/download )
At least one model pulled with Ollama (e.g., ollama pull llama2)
Configure Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
Development
Install in development mode:
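A typical sketch for an editable install from a local checkout, assuming a standard Python project layout:

```shell
# From the root of a local checkout of the repository,
# install the package in editable (development) mode.
pip install -e .
```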
Test with the MCP Inspector:
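One way to do this is to point the MCP Inspector at the server's entry point; "mcp-ollama" as the command name is an assumption:

```shell
# Launch the MCP Inspector wrapping the local server process.
npx @modelcontextprotocol/inspector uv run mcp-ollama
```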
Features
The server provides three main tools:
list_models - List all downloaded Ollama models
show_model - Get detailed information about a specific model
ask_model - Ask a question of a specified model
License
MIT