
Vectorize MCP Server

A Model Context Protocol (MCP) server implementation that integrates with Vectorize for advanced vector retrieval and text extraction.

Installation

Running with npx

export VECTORIZE_ORG_ID=YOUR_ORG_ID
export VECTORIZE_TOKEN=YOUR_TOKEN
export VECTORIZE_PIPELINE_ID=YOUR_PIPELINE_ID
npx -y @vectorize-io/vectorize-mcp-server@latest


Configuration for Claude/Windsurf/Cursor/Cline

{ "mcpServers": { "vectorize": { "command": "npx", "args": ["-y", "@vectorize-io/vectorize-mcp-server@latest"], "env": { "VECTORIZE_ORG_ID": "your-org-id", "VECTORIZE_TOKEN": "your-token", "VECTORIZE_PIPELINE_ID": "your-pipeline-id" } } } }

Tools

Retrieve documents

Perform a vector search and retrieve documents (see the official API):

{ "name": "retrieve", "arguments": { "question": "Financial health of the company", "k": 5 } }

Text extraction and chunking (any file to Markdown)

Extract text from a document and chunk it into Markdown format (see the official API):

{ "name": "extract", "arguments": { "base64document": "base64-encoded-document", "contentType": "application/pdf" } }

Deep research

Generate a private deep research report from your pipeline (see the official API):

{ "name": "deep-research", "arguments": { "query": "Generate a financial status report about the company", "webSearch": true } }

Development

npm install
npm run dev

Contributing

  1. Fork the repository

  2. Create your feature branch

  3. Submit a pull request

