Why this server?
This server provides RAG capabilities for semantic document search using the Qdrant vector database and Ollama or OpenAI embeddings.
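A minimal sketch of the retrieve-then-augment flow this kind of server implements: embed the query, rank stored documents by similarity, and prepend the best matches to the prompt. The `embed` function below is a toy stand-in (character-frequency vectors) for a real Ollama/OpenAI embedding call, and all names are illustrative, not the server's actual API.

```python
def embed(text):
    # Toy embedding: 26-dim character-frequency vector.
    # A real server would call an embedding model instead.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def similarity(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

corpus = [
    "Qdrant stores vectors in named collections.",
    "Ollama can serve local embedding models.",
    "MCP exposes tools to AI agents.",
]

def rag_prompt(question, top_k=1):
    # Retrieve the top_k most similar documents and build an
    # augmented prompt for the downstream language model.
    q = embed(question)
    ranked = sorted(corpus, key=lambda d: similarity(q, embed(d)), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {question}"

print(rag_prompt("How does Qdrant store vectors?"))
```

In a real deployment, the corpus lives in a Qdrant collection and the retrieval step is a nearest-neighbor query against it; the prompt-assembly step is the same.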
Why this server?
This server enables AI agents to perform Retrieval-Augmented Generation by querying a FAISS vector database containing Sui Move language documents.
Why this server?
This server exposes vector database capabilities through Chroma, enabling semantic document search, metadata filtering, and document management with persistent storage.
Why this server?
This server provides data retrieval capabilities powered by the Chroma embedding database, enabling AI models to create collections over generated data and user inputs, and to retrieve that data using vector search, full-text search, and metadata filtering.
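To make "vector search plus metadata filtering" concrete, here is a self-contained sketch of the query pattern such a server supports: documents first pass a metadata filter, then the survivors are ranked by embedding similarity. The collection layout, `query` signature, and `where` filter here are illustrative assumptions, not Chroma's actual API.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Toy collection: each document carries an embedding and metadata.
docs = [
    {"id": "d1", "embedding": [1.0, 0.0], "meta": {"source": "user"}},
    {"id": "d2", "embedding": [0.9, 0.1], "meta": {"source": "generated"}},
    {"id": "d3", "embedding": [0.0, 1.0], "meta": {"source": "user"}},
]

def query(embedding, where=None, top_k=2):
    # Apply the metadata filter first, then rank by similarity.
    hits = [
        d for d in docs
        if not where or all(d["meta"].get(k) == v for k, v in where.items())
    ]
    hits.sort(key=lambda d: cosine(embedding, d["embedding"]), reverse=True)
    return [d["id"] for d in hits[:top_k]]

print(query([1.0, 0.0], where={"source": "user"}))  # → ['d1', 'd3']
```

The filter-then-rank order shown here mirrors how vector databases typically combine structured `where` clauses with nearest-neighbor search.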
Why this server?
This is an example of how to create an MCP server for Qdrant, a vector search engine.
Why this server?
A Model Context Protocol (MCP) server for semantic search and memory mining based on PubTator3, providing convenient access through the MCP interface.