# 💭 SPARQL MCP server
A Model Context Protocol (MCP) server to help users write SPARQL queries for open-access SPARQL endpoints, developed for the SIB Expasy portal.
The server will automatically index metadata present in the SPARQL endpoints listed in a JSON config file, such as:

- Endpoint schemas described using the Vocabulary of Interlinked Datasets (VoID), which can be automatically generated using the void-generator.
## 🧩 Endpoints
The HTTP API comprises 2 main endpoints:

- `/mcp`: MCP server that searches for relevant data to answer a user question using the EOSC Data Commons search API
  - Uses `rmcp` with Streamable HTTP transport
  - 🧰 Available tools:
    - `access_sparql_resources`: retrieve relevant information about the resources to help build a SPARQL query to answer the question (query examples, classes schema)
    - `get_resources_info`: retrieve relevant information about the SPARQL endpoint resources themselves (e.g. description, list of available endpoints)
    - `execute_sparql`: execute a SPARQL query against a given endpoint
- `/chat`: optional HTTP POST endpoint (JSON) to query the MCP server via an LLM provider
## 🚀 Use
Use it through the `sparql-mcp` package on pip:
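```sh
pip install sparql-mcp
```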
Or download the binary corresponding to your architecture from the releases page.
## 🛠️ Development
Requirements:

- Protobuf installed (e.g. `brew install protobuf`)
- API key for an LLM provider: Mistral.ai or OpenAI. You can use the free tier, you just need to log in.
- Recommended VSCode extension: rust-analyzer
### 📥 Install dev dependencies
Create a `.cargo/config.toml` file with your Mistral API key or OpenAI API key:
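A minimal sketch, assuming the server reads the standard `MISTRAL_API_KEY` / `OPENAI_API_KEY` environment variables (the exact variable names are an assumption):

```toml
# .cargo/config.toml — variables set here are passed to `cargo run`
[env]
# NOTE: the variable names are assumptions, adapt to what the server expects
MISTRAL_API_KEY = "your-mistral-api-key"
# Or, for OpenAI:
# OPENAI_API_KEY = "your-openai-api-key"
```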
### ⚡️ Start dev server
Start the MCP server in dev at http://localhost:8000/mcp, with the OpenAPI UI at http://localhost:8000/docs:
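For a standard Rust setup:

```sh
cargo run
```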
Customize the server configuration through CLI arguments. For example, provide a custom list of SPARQL endpoints through a `.json` config file. Example `sparql-mcp.json`:
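A minimal sketch of what this file could contain; the field names below are assumptions, not the server's actual schema:

```json
{
  "endpoints": [
    {
      "label": "UniProt",
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "description": "Protein sequence and annotation knowledgebase"
    }
  ]
}
```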
Run and reload on code changes:
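One way to do this, assuming `cargo-watch` is installed (`cargo install cargo-watch`):

```sh
cargo watch -x run
```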
Example curl request:
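A sketch, assuming the `/chat` endpoint accepts an OpenAI-style messages payload and reads the API key from a Bearer token (both details are assumptions):

```sh
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $SEARCH_API_KEY" \
  -d '{"messages": [{"role": "user", "content": "Which SPARQL endpoints are available?"}]}'
```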
Recommended model per supported provider:

- `openai/gpt-4.1`
- `mistralai/mistral-large-latest`
- `groq/moonshotai/kimi-k2-instruct`
### 🔌 Connect MCP client
Follow the instructions of your client, and use the /mcp URL of your deployed server (e.g. http://localhost:8000/mcp)
#### 🐙 VSCode GitHub Copilot
Add a new MCP server through the VSCode UI:

1. Open the Command Palette (`ctrl+shift+p` or `cmd+shift+p`)
2. Search for `MCP: Add Server...`
3. Choose `HTTP`, and provide the MCP server URL http://localhost:8000/mcp
Your VSCode `mcp.json` should look like:
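For example (the `sparql-mcp` key is an arbitrary name):

```json
{
  "servers": {
    "sparql-mcp": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
```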
### 📦 Build for production
Build the binary in `target/release/`:
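```sh
cargo build --release
```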
Start the server (adjust the flags at your convenience):
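The binary name below is an assumption (it should match the crate name); run it with `--help` to list the available flags:

```sh
./target/release/sparql-mcp
```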
Or start it using the python wheel:
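Assuming the wheel has been built (see the section below) and exposes a `sparql-mcp` console entry point:

```sh
pip install target/wheels/*.whl
sparql-mcp
```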
### 🐍 Build python package
Requires `uv` installed.
Bundle the CLI as a python package in `target/wheels/`:
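One way to do this, assuming maturin is used to build the wheel (it writes to `target/wheels/` by default):

```sh
uvx maturin build --release
```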
### 🐳 Deploy with Docker
Create a `keys.env` file with the API keys:
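For example (apart from `SEARCH_API_KEY`, the variable names are assumptions):

```
MISTRAL_API_KEY=your-mistral-api-key
SEARCH_API_KEY=your-secret-api-key
```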
`SEARCH_API_KEY` can be used to add a layer of protection against bots that might spam the LLM; if it is not provided, no API key is needed to query the API.
Build and deploy the service:
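Assuming a Docker Compose file is present in the repository:

```sh
docker compose up -d --build
```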
### 🧼 Format & lint
Automatically format the codebase using rustfmt:
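```sh
cargo fmt
```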
Lint with clippy:
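```sh
cargo clippy
```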
Automatically apply possible fixes:
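```sh
cargo clippy --fix
```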
### ⛓️ Check supply chain
Check the dependency supply chain: licenses (only accept dependencies with OSI- or FSF-approved licenses) and vulnerabilities (CVE advisories).
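A common way to do this, assuming cargo-deny is configured for the repository (`cargo install cargo-deny`):

```sh
cargo deny check
```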
Update dependencies in Cargo.lock:
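```sh
cargo update
```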
### 🏷️ Release
Dry run:
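Assuming cargo-release is used (`cargo install cargo-release`), which defaults to a dry run:

```sh
cargo release patch
```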
Or use `minor` / `major` instead of `patch`.
Create release:
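With cargo-release (assumed above), add `--execute` to actually perform the release:

```sh
cargo release patch --execute
```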