
🔭 SPARQL MCP server


A Model Context Protocol (MCP) server to help users write SPARQL queries for open-access SPARQL endpoints, developed for the SIB Expasy portal.

The server automatically indexes metadata for the SPARQL endpoints listed in a JSON config file, such as query examples and class schemas.

🧩 Endpoints

The HTTP API comprises two main endpoints:

  • /mcp: MCP server that searches the indexed SPARQL endpoint metadata for relevant information to answer a user question

    • Uses rmcp with Streamable HTTP transport

    • 🧰 Available tools:

      • access_sparql_resources: retrieve relevant information about the resources to help build a SPARQL query answering the question (query examples, class schemas)

      • get_resources_info: retrieve relevant information about the SPARQL endpoint resources themselves (e.g. description, list of available endpoints)

      • execute_sparql: execute a SPARQL query against a given endpoint

  • /chat: optional HTTP POST endpoint (JSON) to query the MCP server via an LLM provider

    • Uses axum, utoipa for OpenAPI spec generation, llm to interact with LLM providers (e.g. Mistral, OpenAI)

    • Supports streaming responses: the requested tool call, then the tool call results, and the final results.
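For illustration, a tool invocation over the Streamable HTTP transport follows the standard MCP JSON-RPC `tools/call` shape; the argument names below are assumptions, not the server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_sparql",
    "arguments": {
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "query": "SELECT * WHERE { ?s ?p ?o } LIMIT 10"
    }
  }
}
```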

🚀 Use

Use it through the sparql-mcp package from PyPI:

uvx sparql-mcp ./sparql-mcp.json

Or download the binary corresponding to your architecture from the releases page.

🛠️ Development

IMPORTANT

Requirements:

  • Rust

  • Protobuf installed (e.g. brew install protobuf)

  • An API key for an LLM provider (Mistral.ai or OpenAI); the free tier works, you just need to log in

Recommended VSCode extension: rust-analyzer

📥 Install dev dependencies

rustup update
cargo install cargo-release cargo-deny cargo-watch git-cliff

Create a .cargo/config.toml file with your Mistral API key or OpenAI API key:

[env]
MISTRAL_API_KEY = "YOUR_API_KEY"
OPENAI_API_KEY = "YOUR_API_KEY"
GROQ_API_KEY = "YOUR_API_KEY"

⚡️ Start dev server

Start the MCP server in dev mode at http://localhost:8000/mcp, with the OpenAPI UI at http://localhost:8000/docs:

cargo run

Customize server configuration through CLI arguments:

cargo run -- --force-index --mcp-only --db-path ./data/lancedb

Provide a custom list of endpoints through a .json file:

cargo run -- ./sparql-mcp.json

Example sparql-mcp.json:

{
  "endpoints": [
    {
      "label": "UniProt",
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "description": "UniProt is a comprehensive resource for protein sequence and annotation data."
    },
    {
      "label": "Bgee",
      "endpoint_url": "https://www.bgee.org/sparql/",
      "description": "Bgee is a database for retrieval and comparison of gene expression patterns across multiple animal species.",
      "homepage_url": "https://www.bgee.org/"
    }
  ]
}
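As a quick sanity check before starting the server, a short Python sketch (assuming only the fields shown in the example above) can verify that a config file has the expected shape:

```python
import json

# Fields every endpoint entry is expected to carry (per the example above);
# homepage_url is treated as optional.
REQUIRED_FIELDS = {"label", "endpoint_url", "description"}

def validate_config(text: str) -> list[str]:
    """Parse a sparql-mcp config and return the labels of well-formed entries."""
    config = json.loads(text)
    labels = []
    for entry in config["endpoints"]:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"endpoint entry missing fields: {sorted(missing)}")
        labels.append(entry["label"])
    return labels

example = '''{"endpoints": [
  {"label": "UniProt", "endpoint_url": "https://sparql.uniprot.org/sparql/",
   "description": "Protein sequence and annotation data."}
]}'''

print(validate_config(example))  # → ['UniProt']
```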
TIP

Run and reload on change to the code:

cargo watch -x run
NOTE

Example curl request:

curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: SECRET_KEY" \
  -d '{"messages": [{"role": "user", "content": "What is the HGNC symbol for the P68871 protein?"}], "model": "mistral/mistral-small-latest", "stream": true}'

Recommended model per supported provider:

  • openai/gpt-4.1

  • mistralai/mistral-large-latest

  • groq/moonshotai/kimi-k2-instruct
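A recommended model string plugs into the `model` field of the request body; for example (the question text is illustrative):

```json
{
  "messages": [{"role": "user", "content": "Which classes are available in the UniProt endpoint?"}],
  "model": "groq/moonshotai/kimi-k2-instruct",
  "stream": false
}
```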

🔌 Connect an MCP client

Follow your client's instructions, and use the /mcp URL of your deployed server (e.g. http://localhost:8000/mcp).

๐Ÿ™ VSCode GitHub Copilot

Add a new MCP server through the VSCode UI:

  • Open the Command Palette (ctrl+shift+p or cmd+shift+p)

  • Search for MCP: Add Server...

  • Choose HTTP, and provide the MCP server URL http://localhost:8000/mcp

Your VSCode mcp.json should look like:

{
  "servers": {
    "sparql-mcp-server": {
      "url": "http://localhost:8000/mcp",
      "type": "http"
    }
  },
  "inputs": []
}

📦 Build for production

Build the binary in target/release/:

cargo build --release
NOTE

Start the server (adjust the flags as needed):

./target/release/sparql-mcp ./sparql-mcp.json --force-index

Or start it from the python wheel:

uvx --from ./target/release/sparql_mcp-0.1.0-py3-none-any.whl sparql-mcp

🐍 Build python package

Requires uv to be installed.

Bundle the CLI as a python package in target/wheels:

uvx maturin build

๐Ÿณ Deploy with Docker

Create a keys.env file with the API keys:

MISTRAL_API_KEY=YOUR_API_KEY
SEARCH_API_KEY=SECRET_KEY_YOU_CAN_USE_IN_FRONTEND_TO_AVOID_SPAM
TIP

SEARCH_API_KEY adds a layer of protection against bots that might spam the LLM. If it is not provided, no API key is required to query the API.

Build and deploy the service:

docker compose up
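For reference, a minimal compose file for this setup might look like the following; the service name, port, and volume path are assumptions based on the dev server defaults, not the project's actual compose file:

```yaml
services:
  sparql-mcp:
    build: .
    env_file:
      - keys.env        # MISTRAL_API_KEY, SEARCH_API_KEY
    ports:
      - "8000:8000"     # /mcp and /chat endpoints
    volumes:
      - ./data:/app/data  # persist the index database across restarts
```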

🧼 Format & lint

Automatically format the codebase using rustfmt:

cargo fmt

Lint with clippy:

cargo clippy --all

Automatically apply possible fixes:

cargo fix

⛓️ Check supply chain

Check the dependency supply chain: licenses (only dependencies with OSI- or FSF-approved licenses are accepted) and vulnerabilities (CVE advisories).

cargo deny check
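cargo-deny reads its policy from a deny.toml file; a minimal sketch matching the policy above (the exact allow-list is an assumption, adjust it to your dependency tree) could be:

```toml
# deny.toml (sketch)
[licenses]
allow = ["MIT", "Apache-2.0", "BSD-3-Clause"]

[advisories]
yanked = "deny"
```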

Update dependencies in Cargo.lock:

cargo update

๐Ÿท๏ธ Release

Dry run:

cargo release patch

Or use minor / major instead of patch.

Create release:

cargo release patch --execute
