SPARQL MCP Server

by sib-swiss

🔭 SPARQL MCP server

A Model Context Protocol (MCP) server to help users write SPARQL queries for open-access SPARQL endpoints, developed for the SIB Expasy portal.

The server automatically indexes metadata from the list of SPARQL endpoints defined in a JSON config file, such as query examples and classes schemas.

🧩 Endpoints

The HTTP API comprises 2 main endpoints:

  • /mcp: MCP server that retrieves relevant metadata from the indexed SPARQL endpoints to help answer a user question
    • Uses rmcp with Streamable HTTP transport
    • 🧰 Available tools:
      • access_sparql_resources: retrieve relevant information about the resources to help build a SPARQL query to answer the question (query examples, classes schema)
      • get_resources_info: retrieve relevant information about the SPARQL endpoints resources themselves (e.g. description, list of available endpoints)
      • execute_sparql: execute a SPARQL query against a given endpoint
  • /chat: optional HTTP POST endpoint (JSON) to query the MCP server via an LLM provider
    • Uses axum, utoipa for OpenAPI spec generation, llm to interact with LLM providers (e.g. Mistral, OpenAI)
    • Supports streaming responses: first the requested tool calls, then the tool call results, then the final answer.
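For example, the kind of query execute_sparql could run against the UniProt endpoint (an illustrative sketch; up:Protein is standard UniProt core vocabulary, but the query itself is not taken from the server):

```sparql
# List a few proteins from the UniProt SPARQL endpoint
PREFIX up: <http://purl.uniprot.org/core/>
SELECT ?protein
WHERE {
  ?protein a up:Protein .
}
LIMIT 5
```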

🚀 Use

Use it through the sparql-mcp package published on PyPI:

uvx sparql-mcp ./sparql-mcp.json

Or download the binary corresponding to your architecture from the releases page.

🛠️ Development

Important

Requirements:

  • Rust
  • Protobuf installed (e.g. brew install protobuf)
  • An API key for an LLM provider (Mistral.ai or OpenAI); the free tier is enough, you just need to log in

Recommended VSCode extension: rust-analyzer

📥 Install dev dependencies

rustup update
cargo install cargo-release cargo-deny cargo-watch git-cliff

Create a .cargo/config.toml file with your Mistral API key or OpenAI API key:

[env]
MISTRAL_API_KEY = "YOUR_API_KEY"
OPENAI_API_KEY = "YOUR_API_KEY"
GROQ_API_KEY = "YOUR_API_KEY"

⚡️ Start dev server

Start the MCP server in dev at http://localhost:8000/mcp, with OpenAPI UI at http://localhost:8000/docs

cargo run

Customize server configuration through CLI arguments:

cargo run -- --force-index --mcp-only --db-path ./data/lancedb

Provide a custom list of servers through a .json file with:

cargo run -- ./sparql-mcp.json

Example sparql-mcp.json:

{
  "endpoints": [
    {
      "label": "UniProt",
      "endpoint_url": "https://sparql.uniprot.org/sparql/",
      "description": "UniProt is a comprehensive resource for protein sequence and annotation data."
    },
    {
      "label": "Bgee",
      "endpoint_url": "https://www.bgee.org/sparql/",
      "description": "Bgee is a database for retrieval and comparison of gene expression patterns across multiple animal species.",
      "homepage_url": "https://www.bgee.org/"
    }
  ]
}
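Before starting the server, the config shape can be sanity-checked; the snippet below is a hypothetical helper (not part of sparql-mcp) that relies only on the field names shown in the example above:

```python
import json

# Fields every endpoint entry must provide (homepage_url is optional)
REQUIRED_FIELDS = {"label", "endpoint_url", "description"}

def validate_config(raw: str) -> list[str]:
    """Parse a sparql-mcp.json string and return the endpoint labels.

    Raises ValueError if a required field is missing.
    """
    config = json.loads(raw)
    labels = []
    for endpoint in config.get("endpoints", []):
        missing = REQUIRED_FIELDS - endpoint.keys()
        if missing:
            raise ValueError(f"endpoint missing fields: {sorted(missing)}")
        labels.append(endpoint["label"])
    return labels

example = """
{"endpoints": [
  {"label": "UniProt",
   "endpoint_url": "https://sparql.uniprot.org/sparql/",
   "description": "Protein sequence and annotation data."}
]}
"""
print(validate_config(example))  # ['UniProt']
```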

Tip

Run and reload on change to the code:

cargo watch -x run

Note

Example curl request:

curl -X POST http://localhost:8000/search \
  -H "Content-Type: application/json" \
  -H "Authorization: SECRET_KEY" \
  -d '{"messages": [{"role": "user", "content": "What is the HGNC symbol for the P68871 protein?"}], "model": "mistral/mistral-small-latest", "stream": true}'

Recommended model per supported provider:

  • openai/gpt-4.1
  • mistralai/mistral-large-latest
  • groq/moonshotai/kimi-k2-instruct

🔌 Connect MCP client

Follow the instructions of your client, and use the /mcp URL of your deployed server (e.g. http://localhost:8000/mcp)

🐙 VSCode GitHub Copilot

Add a new MCP server through the VSCode UI:

  • Open the Command Palette (ctrl+shift+p or cmd+shift+p)
  • Search for MCP: Add Server...
  • Choose HTTP, and provide the MCP server URL http://localhost:8000/mcp

Your VSCode mcp.json should look like:

{
  "servers": {
    "sparql-mcp-server": {
      "url": "http://localhost:8000/mcp",
      "type": "http"
    }
  },
  "inputs": []
}

📦 Build for production

Build the binary in target/release/:

cargo build --release

Note

Start the server (adjust flags as needed):

./target/release/sparql-mcp ./sparql-mcp.json --force-index

Or start it from the Python wheel:

uvx --from ./target/release/sparql_mcp-0.1.0-py3-none-any.whl sparql-mcp

🐍 Build python package

Requires uv to be installed.

Bundle the CLI as a Python package in target/wheels:

uvx maturin build

🐳 Deploy with Docker

Create a keys.env file with the API keys:

MISTRAL_API_KEY=YOUR_API_KEY
SEARCH_API_KEY=SECRET_KEY_YOU_CAN_USE_IN_FRONTEND_TO_AVOID_SPAM

Tip

SEARCH_API_KEY adds a layer of protection against bots that might spam the LLM; if it is not provided, no API key is required to query the API.

Build and deploy the service:

docker compose up
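For reference, a minimal compose.yml for this setup could look like the following (the service name, build context, port, and volume path are assumptions; the env_file matches the keys.env created above, and the volume path follows the --db-path example from the dev section):

```yaml
services:
  sparql-mcp:
    build: .
    ports:
      - "8000:8000"
    env_file:
      - keys.env
    volumes:
      # Persist the index between restarts
      - ./data:/app/data
```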

🧼 Format & lint

Automatically format the codebase using rustfmt:

cargo fmt

Lint with clippy:

cargo clippy --all

Automatically apply possible fixes:

cargo fix

⛓️ Check supply chain

Check the dependency supply chain: licenses (only accepting dependencies with OSI- or FSF-approved licenses) and vulnerabilities (CVE advisories).

cargo deny check
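cargo-deny reads its policy from a deny.toml at the repository root; a minimal sketch matching the policy described above (the exact allow list is an assumption, not copied from this repository):

```toml
[licenses]
# Only accept OSI/FSF-approved licenses
allow = ["MIT", "Apache-2.0", "BSD-3-Clause"]

[advisories]
# Refuse crate versions that were yanked from the registry
yanked = "deny"
```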

Update dependencies in Cargo.lock:

cargo update

🏷️ Release

Dry run:

cargo release patch

Or use minor / major instead of patch.

Create release:

cargo release patch --execute