# 🔭 SPARQL MCP server
A Model Context Protocol (MCP) server to help users write SPARQL queries for open-access SPARQL endpoints, developed for the SIB Expasy portal.
The server automatically indexes metadata from the SPARQL endpoints defined in a JSON config file, such as:

- SPARQL query examples,
- Endpoint schemas expressed with the Vocabulary of Interlinked Datasets (VoID), which can be generated automatically using the void-generator.
## 🧩 Endpoints
The HTTP API comprises two main endpoints:
- `/mcp`: MCP server that searches for relevant data to answer a user question using the EOSC Data Commons search API.
  - Uses `rmcp` with Streamable HTTP transport.
  - 🧰 Available tools:
    - `access_sparql_resources`: retrieve relevant information about the resources to help build a SPARQL query to answer the question (query examples, classes schema)
    - `get_resources_info`: retrieve relevant information about the SPARQL endpoint resources themselves (e.g. description, list of available endpoints)
    - `execute_sparql`: execute a SPARQL query against a given endpoint
- `/chat`: optional HTTP POST endpoint (JSON) to query the MCP server via an LLM provider.
## 🚀 Use
Use it through the `sparql-mcp` package published on PyPI:
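A minimal sketch, assuming the PyPI package exposes a matching `sparql-mcp` CLI entry point:

```sh
pip install sparql-mcp
sparql-mcp --help
```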
Or download the binary corresponding to your architecture from the releases page.
## 🛠️ Development
> [!IMPORTANT]
> Requirements:
>
> - Rust
> - Protobuf installed (e.g. `brew install protobuf`)
> - An API key for an LLM provider (Mistral.ai or OpenAI); the free tier works, you just need to log in
Recommended VSCode extension: rust-analyzer
### 📥 Install dev dependencies
Create a `.cargo/config.toml` file with your Mistral API key or OpenAI API key:
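For example, using Cargo's `[env]` table (the variable names below are assumptions; use the one matching your provider):

```toml
# .cargo/config.toml — exports these variables to the process environment
[env]
MISTRAL_API_KEY = "YOUR_API_KEY"
OPENAI_API_KEY = "YOUR_API_KEY"
```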
### ⚡️ Start dev server
Start the MCP server in dev mode at http://localhost:8000/mcp, with the OpenAPI UI at http://localhost:8000/docs:
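A minimal sketch, assuming the standard Cargo workflow:

```sh
cargo run
```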
Customize server configuration through CLI arguments:
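For example, to list the available flags (assuming the CLI follows the usual `--help` convention):

```sh
cargo run -- --help
```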
Provide a custom list of servers through a `.json` file. Example `sparql-mcp.json`:
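A hypothetical sketch of the config shape (the field names are assumptions, not the actual schema; the endpoint URLs are real public SPARQL endpoints):

```json
{
  "servers": [
    { "label": "UniProt", "endpoint": "https://sparql.uniprot.org/sparql" },
    { "label": "Bgee", "endpoint": "https://www.bgee.org/sparql/" }
  ]
}
```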
> [!TIP]
> Run and reload on change to the code:
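For example with the `cargo-watch` helper (an assumption; the original does not name a watcher):

```sh
cargo install cargo-watch
cargo watch -x run
```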
> [!NOTE]
> Example `curl` request:
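A sketch of a request to the `/chat` endpoint, assuming an OpenAI-style `messages` payload (the exact JSON schema may differ):

```sh
curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Which rat proteins are expressed in the brain?"}]}'
```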
Recommended model per supported provider:

- `openai/gpt-4.1`
- `mistralai/mistral-large-latest`
- `groq/moonshotai/kimi-k2-instruct`
## 🔌 Connect MCP client
Follow the instructions of your client, and use the `/mcp` URL of your deployed server (e.g. http://localhost:8000/mcp).
### 🐙 VSCode GitHub Copilot
Add a new MCP server through the VSCode UI:

- Open the Command Palette (`ctrl+shift+p` or `cmd+shift+p`)
- Search for `MCP: Add Server...`
- Choose `HTTP`, and provide the MCP server URL http://localhost:8000/mcp
Your VSCode `mcp.json` should look like:
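A minimal example (the server name `sparql-mcp` is arbitrary):

```json
{
  "servers": {
    "sparql-mcp": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
```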
### 📦 Build for production
Build the binary in `target/release/`:
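Using the standard Cargo release profile:

```sh
cargo build --release
```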
> [!NOTE]
> Start the server with (change flags at your convenience):
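A sketch, assuming the binary name matches the package name:

```sh
./target/release/sparql-mcp
```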
Or start it using the Python wheel:
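A sketch, assuming a wheel already built in `target/wheels/` (see the next section; the wheel filename pattern is an assumption):

```sh
pip install target/wheels/sparql_mcp-*.whl
sparql-mcp
```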
### 🐍 Build python package
Requires `uv` to be installed.
Bundle the CLI as a Python package in `target/wheels`:
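A sketch using `maturin` via `uvx` (assuming the project is set up for maturin, which writes wheels to `target/wheels/` by default):

```sh
uvx maturin build --release
```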
### 🐳 Deploy with Docker
Create a `keys.env` file with the API keys:
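A sketch of the expected shape (`SEARCH_API_KEY` is described in the tip below; the LLM key variable names are assumptions):

```sh
MISTRAL_API_KEY=YOUR_API_KEY
OPENAI_API_KEY=YOUR_API_KEY
SEARCH_API_KEY=YOUR_SECRET_KEY
```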
> [!TIP]
> `SEARCH_API_KEY` can be used to add a layer of protection against bots that might spam the LLM. If it is not provided, no API key is needed to query the API.
Build and deploy the service:
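Assuming the repository ships a Docker Compose file:

```sh
docker compose up -d --build
```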
### 🧼 Format & lint
Automatically format the codebase using `rustfmt`:
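Using the standard Cargo subcommand:

```sh
cargo fmt
```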
Lint with `clippy`:
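A typical invocation (extra flags such as `--all-targets` are optional):

```sh
cargo clippy
```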
Automatically apply possible fixes:
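With Clippy's built-in autofix (may require `--allow-dirty` on an unclean working tree):

```sh
cargo clippy --fix
```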
### ⛓️ Check supply chain
Check the dependency supply chain: licenses (only dependencies with OSI- or FSF-approved licenses are accepted) and vulnerabilities (CVE advisories).
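A sketch assuming the `cargo-deny` tool, which covers both license and advisory checks:

```sh
cargo install cargo-deny
cargo deny check
```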
Update dependencies in `Cargo.lock`:
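Using the standard Cargo subcommand:

```sh
cargo update
```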
### 🏷️ Release
Dry run:
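A sketch assuming the `cargo-release` tool, which performs a dry run by default:

```sh
cargo release patch
```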
Or `minor`/`major`.
Create release:
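Still assuming `cargo-release`, pass `--execute` to actually perform the release:

```sh
cargo release patch --execute
```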