Integrations
- Uses Docker to run GraphDB and provide SPARQL endpoint functionality
- Provides access to Google Gemini models for text generation, chat completion, and model listing, with support for various Gemini model variants
- Enables running, managing, and interacting with Ollama models, including execution, information retrieval, downloading, listing, deletion, and chat completion
Ontology MCP
Ontology MCP is a Model Context Protocol (MCP) server that connects GraphDB's SPARQL endpoints and Ollama models to Claude. This tool allows Claude to query and manipulate ontology data and leverage various AI models.
Key Features
SPARQL-related functions
- Execute SPARQL queries (`mcp_sparql_execute_query`)
- Execute SPARQL update queries (`mcp_sparql_update`)
- List repositories (`mcp_sparql_list_repositories`)
- List graphs (`mcp_sparql_list_graphs`)
- Get resource information (`mcp_sparql_get_resource_info`)
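For illustration, a call to `mcp_sparql_execute_query` might receive an argument object along these lines; the parameter names (`repository`, `query`) are assumptions about the tool schema rather than a confirmed specification:

```json
{
  "repository": "schemaorg-current-https",
  "query": "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> SELECT ?class ?label WHERE { ?class a rdfs:Class ; rdfs:label ?label } LIMIT 10"
}
```

The repository name here matches the one created in the Get started section below.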
Ollama-related features
- Run a model (`mcp_ollama_run`)
- Show model information (`mcp_ollama_show`)
- Download a model (`mcp_ollama_pull`)
- List models (`mcp_ollama_list`)
- Delete a model (`mcp_ollama_rm`)
- Chat completion (`mcp_ollama_chat_completion`)
- Check container status (`mcp_ollama_status`)
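As a sketch, `mcp_ollama_chat_completion` would plausibly mirror the shape of Ollama's chat API; the field names below are assumptions, not the server's confirmed schema:

```json
{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "Summarize the main classes defined in the loaded ontology." }
  ]
}
```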
OpenAI-related features
- Chat completion (`mcp_openai_chat`)
- Image generation (`mcp_openai_image`)
- Text-to-speech (`mcp_openai_tts`)
- Speech-to-text (`mcp_openai_transcribe`)
- Embedding generation (`mcp_openai_embedding`)
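Likewise, `mcp_openai_embedding` presumably accepts a model name and input text, mirroring the OpenAI embeddings API; treat these field names as assumptions:

```json
{
  "model": "text-embedding-3-small",
  "input": "Ontology-driven knowledge graph of Schema.org classes"
}
```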
Google Gemini-related features
- Text generation (`mcp_gemini_generate_text`)
- Chat completion (`mcp_gemini_chat_completion`)
- List models (`mcp_gemini_list_models`)
- ~~Image generation (`mcp_gemini_generate_images`) using the Imagen model~~ (currently disabled)
- ~~Video generation (`mcp_gemini_generate_videos`) using the Veo models~~ (currently disabled)
- ~~Multimodal content generation (`mcp_gemini_generate_multimodal_content`)~~ (currently disabled)

Note: Gemini's image generation, video generation, and multimodal content generation features are currently disabled due to API compatibility issues.
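For example, `mcp_gemini_generate_text` presumably takes a model identifier from the table below together with a prompt; the field names here are assumptions:

```json
{
  "model": "gemini-2.0-flash",
  "prompt": "Explain the difference between rdfs:Class and owl:Class in two sentences."
}
```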
Supported Gemini Models
Model | Model ID | Input | Output | Optimization goal |
---|---|---|---|---|
Gemini 2.5 Flash Preview | gemini-2.5-flash-preview-04-17 | Audio, images, video, text | Text | Adaptive thinking, cost-effectiveness |
Gemini 2.5 Pro Preview | gemini-2.5-pro-preview-03-25 | Audio, images, video, text | Text | Enhanced thinking and reasoning, multimodal understanding, advanced coding |
Gemini 2.0 Flash | gemini-2.0-flash | Audio, images, video, text | Text, images (experimental), audio (coming soon) | Next-generation capabilities, speed, thinking, real-time streaming, multimodal generation |
Gemini 2.0 Flash-Lite | gemini-2.0-flash-lite | Audio, images, video, text | Text | Cost-effectiveness and low latency |
Gemini 1.5 Flash | gemini-1.5-flash | Audio, images, video, text | Text | Fast and versatile performance across a variety of tasks |
Gemini 1.5 Flash-8B | gemini-1.5-flash-8b | Audio, images, video, text | Text | High-volume, lower-intelligence tasks |
Gemini 1.5 Pro | gemini-1.5-pro | Audio, images, video, text | Text | Complex reasoning tasks requiring more intelligence |
Gemini Embedding | gemini-embedding-exp | Text | Text embeddings | Measuring the relatedness of text strings |
Imagen 3 | imagen-3.0-generate-002 | Text | Images | Google's most advanced image generation model |
Veo 2 | veo-2.0-generate-001 | Text, images | Video | High-quality video generation |
Gemini 2.0 Flash Live | gemini-2.0-flash-live-001 | Audio, video, text | Text, audio | Low-latency, two-way voice and video interaction |
HTTP request functions
- Execute HTTP requests (`mcp_http_request`) to communicate with external APIs using HTTP methods such as GET, POST, PUT, and DELETE
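A hedged example of `mcp_http_request` arguments, assuming the tool accepts `method`, `url`, `headers`, and `body` fields (the actual schema may differ):

```json
{
  "method": "POST",
  "url": "https://httpbin.org/post",
  "headers": { "Content-Type": "application/json" },
  "body": "{\"query\": \"example\"}"
}
```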
Get started
1. Clone the repository
2. Run the GraphDB Docker container
Start the GraphDB server from the project root directory.
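The repository's exact command is not reproduced here; assuming it ships a docker-compose.yml (or an equivalent `docker run` setup), starting GraphDB would look roughly like:

```bash
# Sketch only - the actual Compose file, service name, or image may differ
docker-compose up -d
```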
The GraphDB web interface runs at http://localhost:7200.
3. Build and run the MCP server
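Assuming a typical Node.js/TypeScript MCP server layout with npm scripts (an assumption about this project's tooling), building would look roughly like:

```bash
# Hypothetical script names - check the project's package.json for the real ones
npm install
npm run build
```

The server itself is usually launched by Claude Desktop using the configuration added in step 5, so a separate long-running process is typically not needed.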
4. Import RDF data
Go to the GraphDB web interface (http://localhost:7200) and do the following:
- Create a repository:
  - "Setup" → "Repositories" → "Create new repository"
  - Repository ID: `schemaorg-current-https` (or any name you prefer)
  - Repository title: "Schema.org"
  - Click "Create"
- Import sample data:
  - Select the repository you created
  - "Import" → "RDF" → "Upload RDF files"
  - Upload an example file from the `imports` directory (e.g. `imports/example.ttl`)
  - Click "Import"

Note: The project includes example RDF files in the `imports` directory.
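The bundled files are not reproduced here, but a minimal Turtle document of the kind this import expects could look like this (illustrative only, not the actual contents of `imports/example.ttl`):

```turtle
@prefix schema: <https://schema.org/> .
@prefix ex:     <http://example.org/> .

ex:alice a schema:Person ;
    schema:name "Alice" ;
    schema:knows ex:bob .

ex:bob a schema:Person ;
    schema:name "Bob" .
```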
5. Set up Claude Desktop
To use Ontology MCP in Claude Desktop, you need to update the MCP settings file:
- Open the Claude Desktop settings file:
  - Windows: %AppData%\Claude\claude_desktop_config.json
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Linux: ~/.config/Claude/claude_desktop_config.json
- Add the server settings shown in the example after this list. IMPORTANT: change the path in `args` to the actual absolute path of your project's build directory.
- Restart Claude Desktop
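A minimal sketch of what that entry could look like, assuming the server is launched with Node.js from a compiled `build/index.js`; the server key (`ontology-mcp`) and the path are placeholders, not values confirmed by the project:

```json
{
  "mcpServers": {
    "ontology-mcp": {
      "command": "node",
      "args": ["C:/absolute/path/to/ontology-mcp/build/index.js"]
    }
  }
}
```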
License
This project is provided under the MIT License. See the LICENSE file for details.