Display detailed information about a specific Ollama model by name, letting users retrieve its metadata for integration and use within the Ontology MCP server.
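A tool like this typically wraps Ollama's public `/api/show` endpoint. Below is a minimal sketch of that underlying call; the exact tool name and argument schema exposed by Ontology MCP are assumptions, and the model name is only an example.

```python
# Sketch: fetch model metadata directly from a local Ollama instance via the
# public /api/show endpoint (the call a "show model info" tool usually wraps).
import requests

OLLAMA_URL = "http://localhost:11434"

def show_model(name: str) -> dict:
    """Return metadata (modelfile, parameters, template, details) for a model."""
    resp = requests.post(f"{OLLAMA_URL}/api/show", json={"model": name}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    info = show_model("llama3.2")                 # example model name
    print(info.get("details", {}))                # e.g. family, parameter size, quantization
    print(info.get("parameters", ""))             # runtime parameters from the Modelfile
```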
Download models from the Ollama registry to integrate with Ontology MCP, enabling AI-driven querying and manipulation of ontology data via SPARQL endpoints.
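Pulling a model goes through Ollama's `/api/pull` endpoint, which streams progress as newline-delimited JSON. The sketch below shows that raw call; how the Ontology MCP tool surfaces the progress to clients is an assumption.

```python
# Sketch: pull a model from the Ollama registry and print streamed progress.
import json
import requests

OLLAMA_URL = "http://localhost:11434"

def pull_model(name: str) -> None:
    with requests.post(f"{OLLAMA_URL}/api/pull",
                       json={"model": name},
                       stream=True, timeout=None) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if line:
                status = json.loads(line)
                # "status" describes the phase; "completed"/"total" appear while downloading.
                print(status.get("status"), status.get("completed", ""), status.get("total", ""))

if __name__ == "__main__":
    pull_model("llama3.2")   # example model name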
Check the status of Ollama servers connected to the Ontology MCP, enabling real-time monitoring of AI models for ontology data querying and manipulation.
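A status check against a local Ollama server can be as simple as hitting the root path (which responds when the server is up) and `/api/tags` (which lists locally available models). How Ontology MCP aggregates and reports this status is an assumption; this is just the underlying probe.

```python
# Sketch: basic health check for a local Ollama server.
import requests

def ollama_status(base_url: str = "http://localhost:11434") -> dict:
    try:
        alive = requests.get(base_url, timeout=5).ok
        models = requests.get(f"{base_url}/api/tags", timeout=5).json().get("models", [])
        return {"running": alive, "models": [m["name"] for m in models]}
    except requests.ConnectionError:
        return {"running": False, "models": []}

print(ollama_status())
```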
Retrieve detailed information about a specific AI model by providing its name. Use this tool to understand model configurations and capabilities on the MCP Ollama Server.
The MCP Ollama server integrates Ollama models with MCP clients, letting users list available models, retrieve detailed model information, and ask the models questions (see the sketch below).
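From the client side, these capabilities are reached through standard MCP tool calls. The sketch below uses the official Python MCP SDK (`mcp` package) over stdio; the server launch command and the tool name `ask_model` are assumptions, so check the server's own tool listing for the real names and argument schemas.

```python
# Sketch: an MCP client discovering and calling tools on an Ollama-backed server.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Assumed launch command; adjust to however the server is installed.
    server = StdioServerParameters(command="uvx", args=["mcp-ollama"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            tools = await session.list_tools()           # discover what the server exposes
            print([t.name for t in tools.tools])

            # Hypothetical tool call: ask a local model a question through the server.
            result = await session.call_tool("ask_model",
                                             {"model": "llama3.2",
                                              "question": "What is SPARQL?"})
            print(result.content)

asyncio.run(main())
```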
An interactive chat interface that combines Ollama's LLM capabilities with PostgreSQL database access through the Model Context Protocol (MCP). Ask questions about your data in natural language and get AI-powered responses backed by real SQL queries.
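The idea behind such an interface, reduced to its core loop, is: have the model draft a SQL query for the user's question, run it against PostgreSQL, then have the model phrase the answer. The sketch below is not the project's actual code; the model name, connection string, and prompts are placeholders.

```python
# Sketch: natural-language question -> SQL -> query result -> natural-language answer.
import ollama          # pip install ollama
import psycopg2        # pip install psycopg2-binary

MODEL = "llama3.2"
DSN = "dbname=mydb user=postgres host=localhost"

def ask(question: str) -> str:
    # 1. Ask the model to translate the question into SQL.
    draft = ollama.chat(model=MODEL, messages=[
        {"role": "system", "content": "Reply with a single PostgreSQL SELECT statement only."},
        {"role": "user", "content": question},
    ])
    sql = draft["message"]["content"].strip().strip("`")

    # 2. Execute the query (a real tool would validate the SQL and restrict it to read-only).
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(sql)
        rows = cur.fetchall()

    # 3. Ask the model to summarize the rows for the user.
    answer = ollama.chat(model=MODEL, messages=[
        {"role": "user",
         "content": f"Question: {question}\nSQL: {sql}\nRows: {rows}\nAnswer concisely."},
    ])
    return answer["message"]["content"]

print(ask("How many customers signed up last month?"))
```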
This is a Model Context Protocol (MCP) server adaptation of LangChain Ollama Deep Researcher. It exposes its deep research capabilities as MCP tools for use within the MCP ecosystem, allowing AI assistants to perform in-depth research on topics locally via Ollama.
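At a conceptual level, this kind of deep research loop drafts a search query with a local model, gathers sources, updates a running summary, and reflects to pick the next query. The sketch below illustrates only that loop; `web_search` is a hypothetical placeholder for whatever search backend the server is configured with, and this is not the project's actual implementation.

```python
# Conceptual sketch of an iterative research loop driven by a local Ollama model.
import ollama

MODEL = "llama3.2"

def web_search(query: str) -> str:
    """Placeholder: return concatenated snippets from a search API of your choice."""
    raise NotImplementedError

def research(topic: str, iterations: int = 3) -> str:
    summary, query = "", topic
    for _ in range(iterations):
        sources = web_search(query)
        # Fold the new sources into the running summary.
        summary = ollama.chat(model=MODEL, messages=[
            {"role": "user",
             "content": f"Topic: {topic}\nExisting summary: {summary}\n"
                        f"New sources: {sources}\nUpdate the summary."},
        ])["message"]["content"]
        # Reflect: ask for the next follow-up query.
        query = ollama.chat(model=MODEL, messages=[
            {"role": "user",
             "content": f"Summary so far: {summary}\n"
                        "Suggest one follow-up web search query, and nothing else."},
        ])["message"]["content"]
    return summary
```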