Retrieve hierarchical children of a specific ontology term, including subclasses and 'part of' relationships, from the Ontology Lookup Service (OLS). Useful for exploring term dependencies and structure in biomedical ontologies.
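For a sense of what this tool does under the hood, here is a minimal sketch of the equivalent direct call against the OLS v4 REST API (the `hierarchicalChildren` endpoint and HAL response shape follow the public OLS docs; the GO term is just an example). Note that OLS expects the term IRI to be double URL-encoded in the path.

```python
import urllib.parse
import requests

def hierarchical_children(ontology: str, term_iri: str) -> list[dict]:
    """Fetch direct hierarchical children (subclass + 'part of') of a term."""
    # OLS requires the term IRI to be double URL-encoded in the path.
    encoded = urllib.parse.quote(urllib.parse.quote(term_iri, safe=""), safe="")
    url = (f"https://www.ebi.ac.uk/ols4/api/ontologies/"
           f"{ontology}/terms/{encoded}/hierarchicalChildren")
    resp = requests.get(url, headers={"Accept": "application/json"})
    resp.raise_for_status()
    # Responses are HAL-formatted; child terms live under _embedded.terms.
    return resp.json().get("_embedded", {}).get("terms", [])

# Example: direct children of GO:0008150 ("biological_process")
for term in hierarchical_children("go", "http://purl.obolibrary.org/obo/GO_0008150"):
    print(term["obo_id"], term["label"])
```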
Install and configure Ollama locally on macOS without Homebrew or sudo. Downloads directly to ~/Applications, works on locked-down systems, and sets up the default model for local inference.
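A rough sketch of that no-sudo flow in Python; the archive URL, the CLI path inside the app bundle, and the `llama3.2` starter model are all assumptions to verify against the current Ollama release.

```python
import subprocess
import urllib.request
from pathlib import Path

APPS = Path.home() / "Applications"   # user-writable, so no sudo required
ARCHIVE = APPS / "Ollama-darwin.zip"
# Assumed release URL; check https://ollama.com/download for the current one.
URL = "https://ollama.com/download/Ollama-darwin.zip"

APPS.mkdir(exist_ok=True)
urllib.request.urlretrieve(URL, ARCHIVE)
# Extract with macOS's ditto rather than Python's zipfile so executable
# permissions inside the .app bundle survive extraction.
subprocess.run(["/usr/bin/ditto", "-x", "-k", str(ARCHIVE), str(APPS)], check=True)
ARCHIVE.unlink()

# The ollama CLI ships inside the app bundle (path is an assumption);
# use it to pull a starter model for local inference.
ollama = APPS / "Ollama.app" / "Contents" / "Resources" / "ollama"
subprocess.run([str(ollama), "pull", "llama3.2"], check=True)
```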
MCP Ollama server integrates Ollama models with MCP clients, letting users list available models, retrieve detailed model information, and ask questions of any model directly.
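Under the hood these operations map onto Ollama's local REST API; a minimal sketch of the list/show/ask calls such a server would ultimately issue (endpoint shapes follow Ollama's API docs, and `localhost:11434` is the default port):

```python
import requests

OLLAMA = "http://localhost:11434"

# List locally available models (GET /api/tags).
models = requests.get(f"{OLLAMA}/api/tags").json()["models"]
for m in models:
    print(m["name"], m.get("size"))

# Get detailed information about one model (POST /api/show).
info = requests.post(f"{OLLAMA}/api/show", json={"model": models[0]["name"]}).json()
print(info.get("parameters"), info.get("template"))

# Ask a one-off question (POST /api/generate, non-streaming).
answer = requests.post(
    f"{OLLAMA}/api/generate",
    json={"model": models[0]["name"], "prompt": "What is MCP?", "stream": False},
).json()
print(answer["response"])
```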
Enables interaction with locally running Ollama models through chat, generation, and model management operations. Supports listing, downloading, and deleting models, and maintains conversation history across interactive sessions.
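The conversation-history part amounts to resending the accumulated message list on every turn; a sketch against Ollama's `/api/chat` endpoint, with the model name as a placeholder:

```python
import requests

OLLAMA = "http://localhost:11434"
history: list[dict] = []   # grows turn by turn; this is the conversation state

def chat(user_text: str, model: str = "llama3.2") -> str:
    history.append({"role": "user", "content": user_text})
    resp = requests.post(
        f"{OLLAMA}/api/chat",
        json={"model": model, "messages": history, "stream": False},
    ).json()
    reply = resp["message"]
    history.append(reply)   # keep the assistant turn so follow-ups have context
    return reply["content"]

print(chat("Name three biomedical ontologies."))
print(chat("Which of those covers anatomy?"))   # follow-up relies on history
```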
Enables complete local Ollama management through natural language commands: listing models, chatting with local LLMs, starting and stopping the server, and getting intelligent model recommendations for specific tasks.
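The download/delete operations likewise reduce to plain HTTP calls against the local server; a sketch assuming Ollama's `/api/pull` and `/api/delete` endpoints (starting and stopping the server itself is an OS-level action, e.g. launching `ollama serve`, and is not shown):

```python
import requests

OLLAMA = "http://localhost:11434"

def pull(model: str) -> None:
    """Download a model into the local store (POST /api/pull)."""
    r = requests.post(f"{OLLAMA}/api/pull", json={"model": model, "stream": False})
    r.raise_for_status()
    print(f"pulled {model}: {r.json().get('status')}")

def delete(model: str) -> None:
    """Remove a model from the local store (DELETE /api/delete)."""
    r = requests.delete(f"{OLLAMA}/api/delete", json={"model": model})
    r.raise_for_status()
    print(f"deleted {model}")

pull("llama3.2")     # placeholder model name
delete("llama3.2")
```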