An MCP server that integrates Ollama models with MCP clients, allowing users to list models, retrieve detailed model information, and ask models questions.
A bridge that enables Claude Code to interact with local Ollama instances for text generation, multi-turn chat, and vision-based analysis. It supports model management tasks such as listing, pulling, and showing model details, as well as generating text embeddings.
Lets Claude initiate Ollama queries and manages a simple note storage system, with the ability to add, summarize, and access notes via custom URIs.
A Model Context Protocol (MCP) server adaptation of LangChain Ollama Deep Researcher. It exposes the deep research capabilities as MCP tools, allowing AI assistants to perform in-depth research on topics locally via Ollama.
A server that enables seamless integration between local Ollama LLM instances and MCP-compatible applications, providing advanced task decomposition, evaluation, and workflow management capabilities.
Enables AI assistants to index and search codebases using semantic search powered by multiple embedding providers (OpenAI, VoyageAI, Gemini, Ollama) and vector database storage.
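Servers like the ones above are registered in an MCP client's configuration; for Claude Desktop this is `claude_desktop_config.json`. A minimal sketch, assuming a hypothetical server distributed as an npm package named `mcp-ollama` (the actual package name and launch command vary per server; `OLLAMA_HOST` and port 11434 are Ollama's standard defaults):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "mcp-ollama"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```

Once registered, the client launches the server over stdio at startup and the server's tools (e.g. listing or querying models) become available to the assistant.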