Best Ollama MCP Servers
Ollama is an open-source project that allows you to run large language models (LLMs) locally on your own hardware, providing a way to use AI capabilities privately without sending data to external services.
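All of the servers below build on the same foundation: a local Ollama instance exposed as a plain HTTP API on port 11434. As a minimal sketch (assuming Ollama's default port and its `/api/generate` endpoint; the model name and prompt are placeholders):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of chunked output.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send one non-streaming generation request to a local Ollama server."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled locally.
    print(generate("llama3.2", "Summarize why local inference helps privacy."))
```

Because everything stays on localhost, no prompt or completion data leaves the machine, which is the property the MCP servers below rely on.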
Why this server?
Integrates with Ollama for local AI-powered natural language to SQL query generation.
WhoDB CLI (Apache 2.0): a powerful database management tool with an interactive TUI, programmatic commands, and an MCP server for AI assistants.

Why this server?
Provides a preset tunnel configuration for Ollama, with authentication enabled on port 11434, so local Ollama instances can be reached securely from outside.
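The listing does not show the preset's actual format. Purely as an illustration, a cloudflared-style config that forwards an external hostname to Ollama's default port might look like the following (the tunnel ID and hostname are placeholders, and authentication would still be layered on top, e.g. via Cloudflare Access):

```yaml
# Hypothetical cloudflared config.yml -- illustrative only, not the server's preset.
tunnel: <tunnel-id>
credentials-file: /etc/cloudflared/<tunnel-id>.json
ingress:
  # Forward the public hostname to the local Ollama instance on its default port.
  - hostname: ollama.example.com
    service: http://localhost:11434
  # Reject anything that does not match a rule above.
  - service: http_status:404
```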
Why this server?
Supports using Ollama as a local LLM provider for generating concept maps and analyzing codebases without sending data to external services.
Mason (MIT): indexes your codebase into a persistent concept map linking features and flows to their implementing files, so AI agents can answer "where is X implemented" without running grep/glob. It also provides pre-edit impact analysis and generates CLAUDE.md files from structured analysis of git history, architectural file sampling, and test mappings.

Why this server?
Supports optional Ollama integration for higher-quality semantic-search embeddings, using models like nomic-embed-text to improve vector search results.
(MIT) A headless semantic MCP server for Obsidian, Logseq, Dendron, Foam, and any markdown folder. Features built-in hybrid semantic search, surgical AST editing, template scaffolding, zero-config local embeddings, and workflow tracking.

Why this server?
Uses Ollama as a local-first model runtime to power the retrieval-ranking learning loop and model-driven memory orchestration.
ContextLattice: an HTTP-first, MCP-compatible memory/context/task orchestrator that persists writes and returns fused recall from specialized stores, with local-first defaults. Primary URL: https://contextlattice.io/ Install: https://contextlattice.io/installation.html Troubleshooting: https://contextlattice.io/troubleshooting.html

Why this server?
Integrates with Ollama for local embedding generation, enabling semantic search, similarity matching, and automated metadata suggestions using locally hosted language models.
(MIT) An MCP server for local knowledge management: semantic search, keywords, and tags.

Why this server?
Integrates with Ollama for local embedding models, supporting document embedding and semantic search functionality.
(MIT) A Model Context Protocol server that enables semantic search by providing tools to manage Qdrant vector database collections, process and embed documents using various embedding services, and perform semantic searches across vector embeddings.

Why this server?
Integrates with Ollama to provide semantic embedding similarity for tool retrieval, enabling cross-language and semantic search capabilities.
(MIT) Graph-based tool retrieval for LLM agents. Builds a tool graph from OpenAPI/MCP specs and retrieves multi-step workflows via hybrid search (BM25 + graph traversal + embeddings), recovering accuracy from 12% to 82% with 79% fewer tokens. Also works as an MCP proxy, aggregating multiple servers behind 3 meta-tools.

Why this server?
Provides Ollama AI models with ServiceNow integration capabilities, allowing local AI models to query, create, update, and automate ServiceNow operations across all supported modules and domains.
(MIT) The most advanced and comprehensive ServiceNow MCP server: 150+ production-ready tools across 17 modules (ITSM, ITOM, HRSD, CSM, SecOps, GRC, Agile, ATF, Flow Designer, Now Assist, and more). Supports multi-instance management, four-tier permission control, 10 role-based tool packages, and OAuth 2.0 + Basic Auth, and integrates with Claude, GPT-4o, Gemini, Cursor, VS Code, and Codex.