Search for: Information about Ollama

  • Why this server?

    Provides RAG capabilities for semantic document search using the Qdrant vector database and Ollama/OpenAI embeddings, allowing users to add, search, list, and delete documentation with metadata support. (A sketch of this embed-and-store pattern appears after this list.)

    Security: - · License: A · Quality: -
    5 · 4 · TypeScript · Apache 2.0
  • Why this server?

    Enables seamless AI integration via Ollama's Deepseek model, providing protocol compliance and automatic configuration for clean AI-driven interactions.

    Security: - · License: F · Quality: -
    1 · Python
  • Why this server?

    A bridge that enables seamless integration of Ollama's local LLM capabilities into MCP-powered applications, allowing users to manage and run AI models locally with full API coverage. (A sketch of the local Ollama API calls such a bridge wraps appears after this list.)

    Security: A · License: F · Quality: A
    10 · 33 · JavaScript · Apple
  • Why this server?

    An open protocol server that implements Anthropic's Model Context Protocol to enable seamless integration between LLM applications and RAG data sources using Sionic AI's Storm Platform.

    Security: - · License: F · Quality: -
    27 · Python · Apple
  • Why this server?

    A Model Context Protocol server that enables LLMs to interact with Elasticsearch clusters, allowing them to manage indices and execute search queries using natural language. (A sketch of the underlying search call appears after this list.)

    Security: - · License: F · Quality: -
    1 · JavaScript
  • Why this server?

    A Model Context Protocol (MCP) server that provides advanced web search, content extraction, web crawling, and scraping capabilities using the Firecrawl API.

    Security: A · License: F · Quality: A
    4 · 1 · Python · Apple · Linux
  • Why this server?

    Enables seamless integration between Ollama's local LLM models and MCP-compatible applications, supporting model management and chat interactions.

    Security: - · License: A · Quality: -
    50 · 13 · TypeScript · MIT License
  • Why this server?

    A flexible memory system for AI applications that supports multiple LLM providers and can be used either as an MCP server or as a direct library integration, enabling autonomous memory management without explicit commands.

    Security: A · License: A · Quality: A
    3 · 746 · 32 · JavaScript · MIT License
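
The first entry above describes the embed-and-store pattern behind Qdrant-backed RAG. The sketch below shows that pattern directly against the public Ollama and Qdrant REST APIs; it is not the listed server's code, and the "docs" collection, the "nomic-embed-text" model, and the localhost ports are assumptions taken from default installs.

```typescript
// Sketch: embed a document with Ollama, store it in Qdrant, then search.
// Assumes Ollama on localhost:11434, Qdrant on localhost:6333, and a "docs"
// collection already created with a vector size matching the embedding model.

async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const { embedding } = await res.json();
  return embedding;
}

async function addDocument(id: number, text: string, metadata: object) {
  const vector = await embed(text);
  // Upsert the point into the Qdrant collection with its metadata payload.
  await fetch("http://localhost:6333/collections/docs/points", {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ points: [{ id, vector, payload: { text, ...metadata } }] }),
  });
}

async function search(query: string, limit = 3) {
  const vector = await embed(query);
  // Nearest-neighbour search over the stored document vectors.
  const res = await fetch("http://localhost:6333/collections/docs/points/search", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ vector, limit, with_payload: true }),
  });
  return (await res.json()).result;
}
```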
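The Ollama bridge entries expose local model management and chat. The sketch below shows the kind of Ollama REST calls such a bridge would wrap; the endpoints are Ollama's documented defaults, while the model name is a placeholder, and this is not the listed servers' own code.

```typescript
// Sketch: local Ollama calls an MCP bridge would typically wrap.
// Assumes a default Ollama install on localhost:11434; "llama3.2" is a placeholder.

async function listLocalModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  const { models } = await res.json();
  return models.map((m: { name: string }) => m.name);
}

async function chat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      messages: [{ role: "user", content: prompt }],
      stream: false, // request a single JSON response instead of a token stream
    }),
  });
  const data = await res.json();
  return data.message.content;
}
```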
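The Elasticsearch entry translates natural language into index management and search. The sketch below shows only the final step, a query against the standard `_search` endpoint, under assumed values: the "logs" index, the "message" and "@timestamp" fields, and a cluster on localhost:9200.

```typescript
// Sketch: the Elasticsearch query such a server might issue once an LLM has
// translated "recent error logs" into query DSL. Index, fields, and endpoint
// are assumptions, not details of the listed server.

async function searchIndex(index: string, query: object, size = 10) {
  const res = await fetch(`http://localhost:9200/${index}/_search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, size }),
  });
  const data = await res.json();
  // Each hit carries the matched document in _source.
  return data.hits.hits.map((h: { _source: unknown }) => h._source);
}

// Example: documents whose "message" field mentions "error" within the last day.
searchIndex("logs", {
  bool: {
    must: [{ match: { message: "error" } }],
    filter: [{ range: { "@timestamp": { gte: "now-1d" } } }],
  },
}).then(console.log);
```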