mcp-server-qdrant


This repository is an example of how to create an MCP server for Qdrant, a vector search engine.

  1. Tools
  2. Prompts
  3. Resources
  4. Server Configuration
  5. README.md

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

No tools

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
QDRANT_URL | No | URL of the Qdrant server, e.g. http://localhost:6333 | -
QDRANT_API_KEY | No | API key for the Qdrant server | -
COLLECTION_NAME | Yes | Name of the collection to use | -
QDRANT_LOCAL_PATH | No | Path to the local Qdrant database | -
FASTEMBED_MODEL_NAME | No | Name of the FastEmbed model to use | sentence-transformers/all-MiniLM-L6-v2
README.md

mcp-server-qdrant: A Qdrant MCP server


The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you’re building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need.

This repository is an example of how to create an MCP server for Qdrant, a vector search engine.

Overview

A basic Model Context Protocol server for keeping and retrieving memories in the Qdrant vector search engine. It acts as a semantic memory layer on top of the Qdrant database.

Components

Tools

  1. qdrant-store-memory
    • Store a memory in the Qdrant database
    • Input:
      • information (string): Memory to store
    • Returns: Confirmation message
  2. qdrant-find-memories
    • Retrieve a memory from the Qdrant database
    • Input:
      • query (string): Query to retrieve a memory
    • Returns: Memories stored in the Qdrant database as separate messages
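Conceptually, the two tools implement an embed-store-search loop. The sketch below illustrates that loop in pure Python with a toy bag-of-words "embedding" and cosine similarity; it is only an illustration of the semantics, not the server's actual implementation, which uses FastEmbed embeddings and Qdrant's vector index (the MemoryStore class and its method names are hypothetical):

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector. The real server uses a
    # FastEmbed model such as sentence-transformers/all-MiniLM-L6-v2.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class MemoryStore:
    """Mimics the two tools: store a memory, then find the closest ones."""

    def __init__(self):
        self.memories = []  # list of (text, vector) pairs

    def store(self, information: str) -> str:
        # qdrant-store-memory: embed the text and keep it, return confirmation.
        self.memories.append((information, embed(information)))
        return f"Stored: {information}"

    def find(self, query: str, limit: int = 3) -> list:
        # qdrant-find-memories: embed the query and rank stored memories
        # by similarity, most relevant first.
        q = embed(query)
        ranked = sorted(self.memories, key=lambda m: cosine(q, m[1]), reverse=True)
        return [text for text, _ in ranked[:limit]]


store = MemoryStore()
store.store("The deployment password is rotated every Friday")
store.store("Qdrant runs on port 6333 by default")
print(store.find("which port does qdrant use?"))
```

The real server performs the same two operations, but with dense neural embeddings and an approximate-nearest-neighbour index, so retrieval works on meaning rather than exact word overlap.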

Installation

Using uv (recommended)

When using uv, no specific installation is needed to run mcp-server-qdrant directly.

uv run mcp-server-qdrant \
  --qdrant-url "http://localhost:6333" \
  --qdrant-api-key "your_api_key" \
  --collection-name "my_collection" \
  --fastembed-model-name "sentence-transformers/all-MiniLM-L6-v2"

Installing via Smithery

To install Qdrant MCP Server for Claude Desktop automatically via Smithery:

npx @smithery/cli install mcp-server-qdrant --client claude

Usage with Claude Desktop

To use this server with the Claude Desktop app, add the following configuration to the "mcpServers" section of your claude_desktop_config.json:

{
  "qdrant": {
    "command": "uvx",
    "args": [
      "mcp-server-qdrant",
      "--qdrant-url",
      "http://localhost:6333",
      "--qdrant-api-key",
      "your_api_key",
      "--collection-name",
      "your_collection_name"
    ]
  }
}

Replace http://localhost:6333, your_api_key, and your_collection_name with your Qdrant server URL, Qdrant API key, and collection name, respectively. Using an API key is optional but recommended for security, and depends on your Qdrant server configuration.

This MCP server will automatically create a collection with the specified name if it doesn't exist.

By default, the server will use the sentence-transformers/all-MiniLM-L6-v2 embedding model to encode memories. For the time being, only FastEmbed models are supported, and you can change it by passing the --fastembed-model-name argument to the server.

Using the local mode of Qdrant

To use a local mode of Qdrant, you can specify the path to the database using the --qdrant-local-path argument:

{
  "qdrant": {
    "command": "uvx",
    "args": [
      "mcp-server-qdrant",
      "--qdrant-local-path",
      "/path/to/qdrant/database",
      "--collection-name",
      "your_collection_name"
    ]
  }
}

This runs Qdrant in local mode inside the same process as the MCP server, which is not recommended for production.

Environment Variables

The server can also be configured using environment variables:

  • QDRANT_URL: URL of the Qdrant server, e.g. http://localhost:6333
  • QDRANT_API_KEY: API key for the Qdrant server
  • COLLECTION_NAME: Name of the collection to use
  • FASTEMBED_MODEL_NAME: Name of the FastEmbed model to use
  • QDRANT_LOCAL_PATH: Path to the local Qdrant database

You cannot provide QDRANT_URL and QDRANT_LOCAL_PATH at the same time.
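Since Claude Desktop lets an mcpServers entry pass environment variables through an "env" block, the same server can be configured without command-line arguments. A sketch, using the same placeholder values as above:

```json
{
  "qdrant": {
    "command": "uvx",
    "args": ["mcp-server-qdrant"],
    "env": {
      "QDRANT_URL": "http://localhost:6333",
      "QDRANT_API_KEY": "your_api_key",
      "COLLECTION_NAME": "your_collection_name"
    }
  }
}
```

As noted above, set either QDRANT_URL or QDRANT_LOCAL_PATH in this block, never both.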

License

This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.

