MCP Embedding Storage Server

An MCP server for storing and retrieving information using vector embeddings via the AI Embeddings API.

Features

  • Store content with automatically generated embeddings

  • Search content using semantic similarity

  • Access content through both tools and resources

  • Use pre-defined prompts for common operations

How It Works

This MCP server connects to the AI Embeddings API, which:

  1. Processes content and breaks it into sections

  2. Generates embeddings for each section

  3. Stores both the content and embeddings in a database

  4. Enables semantic search using vector similarity

When you search, the API finds the most relevant sections of stored content based on the semantic similarity of your query to the stored embeddings.
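To make the ranking step concrete, the sketch below shows cosine-similarity scoring in TypeScript. The section paths and embedding values are invented for illustration only; the real API computes embeddings with a model and stores them in its own database.

// Illustrative sketch of vector-similarity ranking (not the server's actual code).
// Each stored section has an embedding; the query embedding is compared to all of them.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical stored sections with tiny, made-up embeddings.
const sections = [
  { path: "notes/intro", embedding: [0.9, 0.1, 0.0] },
  { path: "notes/setup", embedding: [0.2, 0.8, 0.1] },
];

// Made-up query embedding; in practice the API embeds the query text for you.
const queryEmbedding = [0.85, 0.15, 0.05];

const ranked = sections
  .map((s) => ({ path: s.path, score: cosineSimilarity(queryEmbedding, s.embedding) }))
  .sort((a, b) => b.score - a.score);

console.log(ranked); // the most semantically similar section comes first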

Installation

# Install with npm
npm install -g mcp-embedding-storage

# Or with pnpm
pnpm add -g mcp-embedding-storage

# Or with yarn
yarn global add mcp-embedding-storage

Usage with Claude for Desktop

Add the following configuration to your claude_desktop_config.json file:

{ "mcpServers": { "embedding-storage": { "command": "mcp-embedding-storage" } } }

Then restart Claude for Desktop to connect to the server.

Available Tools

store-content

Stores content with automatically generated embeddings.

Parameters:

  • content: The content to store

  • path: Unique identifier path for the content

  • type (optional): Content type (e.g., 'markdown')

  • source (optional): Source of the content

  • parentPath (optional): Path of the parent content (if applicable)
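For illustration, here is a hypothetical sketch of invoking store-content from a client built with the official MCP TypeScript SDK (@modelcontextprotocol/sdk). The client name, path, and content values are made up, and the exact SDK wiring may differ in your setup.

// Hypothetical client-side sketch; values are illustrative.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio and connect a client to it.
const transport = new StdioClientTransport({ command: "mcp-embedding-storage" });
const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Store a markdown document under a unique path.
const result = await client.callTool({
  name: "store-content",
  arguments: {
    content: "# Vector search\nSemantic search compares embeddings rather than keywords.",
    path: "notes/vector-search",
    type: "markdown",
    source: "personal-notes",
  },
});

console.log(result);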

search-content

Searches for content using vector similarity.

Parameters:

  • query: The search query

  • maxMatches (optional): Maximum number of matches to return
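A matching sketch for search-content, with the same hypothetical client wiring and an illustrative query:

// Hypothetical client-side sketch; values are illustrative.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({ command: "mcp-embedding-storage" });
const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Ask for the three most similar stored sections.
const result = await client.callTool({
  name: "search-content",
  arguments: {
    query: "how does vector search work?",
    maxMatches: 3,
  },
});

console.log(result);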

Available Resources

search://{query}

Resource template for searching content.

Example usage: search://machine learning basics

Available Prompts

store-new-content

A prompt to help store new content with embeddings.

Parameters:

  • path: Unique identifier path for the content

  • content: The content to store

search-knowledge

A prompt to help search the stored knowledge base.

Parameters:

  • query: The search query
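As a sketch of how a client might request the search-knowledge prompt through the MCP TypeScript SDK (hypothetical wiring, illustrative query):

// Hypothetical client-side sketch; values are illustrative.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({ command: "mcp-embedding-storage" });
const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Fetch the pre-defined prompt, filled in with a query.
const prompt = await client.getPrompt({
  name: "search-knowledge",
  arguments: { query: "machine learning basics" },
});

console.log(prompt.messages);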

API Integration

This MCP server integrates with the AI Embeddings API at https://ai-embeddings.vercel.app/, using the following endpoints:

  1. Generate Embeddings (POST /api/generate-embeddings)

    • Generates embeddings for content and stores them in the database

    • Required parameters: content and path

  2. Vector Search (POST /api/vector-search)

    • Searches for content based on semantic similarity

    • Required parameter: prompt
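For reference, below is a minimal sketch of calling these endpoints directly with fetch. It assumes both endpoints accept and return JSON, uses only the documented required parameters, and makes no assumptions about response shapes.

// Minimal sketch of calling the AI Embeddings API outside of MCP; values are illustrative.
const BASE_URL = "https://ai-embeddings.vercel.app";

async function generateEmbeddings(content: string, path: string): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/api/generate-embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ content, path }),
  });
  return res.json();
}

async function vectorSearch(prompt: string): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/api/vector-search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  return res.json();
}

// Example usage (text and path are illustrative):
await generateEmbeddings("# Notes on embeddings\n...", "notes/embeddings");
console.log(await vectorSearch("how do embeddings work?"));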

Building from Source

# Clone the repository
git clone https://github.com/yourusername/mcp-embedding-storage.git
cd mcp-embedding-storage

# Install dependencies
pnpm install

# Build the project
pnpm run build

# Start the server
pnpm start

License

MIT

