Store MCP Server

A Model Context Protocol (MCP) server that enables AI agents to store and retrieve information persistently.

Project Structure

store_mcp/
├── README.md            # This file
├── pyproject.toml       # Project dependencies and configuration
├── src/
│   └── store_mcp/
│       ├── __init__.py  # Package initialization
│       ├── server.py    # Main MCP server implementation
│       └── storage.py   # Storage backend (JSON/SQLite)
└── tests/
    ├── __init__.py
    └── test_server.py   # Unit tests

Features

  • Store Information: Save key-value pairs or structured data

  • Retrieve Information: Query stored data by key or search criteria

  • List Keys: View all available stored keys

  • Delete Information: Remove stored data when no longer needed

  • Persistent Storage: Data persists across sessions

Installation

# Install dependencies
pip install -e .

Usage

# Run the MCP server
python -m store_mcp.server

MCP Tools

The server exposes the following tools to AI agents (an illustrative registration sketch follows the list):

  • store_data: Store information with a key

  • retrieve_data: Retrieve information by key

  • list_keys: List all stored keys

  • delete_data: Delete stored information by key

  • search_data: Search stored information by pattern or content
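
As an illustration only, server.py might register these tools with the FastMCP helper from the mcp SDK roughly as follows. The Storage class and its set/get methods are assumptions about storage.py, not its actual interface:

# Hypothetical sketch -- the real server.py may be organized differently.
from mcp.server.fastmcp import FastMCP

from store_mcp.storage import Storage  # assumed JSON-backed storage class

mcp = FastMCP("store")
storage = Storage()  # defaults to ~/.store_mcp/data.json

@mcp.tool()
def store_data(key: str, value: str) -> str:
    """Store information under a key."""
    storage.set(key, value)  # assumed Storage method
    return f"Stored '{key}'"

@mcp.tool()
def retrieve_data(key: str) -> str:
    """Retrieve previously stored information by key."""
    return storage.get(key)  # assumed Storage method

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default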

Configuration

MCP Client Configuration

To use this server with an MCP client (like Claude Desktop), add it to your MCP settings configuration file:

For Claude Desktop on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

For Claude Desktop on Windows: %APPDATA%\Claude\claude_desktop_config.json

{ "mcpServers": { "store": { "command": "python", "args": [ "-m", "store_mcp.server" ], "env": { "PYTHONPATH": "/absolute/path/to/store_mcp/src" } } } }

Alternative configuration using uvx (if you have uvx installed):

{ "mcpServers": { "store": { "command": "uvx", "args": [ "--from", "/absolute/path/to/store_mcp", "python", "-m", "store_mcp.server" ] } } }

Storage Configuration

The server uses a local file-based storage system (JSON) located at:

  • Default: ~/.store_mcp/data.json

To use a custom storage location, modify server.py and initialize Storage with a custom path:

storage = Storage("/path/to/custom/data.json")
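
For orientation, a minimal JSON-backed Storage could be implemented along these lines; this is a sketch under assumptions, not the project's actual storage.py:

import json
from pathlib import Path

class Storage:
    """Illustrative key-value store backed by a single JSON file."""

    def __init__(self, path: str | None = None):
        # Fall back to the documented default location.
        self.path = Path(path) if path else Path.home() / ".store_mcp" / "data.json"
        self.path.parent.mkdir(parents=True, exist_ok=True)
        if not self.path.exists():
            self.path.write_text("{}")

    def _load(self) -> dict:
        return json.loads(self.path.read_text())

    def set(self, key: str, value) -> None:
        data = self._load()
        data[key] = value
        self.path.write_text(json.dumps(data, indent=2))

    def get(self, key):
        return self._load().get(key)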

Environment Variables

You can set the following environment variables:

  • STORE_MCP_PATH: Custom path for the storage file (default: ~/.store_mcp/data.json)

Example configuration with custom storage path:

{ "mcpServers": { "store": { "command": "python", "args": ["-m", "store_mcp.server"], "env": { "PYTHONPATH": "/absolute/path/to/store_mcp/src", "STORE_MCP_PATH": "/custom/path/to/storage.json" } } } }

Development

# Run tests
pytest tests/
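
A round-trip test of the kind tests/test_server.py might contain could look like this sketch (the Storage set/get API is assumed):

from store_mcp.storage import Storage

def test_store_and_retrieve(tmp_path):
    # pytest's tmp_path fixture keeps the test isolated from ~/.store_mcp.
    storage = Storage(str(tmp_path / "data.json"))
    storage.set("greeting", "hello")
    assert storage.get("greeting") == "hello"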

Requirements

  • Python 3.10+

  • mcp library



