Memory Context Provider Server

by Srish-ty
# Memory Context Provider (MCP) Server

A server that manages context for LLM interactions, storing and providing the relevant context for each user.

## Features

- In-memory storage of user contexts
- Context management limited to each user's last 5 prompts
- RESTful API endpoints
- TypeScript support

## Setup

1. Install dependencies:

```bash
npm install
```

2. Start the development server:

```bash
npm run dev
```

## API Endpoints

### POST /context/:userId

Add a new prompt to the user's context and get the updated context.

Request body:

```json
{ "prompt": "Your prompt here" }
```

Response:

```json
{ "context": "Combined context from last 5 prompts" }
```

### GET /context/:userId

Get the current context for a user.

Response:

```json
{ "context": "Current context" }
```

### DELETE /context/:userId

Clear the context for a user.

Response:

```json
{ "message": "Context cleared" }
```

## Development

- `npm run dev`: start the development server with hot reload
- `npm run build`: compile the TypeScript files
- `npm start`: run the compiled files
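The core behavior the endpoints describe — keep only each user's last 5 prompts in memory and join them into one context string — can be sketched as a small store class. This is an illustrative sketch, not the project's actual code: the names `ContextStore`, `addPrompt`, `getContext`, and `clear` are assumptions mapped onto the POST/GET/DELETE routes above.

```typescript
// Hypothetical in-memory store mirroring the README's described behavior.
class ContextStore {
  private prompts = new Map<string, string[]>();
  private readonly maxPrompts = 5;

  // POST /context/:userId — append a prompt, trimming to the last 5
  addPrompt(userId: string, prompt: string): string {
    const list = this.prompts.get(userId) ?? [];
    list.push(prompt);
    if (list.length > this.maxPrompts) list.shift(); // drop the oldest
    this.prompts.set(userId, list);
    return this.getContext(userId);
  }

  // GET /context/:userId — combined context from the stored prompts
  getContext(userId: string): string {
    return (this.prompts.get(userId) ?? []).join("\n");
  }

  // DELETE /context/:userId — clear a user's context
  clear(userId: string): void {
    this.prompts.delete(userId);
  }
}

// Example: after six prompts, only "prompt 2" through "prompt 6" remain.
const store = new ContextStore();
for (let i = 1; i <= 6; i++) store.addPrompt("u1", `prompt ${i}`);
console.log(store.getContext("u1"));
```

Because the store is a plain `Map`, contexts vanish on restart, which matches the "in-memory storage" feature; a real deployment would wire these methods to the Express route handlers.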
