by Srish-ty

Memory Context Provider (MCP) Server

A server that manages conversational context for LLM interactions, storing each user's most recent prompts in memory and serving a combined context on request.

Features

  • In-memory storage of user contexts

  • Per-user context built from the last 5 prompts

  • RESTful API endpoints

  • TypeScript support
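The core idea behind the features above can be sketched as a small in-memory store: each user maps to a list of prompts, capped at the 5 most recent, which are joined into one context string. The names below (`ContextStore`, `addPrompt`, etc.) and the newline separator are illustrative assumptions, not the server's actual identifiers:

```typescript
// Minimal sketch of the in-memory context store (assumed design,
// not the server's actual implementation).
const MAX_PROMPTS = 5;

class ContextStore {
  private prompts = new Map<string, string[]>();

  // Append a prompt, keeping only the most recent MAX_PROMPTS entries,
  // and return the updated combined context.
  addPrompt(userId: string, prompt: string): string {
    const list = this.prompts.get(userId) ?? [];
    list.push(prompt);
    if (list.length > MAX_PROMPTS) list.shift(); // drop the oldest
    this.prompts.set(userId, list);
    return this.getContext(userId);
  }

  // Combined context from the stored prompts (separator is an assumption).
  getContext(userId: string): string {
    return (this.prompts.get(userId) ?? []).join("\n");
  }

  // Remove all stored prompts for a user.
  clearContext(userId: string): void {
    this.prompts.delete(userId);
  }
}
```

Because the store is in-memory only, all contexts are lost when the process restarts.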


Setup

  1. Install dependencies:

    npm install

  2. Start the development server:

    npm run dev

API Endpoints

POST /context/:userId

Adds a new prompt to the user's context and returns the updated combined context.

Request body:

{ "prompt": "Your prompt here" }

Response:

{ "context": "Combined context from last 5 prompts" }

GET /context/:userId

Returns the current combined context for a user.

Response:

{ "context": "Current context" }

DELETE /context/:userId

Clears the stored context for a user.

Response:

{ "message": "Context cleared" }
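The three endpoints above can be exercised with curl once the server is running. This sketch assumes the server listens on `http://localhost:3000` (the actual port depends on your configuration) and uses a hypothetical user id of `user123`:

```shell
# Add a prompt to user123's context (returns the updated combined context)
curl -X POST http://localhost:3000/context/user123 \
  -H 'Content-Type: application/json' \
  -d '{ "prompt": "Your prompt here" }'

# Fetch the current context
curl http://localhost:3000/context/user123

# Clear the context
curl -X DELETE http://localhost:3000/context/user123
```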

Development

  • npm run dev: Start development server with hot reload

  • npm run build: Build TypeScript files

  • npm start: Run built files

