Memory Context Provider (MCP) Server
A server that manages context for LLM interactions, storing and providing relevant context for each user.
Features
In-memory storage of user contexts
Context management with last 5 prompts
RESTful API endpoints
TypeScript support
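The storage model behind these features can be sketched as follows. The class and method names here are illustrative, not the server's actual internals; only the behavior (per-user in-memory storage, last 5 prompts) comes from this README:

```typescript
// Illustrative sketch of an in-memory, per-user context store
// that keeps only the most recent 5 prompts.
const MAX_PROMPTS = 5;

class ContextStore {
  private prompts = new Map<string, string[]>();

  // Append a prompt and return the updated combined context.
  add(userId: string, prompt: string): string {
    const history = this.prompts.get(userId) ?? [];
    history.push(prompt);
    if (history.length > MAX_PROMPTS) history.shift(); // drop the oldest prompt
    this.prompts.set(userId, history);
    return this.get(userId);
  }

  // Combined context: the stored prompts joined in order.
  get(userId: string): string {
    return (this.prompts.get(userId) ?? []).join("\n");
  }

  // Remove all stored prompts for a user.
  clear(userId: string): void {
    this.prompts.delete(userId);
  }
}
```

Because everything lives in a `Map`, contexts are lost when the process restarts; that is the trade-off of in-memory storage.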
Setup
Install dependencies:
npm install
Start the development server:
npm run dev
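These npm scripts would typically be declared in `package.json` along these lines (the exact tool choices, such as `ts-node-dev`, are an assumption, not taken from the project):

```json
{
  "scripts": {
    "dev": "ts-node-dev src/index.ts",
    "build": "tsc",
    "start": "node dist/index.js"
  }
}
```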
API Endpoints
POST /context/:userId
Add a new prompt to the user's context and receive the updated context.
Request body:
{
"prompt": "Your prompt here"
}
Response:
{
"context": "Combined context from last 5 prompts"
}
GET /context/:userId
Get current context for a user.
Response:
{
"context": "Current context"
}
DELETE /context/:userId
Clear context for a user.
Response:
{
"message": "Context cleared"
}
Development
npm run dev: Start development server with hot reload
npm run build: Build TypeScript files
npm start: Run built files
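Taken together, the three endpoints can be sketched as a single handler on Node's built-in `http` module. This is an illustrative reimplementation of the API described above, not the server's actual source (which may use a framework such as Express); the route shapes and response bodies follow this README:

```typescript
import http from "node:http";

// Illustrative in-memory store: userId -> last 5 prompts.
const MAX_PROMPTS = 5;
const contexts = new Map<string, string[]>();

const combined = (userId: string): string =>
  (contexts.get(userId) ?? []).join("\n");

const server = http.createServer((req, res) => {
  const match = req.url?.match(/^\/context\/([^/]+)$/);
  if (!match) {
    res.writeHead(404).end();
    return;
  }
  const userId = decodeURIComponent(match[1]);
  const json = (body: unknown) => {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(body));
  };

  if (req.method === "POST") {
    // Collect the request body, then append the prompt.
    let data = "";
    req.on("data", (chunk) => (data += chunk));
    req.on("end", () => {
      const { prompt } = JSON.parse(data) as { prompt: string };
      const history = contexts.get(userId) ?? [];
      history.push(prompt);
      if (history.length > MAX_PROMPTS) history.shift(); // keep last 5 only
      contexts.set(userId, history);
      json({ context: combined(userId) });
    });
  } else if (req.method === "GET") {
    json({ context: combined(userId) });
  } else if (req.method === "DELETE") {
    contexts.delete(userId);
    json({ message: "Context cleared" });
  } else {
    res.writeHead(405).end();
  }
});
```

With `server.listen(3000)`, a request like `curl -X POST localhost:3000/context/alice -d '{"prompt":"hi"}'` would return the combined context for `alice` as JSON.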