Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@MemGPT MCP Server chat what are the main features of Claude 3.5 Sonnet?"
That's it! The server will respond to your query, and you can continue using it as needed.
MemGPT MCP Server
A TypeScript-based MCP server that implements a memory system for LLMs. It provides tools for chatting with different LLM providers while maintaining conversation history.
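To make the design concrete, here is a minimal sketch of how a memory-backed tool can be exposed over stdio with the MCP TypeScript SDK. This is illustrative only and not the repository's actual code: the in-memory array, the tool wiring, and the use of the high-level McpServer API are assumptions about how such a server can be structured.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical in-memory store; the real server may persist history differently.
type Memory = { role: "user" | "assistant"; content: string; timestamp: string };
const memories: Memory[] = [];

const server = new McpServer({ name: "memgpt", version: "0.1.0" });

// Sketch of a get_memory tool: null means unlimited history,
// an omitted limit falls back to a default of 10 recent entries.
server.tool(
  "get_memory",
  { limit: z.number().nullable().optional() },
  async ({ limit }) => {
    const items = limit === null ? memories : memories.slice(-(limit ?? 10));
    return { content: [{ type: "text", text: JSON.stringify(items, null, 2) }] };
  }
);

// MCP servers for Claude Desktop communicate over stdio.
const transport = new StdioServerTransport();
await server.connect(transport);
```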
Features
Tools
chat - Send a message to the current LLM provider (example calls for these tools are shown after this list)
Takes a message parameter
Supports multiple providers (OpenAI, Anthropic, OpenRouter, Ollama)
get_memory - Retrieve conversation history
Optional limit parameter to specify the number of memories to retrieve
Pass limit: null for unlimited memory retrieval
Returns memories in chronological order with timestamps
clear_memory - Clear conversation history
Removes all stored memories
use_provider - Switch between different LLM providers
Supports OpenAI, Anthropic, OpenRouter, and Ollama
Persists provider selection
use_model - Switch to a different model for the current provider
Supports provider-specific models:
Anthropic Claude Models:
Claude 3 Series:
claude-3-haiku: Fastest response times, ideal for tasks like customer support and content moderation
claude-3-sonnet: Balanced performance for general-purpose use
claude-3-opus: Advanced model for complex reasoning and high-performance tasks
Claude 3.5 Series:
claude-3.5-haiku: Enhanced speed and cost-effectiveness
claude-3.5-sonnet: Superior performance with computer interaction capabilities
OpenAI: 'gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo'
OpenRouter: Any model in 'provider/model' format (e.g., 'openai/gpt-4', 'anthropic/claude-2')
Ollama: Any locally available model (e.g., 'llama2', 'codellama')
Persists model selection
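As a sketch of how these tools are invoked, an MCP client sends a tools/call request naming the tool and its arguments; each line below is the name/arguments portion of a separate request. The chat tool's message parameter is documented above, but the provider and model argument names are assumptions, so check the server's tool schemas for the exact keys.

```json
{ "name": "use_provider", "arguments": { "provider": "anthropic" } }
{ "name": "use_model", "arguments": { "model": "claude-3.5-sonnet" } }
{ "name": "chat", "arguments": { "message": "What did we discuss yesterday?" } }
```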
Development
Install dependencies:
Build the server:
For development with auto-rebuild:
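The exact script names are defined in the project's package.json; assuming the standard scripts used by TypeScript MCP servers, the three steps above correspond to:

```sh
npm install      # install dependencies
npm run build    # compile the TypeScript server
npm run watch    # rebuild automatically during development
```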
Installation
To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
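A minimal entry looks like the following; the server key "memgpt" and the build path are placeholders, so adjust them to wherever the built server lives on your machine:

```json
{
  "mcpServers": {
    "memgpt": {
      "command": "node",
      "args": ["/path/to/memgpt-server/build/index.js"]
    }
  }
}
```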
Environment Variables
OPENAI_API_KEY - Your OpenAI API key
ANTHROPIC_API_KEY - Your Anthropic API key
OPENROUTER_API_KEY - Your OpenRouter API key
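These keys can be supplied through the "env" field of the server entry in claude_desktop_config.json, or exported in the shell when running the server directly (for example with the MCP Inspector below); the values here are placeholders:

```sh
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENROUTER_API_KEY="your-openrouter-key"
```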
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector:
The Inspector will provide a URL to access debugging tools in your browser.
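A typical invocation points the Inspector at the built server; the build/index.js path below is an assumption about where the compiled entry point ends up:

```sh
npx @modelcontextprotocol/inspector node build/index.js
```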
Recent Updates
Claude 3 and 3.5 Series Support (March 2024)
Added support for the latest Claude models:
Claude 3 Series (Haiku, Sonnet, Opus)
Claude 3.5 Series (Haiku, Sonnet)
Unlimited Memory Retrieval
Added support for retrieving unlimited conversation history
Use { "limit": null } with the get_memory tool to retrieve all stored memories
Use { "limit": n } to retrieve the n most recent memories
Default limit is 10 if not specified