Uses FastAPI to create a REST API server that handles requests for managing conversation context and interacting with language models.
Integrates with the Google Gemini API to enable context-aware conversations, allowing the system to maintain conversation history across multiple requests.
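To show how these two pieces could fit together, here is a minimal sketch of a FastAPI endpoint that keeps per-conversation history and replays it to Gemini on each request. The route path, the in-memory store, the model name, and the request shape are assumptions chosen for illustration; the actual server's implementation may differ.

```python
from typing import Dict, List

import google.generativeai as genai
from fastapi import FastAPI
from pydantic import BaseModel

# Assumed setup: a Gemini API key and a model name chosen for illustration.
genai.configure(api_key="YOUR_GEMINI_API_KEY")
model = genai.GenerativeModel("gemini-1.5-flash")

app = FastAPI()

# In-memory context store keyed by conversation ID. A real deployment would
# likely persist history elsewhere; this is just the simplest stand-in.
histories: Dict[str, List[dict]] = {}


class ChatRequest(BaseModel):
    conversation_id: str
    message: str


@app.post("/chat")
def chat(request: ChatRequest) -> dict:
    """Send a message to Gemini, replaying any stored history first."""
    history = histories.setdefault(request.conversation_id, [])

    # start_chat(history=...) hands Gemini the prior turns, which is what
    # lets follow-ups like "continue our discussion" work across requests.
    session = model.start_chat(history=history)
    response = session.send_message(request.message)

    # Store both turns so the next request sees the updated context.
    history.append({"role": "user", "parts": [request.message]})
    history.append({"role": "model", "parts": [response.text]})

    return {"conversation_id": request.conversation_id, "reply": response.text}
```

Run with `uvicorn main:app`, then POST to `/chat` twice with the same `conversation_id`: the second reply can reference the first exchange because the stored history is replayed before each call.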
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@MCP: Model Context Protocol continue our discussion about Python decorators from yesterday".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.