# MCP: Model Context Protocol

## Project Description
MCP (Model Context Protocol) is a system for managing context when interacting with language models (LLMs). It saves the context of a dialog between requests, so the language model can "remember" previous interactions.
## Features
- Maintains and manages context between requests
- Integration with the Google Gemini API
- Support for multiple independent sessions
- Switching between sessions by identifier
- Local storage of dialog history
## Technologies
- Python 3
- FastAPI
- Gemini API
- Uvicorn
## Installation
- Clone the repository:
- Create and activate a virtual environment:
- Install dependencies:
- Create a `.env` file and add the API key:
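The commands for these steps are not included in the README; a typical sequence would look like the following (the repository URL is a placeholder, and `GEMINI_API_KEY` is an assumed variable name):

```shell
# Clone the repository (URL is a placeholder)
git clone <repository-url>
cd <repository-directory>

# Create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Create a .env file with the API key
# (the variable name GEMINI_API_KEY is an assumption)
echo "GEMINI_API_KEY=your_api_key_here" > .env
```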
## Usage

### Starting the server
The server will be available at: http://localhost:9999
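The start command itself is not shown in the README; given the FastAPI + Uvicorn stack listed above, a typical invocation would be one of the following (the `app` object name is an assumption):

```shell
# Run the server directly
python mcp_server.py

# Or run it via Uvicorn on port 9999
uvicorn mcp_server:app --host 0.0.0.0 --port 9999
```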
### Using the client
- Send a request (a new session will be created):
- Continue the dialogue in the same session:
- Use a specific session by ID:
- Show a list of all sessions:
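The client commands for these operations are not included in the README; as an illustration only, they might look like the following `mcp_client.py` invocations (all flag names are hypothetical, not the actual CLI):

```shell
# All flags below are assumptions; check mcp_client.py for the real interface.

# Send a request (creates a new session)
python mcp_client.py "Hello, how are you?"

# Continue the dialogue in the same session
python mcp_client.py "What did I just ask?"

# Use a specific session by ID
python mcp_client.py --session <session-id> "Continue our conversation"

# Show a list of all sessions
python mcp_client.py --list-sessions
```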
## Project structure

- `mcp_server.py` - Main MCP server
- `mcp_client.py` - Client for interaction with the server
- `requirements.txt` - Project dependencies
## License
MIT
## Author
Alex Replicator - alex.replicator@gmail.com