# FlashCardsMCP

This is a dockerized Python Model Context Protocol (MCP) server for managing flash card projects. It uses OpenAI embeddings and SQLite for semantic search and storage.

## Features

- List all project names and ids
- Semantic search for a project by name (using OpenAI embeddings)
- Get a random flash card by project id
- Add a flash card to a project (with question, answer, optional hint, optional description)
- List all flash cards in a project
- Semantic search for flash cards by query (using OpenAI embeddings)
- Global semantic search for cards across all projects
- Retrieve a card by its id
- All API/tool responses include a `type` field: `project` or `card`
- No binary embedding data is ever returned in API responses

## API/Tool Design

- All tools raise `ValueError` for not-found or empty results
- Project and card creation tools return the full object, not just the id
- See `.github/copilot-instructions.md` for code generation rules

## Getting Started

1. **Install dependencies:**

   ```sh
   pip install -r requirements.txt
   ```

2. **Run the server:**

   ```sh
   python main.py
   ```

3. **Run with Docker:**

   ```sh
   docker build -t flash-card-mcp .
   # Run with database persistence (recommended):
   docker run -v $(pwd)/storage:/app/storage flash-card-mcp
   ```

## Environment Variables

- **OPENAI_API_KEY**: Required. Set this environment variable to your OpenAI API key to enable embedding generation. Example:

  ```sh
  export OPENAI_API_KEY=sk-...your-key...
  ```

  You must set this variable before running the server or the Docker container.

## Usage

This server exposes its API via the Model Context Protocol (MCP) using FastMCP. You can call the following tools:

- `get_all_projects()` → List all projects
- `add_project(name)` → Create a new project (returns full project dict)
- `search_project_by_name(name)` → Semantic search for a project (returns full project dict)
- `get_random_card_by_project(project_id)` → Get a random card from a project
- `add_card(project_id, question, answer, hint=None, description=None)` → Add a card (returns full card dict)
- `get_all_cards_by_project(project_id)` → List all cards in a project
- `search_cards_by_embedding(project_id, query)` → Semantic search for cards in a project
- `global_search_cards_by_embedding(query)` → Semantic search for cards across all projects
- `get_card_by_id(card_id)` → Retrieve a card by its id

All returned objects include a `type` field and never include binary embedding data. Illustrative sketches of these conventions appear at the end of this README.

## Development

- All project and card data is stored in SQLite (`database.db`)
- Embeddings are generated using OpenAI's `text-embedding-ada-002` model
- The server is implemented in `main.py` and `db.py`
- See `.github/copilot-instructions.md` for code and API rules

### Inspector

```sh
npx @modelcontextprotocol/inspector docker 'run -e OPENAI_API_KEY=sk-...your-key... -v /<path>/storage:/app/storage --rm -i flash-card-mcp'
```

---

For more details, see the code and docstrings in `main.py` and `db.py`.
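
As a rough orientation for the conventions described in the API/Tool Design and Usage sections, the sketch below shows what a FastMCP tool written to those rules could look like. It is not the repository's actual code: the `db.fetch_card` helper, the `card_id` type, and the stored field names are assumptions made for illustration only.

```python
# Illustrative sketch only -- not the implementation in main.py / db.py.
from mcp.server.fastmcp import FastMCP

import db  # db.py exists in this repo; fetch_card() below is a hypothetical helper

mcp = FastMCP("FlashCardsMCP")


@mcp.tool()
def get_card_by_id(card_id: str) -> dict:
    """Retrieve a card by its id (card_id assumed to be a string here)."""
    card = db.fetch_card(card_id)  # hypothetical: returns a dict or None
    if card is None:
        # Not-found results are reported by raising ValueError.
        raise ValueError(f"Card {card_id} not found")
    # Responses carry a `type` field and never include binary embedding data.
    card.pop("embedding", None)
    return {**card, "type": "card"}


if __name__ == "__main__":
    mcp.run()
```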
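
The Development section notes that embeddings come from OpenAI's `text-embedding-ada-002` model and that card data lives in SQLite. The following is a minimal sketch of how such a semantic search could be wired together; the `cards` table layout, the JSON encoding of embeddings, and the function names are assumptions for illustration, not the schema used by `db.py`.

```python
# Illustrative sketch only -- assumes a hypothetical cards(id, question, answer, embedding)
# table where `embedding` is a JSON-encoded list of floats.
import json
import math
import sqlite3

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def embed(text: str) -> list[float]:
    """Generate an embedding with the model named in the Development section."""
    response = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return response.data[0].embedding


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def search_cards(db_path: str, query: str, limit: int = 5) -> list[dict]:
    """Rank stored cards against the query embedding (brute-force scan)."""
    query_vec = embed(query)
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT id, question, answer, embedding FROM cards"
        ).fetchall()
    scored = [
        (cosine_similarity(query_vec, json.loads(embedding)), card_id, question, answer)
        for card_id, question, answer, embedding in rows
    ]
    scored.sort(reverse=True)
    # Embedding vectors are dropped from the response, per the API design above.
    return [
        {"type": "card", "id": card_id, "question": question, "answer": answer}
        for _, card_id, question, answer in scored[:limit]
    ]
```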
