1. Click on "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Cocktails RAG MCP Server give me a refreshing gin-based cocktail for a summer evening".
4. That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
# Cocktails RAG MCP Server
MCP tool for cocktail recommendations using RAG (Retrieval-Augmented Generation).
## Requirements

- Python 3.11+
- uv package manager: https://docs.astral.sh/uv/getting-started/installation/
## Quick Start

1. Get a Groq API key (free): https://console.groq.com/keys

2. Set up the project:
```bash
# Clone the repository
git clone https://github.com/00200200/cocktails-rag-mcp.git
cd cocktails-rag-mcp

# Copy the environment template
cp .env.example .env

# Edit .env and add your GROQ_API_KEY
nano .env

# Install dependencies
uv sync
```
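The `.env` file only needs the Groq key. A minimal example, using the same placeholder value as the config shown further below:

```bash
# .env — replace the placeholder with your key from https://console.groq.com/keys
GROQ_API_KEY=your_groq_api_key_here
```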
3. Pre-download the models (required). This fetches the embeddings and reranker models:

```bash
uv run python -c "from src.rag.rag import RAG; RAG(); print('Models downloaded!')"
```

4. Install for Claude Desktop:
### Automatic (Recommended)

```bash
uv run fastmcp install claude-desktop fastmcp.json --name cocktails --env-file .env
```

### Manual
Edit the config file:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
{ "mcpServers": { "cocktails": { "command": "uv", "args": [ "run", "--with","faiss-cpu", "--with","fastmcp", "--with","jq", "--with","langchain", "--with","langchain-community", "--with","langchain-groq", "--with","langchain-huggingface", "--with","pandas", "--with","python-dotenv", "--with","sentence-transformers", "fastmcp", "run", "/ABSOLUTE/PATH/TO/src/mcp/server.py:mcp" ], "env": { "GROQ_API_KEY": "your_groq_api_key_here" } } } }Replace
/ABSOLUTE/PATH/TO/with your project path andGROQ_API_KEYwith your API key.
## Example Usage
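Once installed, just ask Claude in natural language and it will call the server. Illustrative prompts (phrase them however you like):

- "Give me a refreshing gin-based cocktail for a summer evening."
- "What can I make with rum, lime, and mint?"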
## Local Testing
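You can exercise the server without Claude Desktop by connecting FastMCP's client to it in-memory. A minimal sketch, assuming the server object is exported as `mcp` from `src/mcp/server.py` (as the Claude Desktop config above indicates); note that importing the server may initialize the RAG pipeline, so run the model pre-download step first:

```python
# test_local.py — a sketch, not project code; run with: uv run python test_local.py
import asyncio

from fastmcp import Client

from src.mcp.server import mcp  # the FastMCP server instance


async def main() -> None:
    # In-memory transport: no subprocess, no network
    async with Client(mcp) as client:
        tools = await client.list_tools()
        print("Exposed tools:", [tool.name for tool in tools])


asyncio.run(main())
```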
## Development
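For iterating on the server itself, you can run it directly with the same entry point the Claude Desktop config uses; if your FastMCP version ships the `dev` command, that also opens the server in the MCP Inspector:

```bash
# Run the MCP server directly (same entry point as the config above)
uv run fastmcp run src/mcp/server.py:mcp

# Optional: interactive debugging in the MCP Inspector
uv run fastmcp dev src/mcp/server.py:mcp
```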
## Code Formatting
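Assuming Ruff is the formatter (a common choice in uv-managed projects; check `pyproject.toml` for what this repo actually configures):

```bash
# Hypothetical — verify ruff is listed in the dev dependencies first
uv run ruff format .
uv run ruff check --fix .
```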
## Project Structure
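A partial view covering only the paths referenced in this README (the full tree contains more):

```
cocktails-rag-mcp/
├── .env.example        # environment template (GROQ_API_KEY)
├── fastmcp.json        # FastMCP install manifest
└── src/
    ├── mcp/
    │   └── server.py   # FastMCP server, exports `mcp`
    └── rag/
        └── rag.py      # RAG class: embeddings, FAISS index, reranker
```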
## Tech Stack

- **MCP Framework:** FastMCP
- **RAG Framework:** LangChain
- **Embeddings:** BAAI/bge-m3 (local via HuggingFace)
- **Vector DB:** FAISS (local)
- **Reranker:** BAAI/bge-reranker-v2-m3 (local via HuggingFace)
- **LLM:** Groq API (llama-3.1-8b-instant)
- **Package Manager:** uv
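To make the stack concrete, here is a minimal sketch of how these pieces typically compose: embed locally, retrieve from FAISS, rerank locally, then let the Groq-hosted LLM phrase the answer. It is illustrative only (the sample documents and variable names are invented); the project's actual pipeline lives in `src/rag/rag.py`:

```python
# Illustrative RAG pipeline using the stack above — not the project's code.
from langchain_community.vectorstores import FAISS
from langchain_groq import ChatGroq
from langchain_huggingface import HuggingFaceEmbeddings
from sentence_transformers import CrossEncoder

# 1. Embed documents locally with BAAI/bge-m3 and index them in FAISS.
embeddings = HuggingFaceEmbeddings(model_name="BAAI/bge-m3")
docs = [  # invented sample data
    "Negroni: gin, Campari, sweet vermouth; stirred, bitter, strong.",
    "Mojito: white rum, lime, mint, sugar, soda; shaken, refreshing.",
]
index = FAISS.from_texts(docs, embeddings)

# 2. Retrieve candidate recipes for the query.
query = "refreshing gin-based cocktail for a summer evening"
candidates = index.similarity_search(query, k=2)

# 3. Rerank candidates locally with BAAI/bge-reranker-v2-m3.
reranker = CrossEncoder("BAAI/bge-reranker-v2-m3")
scores = reranker.predict([(query, d.page_content) for d in candidates])
best = max(zip(scores, candidates), key=lambda pair: pair[0])[1]

# 4. Ask the Groq-hosted LLM to phrase the recommendation (needs GROQ_API_KEY).
llm = ChatGroq(model="llama-3.1-8b-instant")
reply = llm.invoke(f"Recommend this cocktail for '{query}': {best.page_content}")
print(reply.content)
```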