# College Football MCP (cfb-mcp) – Product Requirements

## Introduction

The **College Football MCP (cfb-mcp)** will be a Python-based server that enables an AI assistant to retrieve real-time game information, betting odds, and historical performance data for college football teams and players. It will integrate two external data sources: **The Odds API** for live scores and betting odds, and the **CollegeFootballData API (CFBD)** for detailed player/team stats and historical data. The server will expose up to **5 key functions** (endpoints) that the AI agent can call, focusing on the most impactful capabilities for a sports/betting assistant. The entire service will be containerized (Docker) and published on GitHub for easy deployment and collaboration.

**Goals:**

- Provide **real-time game updates** (scores and odds) for NCAA college football games via an AI-accessible interface.
- Enable retrieval of **player and team statistics** (e.g. last 5 game performances, recent team results) to support insights and trend analysis.
- Maintain a **simple, limited function set (≤5 functions)** to allow rapid development (aiming to build an MVP in ~1 hour) while covering the most essential features.
- Ensure the server is easily deployable (Docker) and integrable with AI agent platforms (following the Model Context Protocol standard).

## Key Features & Functional Requirements

The following five functions are identified as the most impactful for our application. Each function corresponds to an MCP endpoint that the AI client can invoke. They are designed to cover the core use cases: fetching game odds and scores, player stats, and team performance data.

1. **`get_game_odds_and_score`** – *Live Game Info & Odds*. Given a specific game (identified by teams or a game ID), this function retrieves the current **live score** (if the game is in progress or just ended) and the latest **betting odds** for that matchup. It will use **The Odds API** to get real-time odds (moneyline, point spread, totals, etc.) and game status. If the game hasn't started yet, it returns the pre-game odds; if the game is live or just finished, it returns the live score (with quarter/time) and updated odds. This is critical for answering questions like *"What's the score of the Alabama game and the current spread?"*

2. **`get_recent_player_stats`** – *Player Last 5 Games Stats*. Given a player name (and possibly a team or other identifier for disambiguation), this function pulls that player's statistics from the **last five games**. It leverages the CollegeFootballData API v2. For example, for a quarterback it might retrieve passing yards, touchdowns, etc. in each of the last 5 games; for a running back, rushing yards and scores per game; and so on. This function addresses queries like *"How has Player X performed in the last few games?"*

3. **`get_team_recent_results`** – *Team Recent Performance*. Given a team name, this function retrieves the **last five game results** for that team, including the opponents, dates, final scores, and win/loss outcomes. It uses the CFBD API to get game results and team records. This data will allow the AI to identify trends such as winning streaks or recent performance (e.g. "Team Y won 4 of their last 5 games, with scores..."). By analyzing this output, the AI agent can answer questions about a team's momentum or form (e.g., *"How has Team Y been doing lately?"*).

4. **`get_team_info`** – *Team Info and Season Stats*. This function provides an overview of a given team's current season, including their **overall record** (wins-losses), current **rankings** if available, and possibly key team statistics or next-game info. The CFBD API provides team season records and rankings data (e.g. AP/Coaches Poll or CFP rankings) which we can use. For example, if a user asks *"Tell me about Team Z this year,"* the AI can call this function to get "Team Z is 8-2 this season, currently ranked #5 in the AP Poll, averaging 35.2 points per game," etc.

5. **`get_next_game_odds`** – *Next Game & Odds for a Team*. Given a team name, this function finds the team's **next scheduled game** (upcoming opponent and date) and returns the **betting odds** for that matchup. This is a high-impact feature for users interested in future games and betting lines. The output will include details like "Team A's next game is vs Team B on 2025-09-10, and Team A is favored by 7.5 points (odds: -110)." This answers questions such as *"Who does Team A play next and what are the odds?"*

**Note:** All these functions will require **API keys** for the external services. We already have an API key for The Odds API (for odds and scores), and we will obtain a key for the CollegeFootballData API (required for team/player data). The keys will be stored in a secure manner (e.g., environment variables in a `.env` file, not hard-coded).
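As a minimal sketch of how the server could load those keys at startup (assuming python-dotenv and the `ODDS_API_KEY`/`CFB_API_KEY` names used throughout this document):

```python
# Minimal key-loading sketch, assuming python-dotenv and a .env file in the
# project root; the variable names match the ones used elsewhere in this document.
import os

from dotenv import load_dotenv

load_dotenv()  # populate os.environ from .env (no-op if the file is absent)

ODDS_API_KEY = os.getenv("ODDS_API_KEY")
CFB_API_KEY = os.getenv("CFB_API_KEY")

if not ODDS_API_KEY or not CFB_API_KEY:
    raise RuntimeError("Missing ODDS_API_KEY or CFB_API_KEY; check the .env file")
```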
## Non-Functional Requirements

- **Performance:** The server should handle requests quickly (each function call will typically involve an external API call; we should aim for efficient use of those APIs and perhaps minimal caching for repeated queries within the same session).
- **Rate Limits & Reliability:** We must respect API rate limits. The implementation should be mindful of this – e.g., not polling excessively. In case of API failures or timeouts, the MCP server should handle errors gracefully (returning a meaningful error message to the AI agent).
- **Security:** API keys will be kept out of source code (use environment variables and Docker secrets).
- **Compatibility:** The MCP server will adhere to the **Model Context Protocol** interface standards so that it can be easily integrated with AI assistant platforms.
- **Dockerization:** We will provide a Dockerfile and container configuration. The container should be lightweight (base Python image) and expose the necessary port (likely the default 8000 for the MCP server).
- **GitHub Repository & Code Quality:** The code will be organized and documented in a GitHub repo (named `cfb-mcp`). We will include a README with usage instructions. Given the short build time, we will focus on core functionality, but we aim to follow good coding practices (clear function structure, logging of key events, etc.).

## Implementation Plan (MVP in ~1 Hour)

**1. Project Setup:**
- Initialize GitHub repository `cfb-mcp`
- Set up basic Python project structure
- Create Python virtual environment
- Install dependencies (requests, python-dotenv, MCP framework)
- Set up MCP server framework (FastAPI or similar)

**2. API Key Configuration:**
- Add `.env` file to project (ensure it's in .gitignore)
- Populate with placeholders for `ODDS_API_KEY` and `CFB_API_KEY`
- Load keys via environment variables at startup

**3. Implement `get_game_odds_and_score`:**
- Call The Odds API endpoint for NCAA football odds
- Parse JSON to find the specific game by team names
- Extract: teams, start time/status, score (if live), odds (spread, moneyline, over/under)
- Format as JSON response
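A hedged sketch of the Odds API call in step 3, assuming The Odds API v4 layout (the `americanfootball_ncaaf` sport key and the `apiKey`, `regions`, and `markets` query parameters); the exact paths and response fields (`home_team`, `away_team`, `bookmakers`) should be verified against the current API documentation:

```python
# Hedged sketch of step 3: fetch NCAAF odds from The Odds API and pick out one game.
# Endpoint path, sport key, and parameter/field names are assumptions based on the
# v4 docs; verify against the current documentation before relying on them.
import os

import requests

ODDS_API_BASE = "https://api.the-odds-api.com/v4"


def get_game_odds_and_score(team_name: str) -> dict:
    """Return the first upcoming/live NCAAF game whose teams match `team_name`."""
    resp = requests.get(
        f"{ODDS_API_BASE}/sports/americanfootball_ncaaf/odds",
        params={
            "apiKey": os.getenv("ODDS_API_KEY"),
            "regions": "us",
            "markets": "h2h,spreads,totals",
            "oddsFormat": "american",
        },
        timeout=10,
    )
    resp.raise_for_status()

    for game in resp.json():
        if team_name.lower() in (game["home_team"].lower(), game["away_team"].lower()):
            return {
                "home_team": game["home_team"],
                "away_team": game["away_team"],
                "commence_time": game["commence_time"],
                "bookmakers": game.get("bookmakers", []),
            }
    return {"error": f"No NCAAF game found for '{team_name}'"}
```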
**4. Implement `get_recent_player_stats`:**
- Use CFBD API to get player stats for recent games
- Query last 5 games by player
- Format output with game dates, opponents, key stats per game

**5. Implement `get_team_recent_results`:**
- Call CFBD games endpoint for team's recent games
- Get last 5 games with opponents, dates, scores, W/L outcomes
- Format as structured JSON

**6. Implement `get_team_info`:**
- Use CFBD records and rankings endpoints
- Get team's W-L record, conference record, current ranking
- Format output with season summary

**7. Implement `get_next_game_odds`:**
- Query The Odds API for upcoming games
- Find team's next scheduled game
- Extract matchup, date, and betting odds
- Format response

**8. MCP Server Integration:**
- Tie all functions into MCP server routing (see the sketch after this plan)
- Register functions as MCP endpoints
- Handle JSON requests/responses
- Verify server runs and endpoints are recognized

**9. Testing & Debugging:**
- Test each function with known data
- Fix immediate bugs
- Verify no crashes and responses are structured correctly

**10. Dockerization:**
- Write Dockerfile (Python 3.11-slim base)
- Copy project files, install dependencies
- Expose port 8000
- Build and test container

**11. Documentation & GitHub Push:**
- Create README.md with setup and usage instructions
- Include Docker usage examples
- Commit code to GitHub (ensure .env is in .gitignore)

**12. Quick Polish:**
- Refine output formatting if time permits
- Add code comments
- Review function names and behaviors
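One possible shape for the server integration in step 8, assuming plain FastAPI routes on port 8000 (matching the architecture section below); the `cfb_functions` module and `TeamQuery` model are hypothetical placeholders, and an MCP SDK's tool registration could be used instead of raw REST routes:

```python
# Hedged sketch of step 8: one POST route per function, JSON in / JSON out.
# `cfb_functions` is a hypothetical module holding the five data functions.
from fastapi import FastAPI
from pydantic import BaseModel

from cfb_functions import get_game_odds_and_score  # hypothetical import

app = FastAPI(title="cfb-mcp")


class TeamQuery(BaseModel):
    team: str  # team name as the user would phrase it, e.g. "Alabama"


@app.post("/get_game_odds_and_score")
def game_odds_and_score(query: TeamQuery) -> dict:
    # Delegate to the underlying data function and return a JSON-serializable dict.
    return get_game_odds_and_score(query.team)


# The other four functions follow the same pattern, each with its own route.
# Run with: uvicorn server:app --host 0.0.0.0 --port 8000
```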
## Extended Implementation Plan (Phase 2)

**13. Agent Service Setup:**
- Create `agent_service/` directory structure
- Implement FastAPI skeleton with `/chat` endpoint
- Add CORS middleware
- Implement bearer token authentication
- Add thread ID management

**14. Agent Service - LLM Integration:**
- Integrate OpenAI SDK
- Implement tool-calling loop (see the sketch at the end of this document)
- Create MCP client wrapper to call MCP server functions
- Map user queries to appropriate MCP function calls
- Format LLM responses with tool results

**15. Web UI Implementation:**
- Create `web_ui/` directory
- Implement single-page HTML chat interface
- Add JavaScript for API communication
- Implement localStorage for token/thread management
- Style for mobile-friendly use
- Add "Add to Home Screen" support

**16. Caddy Configuration:**
- Create Caddyfile with domain routing
- Configure HTTPS (Let's Encrypt)
- Set up reverse proxy rules
- Test routing configuration

**17. Docker Compose Setup:**
- Create docker-compose.yml
- Configure all services (mcp-server, agent-service, web-ui, caddy)
- Set up environment variables
- Configure volumes and networking
- Test multi-container deployment

**18. Integration Testing:**
- Test end-to-end flow: Web UI → Agent → MCP Server
- Verify HTTPS and routing
- Test authentication
- Test thread persistence
- Mobile browser testing

**19. Documentation & Deployment:**
- Update README with full architecture
- Document environment variables
- Add deployment instructions
- Document domain setup for Caddy
- Add troubleshooting guide

## Extended Architecture (Phase 2)

The project has been extended to include a complete web application with a chat interface:

### System Components

1. **MCP Server (Backend Service)** - The college football data API server (already implemented)
   - Exposes 5 core functions for game data, odds, player/team stats
   - Runs on port 8000
   - FastAPI-based REST API

2. **Agent Service** - FastAPI chat service that orchestrates LLM + MCP calls
   - `POST /chat` endpoint for user messages
   - Integrates with LLM (OpenAI) for tool-calling
   - Calls MCP server functions based on user queries
   - Thread-based conversation management
   - Bearer token authentication

3. **Web UI** - Static HTML/JS chat interface
   - Single-page application with chat interface
   - Mobile-friendly (can be added to home screen)
   - Calls agent-service API
   - LocalStorage for token and thread management

4. **Caddy Reverse Proxy** - HTTPS and routing
   - Automatic HTTPS via Let's Encrypt
   - Routes `/api/*` to agent-service
   - Routes root to web UI
   - Single domain deployment

5. **Docker Compose** - Orchestration
   - Multi-container setup
   - Environment variable management
   - Volume management for Caddy

### Deployment Architecture

```
User → Caddy (HTTPS) → {
  /api/*  → Agent Service → MCP Server + LLM
  /*      → Web UI (static)
}
```

## Technology Stack

- **Backend (MCP Server):** Python 3.11+, FastAPI, requests, python-dotenv
- **Agent Service:** Python 3.11+, FastAPI, OpenAI SDK
- **Frontend:** HTML5, Vanilla JavaScript, CSS
- **Reverse Proxy:** Caddy 2
- **Containerization:** Docker, Docker Compose
- **APIs:** The Odds API, CollegeFootballData API (CFBD), OpenAI API
- **Protocol:** Model Context Protocol (MCP), REST API

## Success Criteria

### Phase 1 (MCP Server) - COMPLETED

- ✅ All 5 functions implemented and working
- ✅ MCP server runs without errors
- ✅ Functions return properly formatted JSON responses
- ✅ Docker container builds and runs successfully

### Phase 2 (Full Web App)

- Agent service integrates LLM with MCP server
- Web UI provides chat interface
- Caddy provides HTTPS and routing
- Docker Compose orchestrates all services
- Application accessible via web browser and mobile (home screen)
- Bearer token authentication working
- Thread-based conversations maintained
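To make the Agent Service's tool-calling loop (component 2, step 14) concrete, here is a hedged sketch assuming the OpenAI Python SDK (v1+) and the REST-style MCP routes sketched earlier; the model name, `MCP_BASE_URL`, and the single registered tool are illustrative placeholders only:

```python
# Hedged sketch of the agent service's tool-calling loop. Assumes the OpenAI
# Python SDK (>=1.0) and one illustrative tool that proxies the MCP server's
# get_team_info route; MCP_BASE_URL and the model name are placeholders.
import json
import os

import requests
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
MCP_BASE_URL = os.getenv("MCP_BASE_URL", "http://mcp-server:8000")

TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_team_info",
            "description": "Season record, ranking and key stats for a college football team.",
            "parameters": {
                "type": "object",
                "properties": {"team": {"type": "string"}},
                "required": ["team"],
            },
        },
    }
]


def call_mcp(name: str, args: dict) -> dict:
    """Forward a tool call to the MCP server's matching REST endpoint."""
    resp = requests.post(f"{MCP_BASE_URL}/{name}", json=args, timeout=15)
    resp.raise_for_status()
    return resp.json()


def chat(messages: list) -> str:
    """Let the LLM call tools until it produces a final text answer."""
    while True:
        completion = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=messages,
            tools=TOOLS,
        )
        msg = completion.choices[0].message
        if not msg.tool_calls:
            return msg.content
        messages.append(msg)
        for tool_call in msg.tool_calls:
            result = call_mcp(tool_call.function.name, json.loads(tool_call.function.arguments))
            messages.append(
                {"role": "tool", "tool_call_id": tool_call.id, "content": json.dumps(result)}
            )
```

The production service described above would additionally enforce bearer token authentication, persist thread history, and register tool definitions for all five MCP functions.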
