College Football MCP (cfb-mcp)
A Python-based Model Context Protocol (MCP) server that provides real-time college football game information, betting odds, and historical performance data for teams and players.
Features
Live Game Scores & Odds: Get real-time scores and betting odds for NCAA college football games
Player Statistics: Retrieve last 5 games' stats for any player
Team Performance: Get recent game results and team information
Next Game Odds: Find upcoming games and their betting lines
Quick Start
Prerequisites
Python 3.11+
Docker (optional, for containerized deployment)
API Keys:
The Odds API - for live scores and betting odds
CollegeFootballData API - for team and player statistics
Installation
Clone the repository:
Create a virtual environment:
Install dependencies:
Set up environment variables:
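The four steps above as shell commands — a minimal sketch, assuming Python 3.11 is installed and a requirements.txt sits at the project root; the repository URL is a placeholder:

```bash
# 1. Clone the repository (URL is a placeholder)
git clone https://github.com/yourname/cfb-mcp.git
cd cfb-mcp

# 2. Create a virtual environment
python3.11 -m venv venv
source venv/bin/activate

# 3. Install dependencies (assumes a requirements.txt at the project root)
pip install -r requirements.txt

# 4. Set up environment variables
cp .env.example .env   # then edit .env and add your API keys
```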
Running the Server
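The server runs directly under uvicorn, using the same command shown in the local development section at the end of this README:

```bash
source venv/bin/activate
uvicorn src.server:app --host 0.0.0.0 --port 8000
```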
Docker Deployment
Build the Docker image:
Run the container:
For VPS deployment, pull and run:
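Hedged examples of the three steps; the image tag and registry path are placeholders, and port 8000 is assumed to match the uvicorn command used elsewhere in this README:

```bash
# Build the Docker image (tag is a placeholder)
docker build -t cfb-mcp .

# Run the container, passing in the .env file
docker run -d --name cfb-mcp --env-file .env -p 8000:8000 cfb-mcp

# For VPS deployment, pull a pre-built image from your registry instead
docker pull yourregistry/cfb-mcp:latest
docker run -d --name cfb-mcp --env-file .env -p 8000:8000 yourregistry/cfb-mcp:latest
```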
API Endpoints
The MCP server exposes the following functions via both MCP protocol (/mcp/*) and REST API (/api/*) endpoints:
1. get_game_odds_and_score
Get live game scores and betting odds for a specific matchup.
POST /mcp/get_game_odds_and_score or GET /api/get_game_odds_and_score
Request:
Response:
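A hedged example call against a locally running server on port 8000; the home_team/away_team parameter names and the POST body format are assumptions, so check the handler in src/ for the actual schema. The response contains the live score and current betting odds for the matchup.

```bash
# MCP-style POST with a JSON body (field names are assumptions)
curl -X POST http://localhost:8000/mcp/get_game_odds_and_score \
  -H "Authorization: Bearer $APP_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"home_team": "Georgia", "away_team": "Alabama"}'

# Equivalent REST GET with query parameters
curl -H "Authorization: Bearer $APP_TOKEN" \
  "http://localhost:8000/api/get_game_odds_and_score?home_team=Georgia&away_team=Alabama"
```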
2. get_recent_player_stats
Get a player's statistics from their last 5 games.
POST /mcp/get_recent_player_stats or GET /api/get_recent_player_stats
Request:
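A hedged example; the player parameter name is an assumption (check the handler in src/):

```bash
curl -H "Authorization: Bearer $APP_TOKEN" \
  "http://localhost:8000/api/get_recent_player_stats?player=Quinn+Ewers"
```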
3. get_team_recent_results
Get team's last 5 game results.
POST /mcp/get_team_recent_results or GET /api/get_team_recent_results
Request:
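Hedged example, assuming a team query parameter:

```bash
curl -H "Authorization: Bearer $APP_TOKEN" \
  "http://localhost:8000/api/get_team_recent_results?team=Georgia"
```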
4. get_team_info
Get team's current season overview including record and rankings.
POST /mcp/get_team_info or GET /api/get_team_info
Request:
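Hedged example, again assuming a team parameter:

```bash
curl -H "Authorization: Bearer $APP_TOKEN" \
  "http://localhost:8000/api/get_team_info?team=Georgia"
```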
5. get_next_game_odds
Get next scheduled game and betting odds for a team.
POST /mcp/get_next_game_odds or GET /api/get_next_game_odds
Request:
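Hedged example with an assumed team parameter:

```bash
curl -H "Authorization: Bearer $APP_TOKEN" \
  "http://localhost:8000/api/get_next_game_odds?team=Georgia"
```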
Architecture
Phase 1: MCP Server (Completed ✅)
FastAPI-based REST API server
5 core functions for college football data
Integration with The Odds API and CollegeFootballData API
Phase 2: Full Web Application (In Progress)
Agent Service: FastAPI chat service with LLM integration
Web UI: Single-page chat interface (mobile-friendly)
Caddy: Reverse proxy with automatic HTTPS
Docker Compose: Multi-container orchestration
Development
This project follows the Model Context Protocol standard for AI agent integration.
Project Structure
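A sketch of the layout, reconstructed from the paths mentioned elsewhere in this README (exact contents may differ):

```
cfb-mcp/
├── src/                 # MCP server (FastAPI), exposes the 5 core functions
├── agent_service/       # Chat agent service (FastAPI + LLM) with its own requirements.txt
├── web_ui/              # Single-page chat UI (index.html)
├── Caddyfile            # Reverse proxy / HTTPS configuration
├── docker-compose.yml   # Multi-container orchestration
├── Dockerfile           # MCP server image (assumed from the docker build step)
├── requirements.txt     # MCP server dependencies (assumed)
└── .env.example         # Template for required environment variables
```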
Environment Variables
Create a .env file in the project root with the following variables:
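A sketch of the expected .env contents, based on the variable names referenced in the Troubleshooting section; the CollegeFootballData key name and the MCP server URL value are assumptions, so defer to .env.example:

```bash
ODDS_API_KEY=your_odds_api_key          # The Odds API (live scores and betting odds)
CFBD_API_KEY=your_cfbd_api_key          # CollegeFootballData API (variable name is an assumption)
OPENAI_API_KEY=sk-...                   # Used by the agent service
APP_TOKEN=choose_a_random_secret        # Bearer token required in the Authorization header
MCP_SERVER_URL=http://mcp-server:8000   # How the agent service reaches the MCP server (value is an assumption)
```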
Getting API Keys
The Odds API: Sign up at the-odds-api.com
CollegeFootballData API: Get a free key at collegefootballdata.com
OpenAI API: Get your key from platform.openai.com
Architecture Overview
System Components
Component Details
MCP Server (src/): FastAPI server exposing 5 core functions for college football data
Agent Service (agent_service/): FastAPI service that orchestrates LLM + MCP calls
Web UI (web_ui/): Single-page chat interface (mobile-friendly)
Caddy: Reverse proxy providing HTTPS and routing
Deployment
Quick Start with Docker Compose
Set up environment variables:
cp .env.example .env   # then edit .env and add your API keys
Update the Caddyfile: edit Caddyfile and replace cfb.yourdomain.com with your actual domain.
Deploy: docker compose up -d --build
Access:
Web UI: https://cfb.yourdomain.com
API: https://cfb.yourdomain.com/api/*
Domain Setup for Caddy
Point your domain to your VPS IP address (A record)
Ensure ports are open:
Port 80 (HTTP)
Port 443 (HTTPS)
Caddy will automatically:
Obtain SSL certificate from Let's Encrypt
Renew certificates automatically
Handle HTTPS redirects
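A minimal Caddyfile sketch consistent with the routing described above; the upstream service names and ports are assumptions (and the UI may instead be served as static files), so treat the Caddyfile shipped with the repository as the source of truth:

```
cfb.yourdomain.com {
    # API traffic goes to the backend (service name and port are assumptions)
    handle /api/* {
        reverse_proxy agent:8001
    }

    # Everything else serves the chat UI (service name and port are assumptions)
    handle {
        reverse_proxy web_ui:80
    }
}
```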
Individual Service Deployment
MCP Server Only
Agent Service Only
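Either service can be brought up on its own with Docker Compose; the service names below are assumptions, so check docker-compose.yml for the actual names:

```bash
# MCP server only
docker compose up -d --build mcp-server

# Agent service only
docker compose up -d --build agent
```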
Troubleshooting
Common Issues
"Missing Bearer token" error:
Ensure APP_TOKEN is set in .env
Check that the token is being sent in the Authorization header
"ODDS_API_KEY is not configured":
Verify the .env file exists and contains ODDS_API_KEY
Check that the MCP server container has access to the .env file
Caddy certificate issues:
Ensure domain DNS points to your server
Check that ports 80 and 443 are open
Verify Caddyfile domain matches your actual domain
Agent service can't reach MCP server:
Check the MCP_SERVER_URL environment variable
Verify both services are on the same Docker network
Check service names in docker-compose.yml
OpenAI API errors:
Verify OPENAI_API_KEY is set correctly
Check that the API key has sufficient credits
Review OpenAI API rate limits
Logs
View logs for all services:
View logs for a specific service:
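With Docker Compose (the service name below is an assumption; use the names from docker-compose.yml):

```bash
# All services, following output
docker compose logs -f

# A single service, e.g. the MCP server
docker compose logs -f mcp-server
```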
Development
Running Locally (without Docker)
MCP Server:
cd /path/to/cfb-mcp
source venv/bin/activate
uvicorn src.server:app --host 0.0.0.0 --port 8000

Agent Service:
cd agent_service
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
uvicorn main:app --host 0.0.0.0 --port 8001

Web UI:
Serve with any static file server, or use nginx locally
Update API_BASE in index.html to point to the agent service
License
MIT