# Surfline MCP Server
A Model Context Protocol (MCP) server that provides comprehensive surf forecasts from Surfline's API. Access detailed surf conditions, swell analysis, forecaster insights, tides, and more directly through Claude or any MCP-compatible client.
## Features

### 🌊 Comprehensive Surf Data

- Current conditions for 11 Santa Cruz spots (easily extensible to other regions)
- Detailed swell breakdown (height, period, direction, and power for each swell component)
- 8-hour hourly forecasts showing how conditions evolve
- Expert forecaster observations with AM/PM-specific timing advice
- Wind conditions (speed, direction, offshore/onshore classification)
- Quality ratings (1-5 stars)

### 🌅 Timing Information

- Sunrise, sunset, dawn, and dusk times
- Tide schedule with high/low times and heights
- All times converted to the Pacific timezone
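As a small sketch of the timezone handling (assuming Surfline returns UTC epoch timestamps; the exact field format is an implementation detail), Pacific-time formatting can be done with the standard `Intl` API:

```typescript
// Format a UTC epoch timestamp (in seconds) as a Pacific-time clock reading.
function toPacific(epochSeconds: number): string {
  return new Intl.DateTimeFormat("en-US", {
    timeZone: "America/Los_Angeles",
    hour: "numeric",
    minute: "2-digit",
  }).format(new Date(epochSeconds * 1000));
}

console.log(toPacific(1700000000)); // "2:13 PM" (PST)
```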
### 🔐 Secure Authentication

- Google OAuth integration for secure access
- Works seamlessly with claude.ai on web and mobile
- No Surfline API keys required (uses public endpoints)
## Quick Start

### Prerequisites

- Node.js 18+
- A Cloudflare account (free tier works)
- A Google Cloud project for OAuth (free)

### Installation
1. Clone and install dependencies:

   ```bash
   cd surfline-mcp-server
   npm install
   ```
2. Set up Google OAuth:

   - Go to the Google Cloud Console
   - Create a new OAuth 2.0 Client ID (Web application type)
   - Add your authorized redirect URIs
   - Note your Client ID and Client Secret
3. Create a KV namespace:

   ```bash
   npx wrangler kv namespace create OAUTH_KV
   ```

   Update `wrangler.jsonc` with the returned KV ID.
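   For reference, the resulting binding in `wrangler.jsonc` generally takes this shape, where the `id` value is whatever the command above printed (a sketch, not this repo's exact config):

   ```jsonc
   {
     "kv_namespaces": [
       { "binding": "OAUTH_KV", "id": "<your-kv-namespace-id>" }
     ]
   }
   ```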
4. Set secrets:

   ```bash
   echo 'YOUR_GOOGLE_CLIENT_ID' | npx wrangler secret put GOOGLE_CLIENT_ID
   echo 'YOUR_GOOGLE_CLIENT_SECRET' | npx wrangler secret put GOOGLE_CLIENT_SECRET
   openssl rand -hex 32 | npx wrangler secret put COOKIE_ENCRYPTION_KEY
   ```
5. Deploy:

   ```bash
   npm run deploy
   ```
### Connect to Claude

1. Go to claude.ai
2. Navigate to Settings → Integrations
3. Add your deployed worker URL: `https://your-worker-name.your-subdomain.workers.dev/mcp`
4. Authenticate with Google
5. Ask Claude: "How's the surf in Santa Cruz?"
## Available Tools

### `get_complete_surf_report`

The primary tool - returns everything in one call (see the registration sketch after this list):

- Forecaster notes with expert observations
- Sunrise/sunset times
- Tide schedule
- Current conditions for all spots
- Swell breakdown
- 8-hour forecasts
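As a rough sketch of how a tool like this is exposed over MCP, the standard TypeScript SDK registers a named tool with an input schema and a handler. The `buildCompleteReport` helper and the optional `spot` parameter below are illustrative assumptions, not this repo's actual code (which runs inside a Cloudflare Worker):

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "surfline", version: "1.0.0" });

// Hypothetical helper that would assemble notes, tides, and spot conditions.
async function buildCompleteReport(spot?: string): Promise<string> {
  return `Surf report${spot ? ` for ${spot}` : ""}: ...`;
}

// Register the tool: name, input schema (zod), and a handler that returns
// MCP content blocks.
server.tool(
  "get_complete_surf_report",
  { spot: z.string().optional() }, // illustrative: narrow the report to one spot
  async ({ spot }) => ({
    content: [{ type: "text", text: await buildCompleteReport(spot) }],
  })
);
```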
### Secondary Tools

Individual data fetchers are available if you need specific information:

- `get_surf_forecast` - Basic spot conditions only
- `get_forecaster_notes` - Human observations only
- `get_tides` - Tide information only
- `get_best_spot` - Ranked recommendations (sketched below)
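Purely as an illustration of what `get_best_spot`'s ranking could look like (the actual criteria in this repo may differ): sort by Surfline's quality rating, breaking ties toward lighter wind.

```typescript
interface SpotConditions {
  name: string;
  rating: number;       // Surfline quality rating, 1-5 stars
  windSpeedKts: number; // lighter wind generally means cleaner conditions
}

// Hypothetical ranking: best rating first, lighter wind breaks ties.
function rankSpots(spots: SpotConditions[]): SpotConditions[] {
  return [...spots].sort(
    (a, b) => b.rating - a.rating || a.windSpeedKts - b.windSpeedKts
  );
}
```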
## Spots Covered

- **North County:** Davenport, Waddell Creek, Four Mile, Three Mile
- **Central:** Steamer Lane, Cowells, 26th Ave
- **East Side:** Pleasure Point, Jack's, The Hook
- **South:** Capitola
## Data Source

This server uses Surfline's undocumented public API endpoints - the same ones their website uses. No API key or authentication is required for basic forecast data. The endpoints have been stable for years and are widely used by the surf community.

**Important:** Webcams and premium features are not available through these endpoints.
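For a concrete sense of what these endpoints look like, here is a minimal fetch against the community-documented `kbyg` wave-forecast endpoint. The path and query parameters below reflect unofficial community documentation and may change without notice:

```typescript
// Unofficial, community-documented endpoint - may change without notice.
const spotId = "<spot id from SANTA_CRUZ_SPOTS>";
const url =
  `https://services.surfline.com/kbyg/spots/forecasts/wave` +
  `?spotId=${spotId}&days=1&intervalHours=1`;

const res = await fetch(url);
if (!res.ok) throw new Error(`Surfline request failed: ${res.status}`);

// Each hourly entry carries surf min/max plus individual swell components.
const { data } = await res.json();
console.log(data.wave[0]);
```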
## Extending to Other Regions
To add more spots, edit `src/index.ts` and add to the `SANTA_CRUZ_SPOTS` object:
```typescript
const SANTA_CRUZ_SPOTS: Record<string, string> = {
  "Your Spot Name": "spotIdFromSurfline",
  // ...
};
```
Find spot IDs by inspecting network requests on surfline.com.
## Architecture

- **Cloudflare Workers:** Serverless hosting (free tier: 100k requests/day)
- **Durable Objects:** OAuth state management
- **KV Storage:** Token persistence
- **Google OAuth:** Secure authentication
- **MCP Protocol:** Standard tool interface for AI assistants
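As a sketch of how the pieces above typically wire together in Cloudflare's MCP template (the module names and handler split here are assumptions, not this repo's verbatim layout):

```typescript
import OAuthProvider from "@cloudflare/workers-oauth-provider";
// Assumed local modules: the MCP Durable Object and the Google OAuth flow.
import { SurflineMCP } from "./mcp";
import { GoogleHandler } from "./google-handler";

export default new OAuthProvider({
  apiRoute: "/mcp",                      // token-gated MCP traffic
  apiHandler: SurflineMCP.serve("/mcp"), // handled by the Durable Object
  defaultHandler: GoogleHandler,         // everything else: the OAuth dance
  authorizeEndpoint: "/authorize",
  tokenEndpoint: "/token",
  clientRegistrationEndpoint: "/register",
});
```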
## Development

Run locally:

```bash
npm run dev
```

The server will be available at `http://localhost:8788`.

Test with MCP Inspector:

```bash
npx @modelcontextprotocol/inspector
```
## License

MIT

## Acknowledgments

- Surfline for providing accessible surf forecast data
- Cloudflare for the MCP and OAuth libraries
- The surf community for documenting the API endpoints