Rampify MCP Server - Claude SEO Checker
Turn Claude Code into a powerful SEO checker. Real-time site audits, Google Search Console integration, and AI-powered recommendations directly in your editor (Cursor, Claude Code).
What is a Claude SEO Checker?
A Claude SEO checker is an MCP (Model Context Protocol) server that adds SEO validation capabilities to Claude Code and Cursor. Unlike AI rank trackers that check if your site appears in Claude's AI responses, Rampify analyzes your website's technical SEO, validates meta tags, detects issues, and provides actionable fix recommendations—all from your terminal or IDE.
Not an AI rank tracker. This is a developer tool that brings SEO intelligence into your coding workflow.
Claude SEO Checker vs AI Rank Trackers
| Feature | Claude SEO Checker (Rampify) | AI Rank Trackers |
| --- | --- | --- |
| What it checks | Your website's SEO (meta tags, schema, performance) | Whether your site appears in Claude AI responses |
| Use case | Fix SEO issues before deployment | Track AI visibility |
| Where it works | Your IDE (Claude Code, Cursor) | A separate dashboard |
| Target audience | Developers building sites | Marketers tracking AI rank |
| Data source | Your site + Google Search Console | Claude AI responses |
Bring Google Search Console data, SEO insights, and AI-powered recommendations directly into your editor. No context switching, no delays.
Why Rampify?
Real-time SEO intelligence in your editor (Cursor, Claude Code)
Google Search Console integration - See clicks, impressions, rankings
Content strategy insights - Discover what to write next based on real search data
AI-powered recommendations - Fix issues with one command
Pre-deployment checks - Catch SEO issues before they go live
Zero context switching - Stay in your workflow
Installation
Prerequisites
Node.js 18 or higher
Rampify account (free to start)
Install via npm
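A typical global install looks like this (the npm package name `rampify-mcp` is assumed from the command name below; check the Rampify docs for the published package):

```bash
# Install globally so the rampify-mcp command is on your PATH
npm install -g rampify-mcp
```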
The global installation makes the rampify-mcp command available system-wide.
Usage
Get Your API Key
Before configuring the MCP server, get your API key:
Sign up for Rampify (free to start)
Go to your Rampify dashboard
Navigate to Settings → API Keys
Click "Generate New Key"
Copy the key (starts with `sk_live_...`)
Use it in the configuration below
Quick Setup for a Project (Claude CLI)
Recommended: Configure the MCP server per project so each project knows its domain:
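A minimal sketch using the Claude CLI, assuming the globally installed `rampify-mcp` binary (replace the env values with your own):

```bash
# Register the server for the current project; claude mcp add defaults to local scope
claude mcp add rampify \
  -e BACKEND_API_URL=https://www.rampify.dev \
  -e API_KEY=sk_live_your_key \
  -e SEO_CLIENT_DOMAIN=yoursite.com \
  -- rampify-mcp
```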
Now you can use MCP tools without specifying a domain:
- `get_page_seo` - Automatically uses your project's domain
- `get_issues` - Automatically uses your project's domain
- `crawl_site` - Automatically uses your project's domain
Global Setup (Claude CLI)
For global access across all projects (must specify domain in each request):
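The same command with user scope, again assuming the global `rampify-mcp` binary (note that `SEO_CLIENT_DOMAIN` is omitted, so each request must name its domain):

```bash
claude mcp add rampify --scope user \
  -e BACKEND_API_URL=https://www.rampify.dev \
  -e API_KEY=sk_live_your_key \
  -- rampify-mcp
```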
Manual Configuration (Cursor)
Add to your Cursor settings UI or ~/.cursor/config.json:
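A sketch of the common `mcpServers` shape (the env keys follow the Configuration Options below; the values are placeholders):

```json
{
  "mcpServers": {
    "rampify": {
      "command": "rampify-mcp",
      "env": {
        "BACKEND_API_URL": "https://www.rampify.dev",
        "API_KEY": "sk_live_your_key",
        "SEO_CLIENT_DOMAIN": "yoursite.com"
      }
    }
  }
}
```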
Manual Configuration (Claude Code)
Add to your Claude Code MCP settings:
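The same shape works in a project-level `.mcp.json` for Claude Code (values are placeholders):

```json
{
  "mcpServers": {
    "rampify": {
      "command": "rampify-mcp",
      "env": {
        "BACKEND_API_URL": "https://www.rampify.dev",
        "API_KEY": "sk_live_your_key",
        "SEO_CLIENT_DOMAIN": "yoursite.com"
      }
    }
  }
}
```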
Configuration Options
Environment Variables
- `BACKEND_API_URL` (required): Rampify API endpoint - always use `https://www.rampify.dev`
- `API_KEY` (required): Your API key from the Rampify dashboard (starts with `sk_live_...`)
- `SEO_CLIENT_DOMAIN` (optional): Default domain for this project (e.g., `yoursite.com`)
- `CACHE_TTL` (optional): Cache duration in seconds (default: 3600)
- `LOG_LEVEL` (optional): `debug`, `info`, `warn`, or `error` (default: `info`)
How to Use Tools
Discovering Available Tools
Ask Claude directly:
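Any phrasing along these lines works:

```
What Rampify SEO tools do you have available?
```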
Claude will show you all available tools with descriptions.
Natural Language vs Direct Calls
Recommended: Use natural language (Claude will pick the right tool)
Alternative: Call tools directly (if you know the exact name)
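For instance (the prompt is free-form; the direct-call syntax here is illustrative and depends on your client):

```
Natural language: "Check the SEO of /blog/my-post on mysite.com"
Direct call:      get_page_seo(domain: "mysite.com", url_path: "/blog/my-post")
```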
Common Workflows
After Deployment:
Before Deployment:
Regular Monitoring:
Content Planning:
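Illustrative prompts for each workflow (exact phrasing is flexible):

```
After deployment:    "Crawl mysite.com and check for new SEO issues"
Before deployment:   "Check SEO of localhost:3000/blog/draft-post"
Regular monitoring:  "What's my site's health score this week?"
Content planning:    "Show me GSC insights for the last 28 days"
```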
Available Tools
Quick Reference
| Tool | Purpose | When to Use |
| --- | --- | --- |
| `get_page_seo` | Get SEO data for a specific page | Analyzing individual pages, checking performance |
| `get_issues` | Get all SEO issues with health score | Site-wide audits, finding problems |
| `get_gsc_insights` | Get GSC performance data with content recommendations | Discovering what to write next, finding ranking opportunities |
| `generate_meta` | Generate optimized meta tags | Fixing title/description issues, improving CTR |
| `generate_schema` | Auto-generate structured data | Adding schema.org JSON-LD to pages |
| `crawl_site` | Trigger fresh crawl | After deployments, to refresh data |
1. get_page_seo
Get comprehensive SEO data and insights for a specific page. Works with both production sites AND local dev servers!
Parameters:
- `domain` (optional): Site domain (e.g., "example.com" or "localhost:3000"). Uses the `SEO_CLIENT_DOMAIN` env var if not provided.
- `url_path` (optional): Page URL path (e.g., "/blog/post")
- `file_path` (optional): Local file path (will be resolved to a URL)
- `content` (optional): Current file content
Examples:
Production Site:
Local Development Server:
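Illustrative prompts:

```
Production:  "Check SEO for example.com/blog/post"
Local dev:   "Check SEO of localhost:3000/blog/draft-post"
```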
Response includes:
- Source indicator: `production_database`, `local_dev_server`, or `direct_content`
- Fetched from: the exact URL that was analyzed
- Performance metrics (clicks, impressions, position, CTR) - only for production
- Top keywords ranking for this page - only for production
- Detected SEO issues with fixes - works for both local and production
- Quick win opportunities
- AI summary and recommendations
Local Development Workflow
Test pages BEFORE deployment:
1. Start your dev server: `npm run dev` (usually runs on localhost:3000)
2. Query local pages. Ask Claude: "Check SEO of localhost:3000/blog/draft-post"
3. Fix issues in your editor, then re-check. Ask Claude: "Re-check SEO for this page on localhost"
4. Deploy when clean!
What gets analyzed locally:
Title tags
Meta descriptions
Heading structure (H1, H2, H3)
Images and alt text
Schema.org structured data
Internal/external links
Search performance (not available for local - GSC data only exists for production)
Response format:
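A sketch of the response shape; `source` and `fetched_from` are documented above, the remaining field names are illustrative:

```jsonc
{
  "source": "local_dev_server",        // or "production_database", "direct_content"
  "fetched_from": "http://localhost:3000/blog/draft-post",
  "performance": {},                   // clicks, impressions, position, CTR (production only)
  "top_keywords": [],                  // production only
  "issues": [],                        // detected SEO issues with fixes
  "quick_wins": [],                    // quick win opportunities
  "summary": "AI summary and recommendations"
}
```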
2. get_issues
Get SEO issues for entire site with health score. Returns a comprehensive report of all detected problems.
Parameters:
- `domain` (optional): Site domain (uses `SEO_CLIENT_DOMAIN` if not provided)
- `filters` (optional):
  - `severity`: Array of severity levels to include (`['critical', 'warning', 'info']`)
  - `issue_types`: Array of specific issue types
  - `limit`: Max issues to return (1-100, default: 50)
Examples:
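Illustrative prompts:

```
"Show me all SEO issues for mysite.com"
"List only the critical issues"
"What's my site's health score?"
```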
Response includes:
Health score (0-100) and grade (A-F)
Issue summary by severity (critical, warning, info)
Detailed list of issues with fix recommendations
Recommended actions prioritized by impact
Use cases:
Site-wide SEO audits
Finding all problems at once
Tracking improvements over time
Prioritizing fixes by severity
3. get_gsc_insights (NEW)
Get Google Search Console performance data with AI-powered content recommendations. Discover what to write next based on real search data.
Parameters:
- `domain` (optional): Site domain (uses `SEO_CLIENT_DOMAIN` if not provided)
- `period` (optional): Time period for analysis - `7d`, `28d`, or `90d` (default: `28d`)
- `include_recommendations` (optional): Include AI-powered content recommendations (default: `true`)
Examples:
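Illustrative prompts:

```
"Show me GSC insights for the last 28 days"
"What content should I write next?"
"Find queries where I rank on page 2"
```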
What it provides:
1. Performance Summary
Total clicks, impressions, average position, CTR
Compare performance across time periods
2. Top Performing Pages
Top 20 pages by clicks
Each with performance metrics and top queries
See what content resonates with your audience
3. Query Opportunities (4 types automatically detected)
Improve CTR: High impressions (100+) but low CTR (<2%) → Optimize meta tags
Improve Ranking: Position 6-20 → Push to page 1 with content improvements
Keyword Cannibalization: Multiple pages competing for same query → Consolidate content
Keyword Gap: High position (<5) but low volume → Expand content to target related queries
4. AI-Powered Content Recommendations
High-priority topics based on search data
Target queries for each recommendation
Prioritized by potential impact (high/medium/low)
5. Query Clustering
Groups related queries into topic themes
Identifies topic authority opportunities
Suggests comprehensive content pieces
Response includes:
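A sketch of the response shape (all field names here are illustrative; the content mirrors the five areas above):

```jsonc
{
  "summary": {},           // total clicks, impressions, average position, CTR
  "top_pages": [],         // top 20 pages by clicks, each with metrics and top queries
  "opportunities": [],     // improve CTR, improve ranking, cannibalization, keyword gap
  "recommendations": [],   // AI content recommendations, prioritized high/medium/low
  "clusters": []           // related queries grouped into topic themes
}
```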
Use cases:
Content Strategy:
Performance Optimization:
Keyword Research:
Topic Authority:
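Illustrative prompts for each use case:

```
Content strategy:          "What topics should I write about next?"
Performance optimization:  "Which queries have high impressions but low CTR?"
Keyword research:          "Which queries rank in positions 6-20?"
Topic authority:           "Cluster my queries into topic themes"
```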
When to use:
Weekly content planning sessions
Quarterly content strategy reviews
After publishing new content (check performance)
When looking for low-hanging fruit (page 2 rankings)
Before creating new content (avoid cannibalization)
Requirements:
Google Search Console must be connected (connect in Rampify dashboard)
Site must have some search traffic (impressions)
GSC data synced (happens automatically weekly, or trigger manually)
Pro tips:
- Start with the 28-day period for a balanced view (not too recent, not too old)
- Use the 7-day period to track recent changes
- Use the 90-day period for seasonal trends
- Combine with `generate_meta` to optimize high-opportunity pages
- Run after a GSC sync completes for the latest data
4. crawl_site
Trigger a fresh site crawl and analysis. This is an active operation that fetches and analyzes all pages.
Parameters:
- `domain` (optional): Site domain (uses `SEO_CLIENT_DOMAIN` if not provided)
Examples:
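Illustrative prompts:

```
"Crawl mysite.com and refresh the SEO data"
"Trigger a fresh crawl of my site"
```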
What it does:
Discovers all URLs (via sitemap or navigation crawl)
Checks each URL (status, speed, SEO elements)
Detects issues (missing tags, errors, broken links)
Updates database with current state
Automatically clears the cache so the next `get_issues` or `get_page_seo` call shows fresh data
Response includes:
Total URLs found
URLs checked
Issues detected
Crawl duration
Crawl method (sitemap vs navigation)
When to use:
- After deploying code changes
- After fixing SEO issues
- Before running `get_issues` to ensure fresh data
- Weekly/monthly for monitoring
Note: This is the only tool that actively crawls your site. `get_issues` and `get_page_seo` just fetch existing data.
5. generate_schema
Auto-generate structured data (schema.org JSON-LD) for any page. Detects page type and generates appropriate schema with validation.
Parameters:
- `domain` (optional): Site domain (uses `SEO_CLIENT_DOMAIN` if not provided)
- `url_path` (required): Page URL path (e.g., "/blog/post")
- `schema_type` (optional): Specific schema type, or "auto" to detect (default: "auto")
Supported schema types:
- `Article`/`BlogPosting` - Blog posts, articles, news
- `Product` - Product pages, e-commerce
- `Organization` - About pages, company info
- `FAQPage` - FAQ pages with Q&A
- `BreadcrumbList` - Auto-added for navigation
Examples:
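Illustrative prompts:

```
"Generate schema for /blog/my-post"
"Add Article schema to this page"
```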
What it does:
Fetches page HTML (local or production)
Analyzes content (title, description, author, date, images)
Detects page type from URL patterns and content
Generates appropriate JSON-LD schema
Validates schema and warns about placeholders
Returns ready-to-use code snippets
Response includes:
Detected page type
List of recommended schemas
Generated JSON-LD for each schema
Validation results with warnings
Code snippets (Next.js or HTML)
Implementation instructions
Use cases:
Fixing "missing schema" warnings from
get_issuesAdding rich snippets for better search visibility
Enabling Google Discover eligibility (requires Article schema)
Improving CTR with enhanced search results
Example output:
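An illustrative Article JSON-LD snippet of the kind the tool returns (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Your Post Title",
  "description": "One-sentence summary of the post.",
  "datePublished": "2025-01-01",
  "author": { "@type": "Person", "name": "Author Name" },
  "image": "https://yoursite.com/cover.png"
}
```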
Pro tip: After generating schema, test it with Google Rich Results Test
6. generate_meta (Enhanced with Client Profile Context)
Generate optimized meta tags (title, description, Open Graph tags) for a page. Now uses your client profile to generate highly personalized, business-aware meta tags that align with your target audience, brand voice, and competitive positioning.
Parameters:
- `domain` (optional): Site domain (uses `SEO_CLIENT_DOMAIN` if not provided)
- `url_path` (required): Page URL path (e.g., "/blog" or "/blog/post")
- `include_og_tags` (optional): Include Open Graph tags for social sharing (default: `true`)
- `framework` (optional): Framework format for the code snippet - `nextjs`, `html`, `astro`, or `remix` (default: `nextjs`)
NEW: Client Profile Integration
The tool automatically fetches your client profile and uses context like:
Target keywords → Ensures they appear in title/description
Target audience → Adjusts tone and technical depth
Brand voice → Matches your preferred tone (conversational, technical, formal)
Differentiators → Highlights unique selling points for better CTR
Primary CTA → Ends description with appropriate call-to-action
Examples:
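Illustrative prompts:

```
"Generate meta tags for /blog/my-post"
"Fix the short meta description on the homepage"
```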
What it does:
Fetches page HTML (local or production)
Analyzes current meta tags (title, description)
Extracts content structure (headings, topics, word count)
Detects page type (homepage, blog_post, blog_index, product, about)
Identifies key topics from content
Returns analysis for AI to generate optimized meta tags
Provides framework-specific code snippets
Response includes:
Page analysis:
Current title and description
Main heading and all headings
Word count and content preview
Detected page type
Key topics extracted from content
Images for OG tags
Current issues:
Title too short/long
Meta description too short/long
Missing meta tags
AI-generated meta tags:
Optimized title (50-60 characters)
Compelling meta description (150-160 characters)
Open Graph tags (if requested)
Twitter Card tags (if requested)
Ready-to-use code for your framework
Use cases:
Fixing "title too short" or "description too short" warnings
Improving click-through rate (CTR) from search results
Optimizing social media sharing (OG tags)
Aligning meta tags with actual page content
A/B testing different meta descriptions
Real-World Impact: Before vs. After
Without Profile Context (Generic):
With Profile Context (Target audience: developers, Differentiators: "real-time collaboration, 50% faster"):
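An illustrative comparison using that example profile (the exact strings are invented for illustration):

```
Without profile:  "Blog | MyApp"
                  "Read our latest post about collaboration tools."

With profile:     "Real-Time Collaboration for Dev Teams | MyApp"
                  "Ship features 50% faster with real-time collaboration
                   built for developers. Try it free."
```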
Profile Warnings System:
If your profile is incomplete, the response includes helpful warnings; if no profile exists at all, it points you toward setting one up (see "Setting Up Your Profile" below).
Example workflow:
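For instance (an illustrative loop):

```
1. Ask: "Generate meta tags for /blog/my-post"
2. Review the suggested title and description
3. Paste the framework snippet into your code
4. Ask: "Re-check SEO for this page"
```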
Setting Up Your Profile:
To get the most value from `generate_meta`:
Visit `/clients/{your-client-id}/profile` in the dashboard
Fill out key fields:
Target Audience (e.g., "developers and technical founders")
Target Keywords (e.g., "real-time collaboration, dev tools")
Brand Voice (e.g., "technical but approachable")
Your Differentiators (e.g., "50% faster than competitors")
Primary CTA (e.g., "try_free" or "request_demo")
Use the tool - Profile context is automatically applied
See better results - Meta tags now match your business context
SEO Best Practices (Built-in):
Title length: 50-60 characters (includes brand name if space allows)
Description length: 150-160 characters (compelling call-to-action)
Keyword placement: Primary keywords near the start
Uniqueness: Each page gets unique meta tags based on its content
Accuracy: Meta tags reflect actual page content (no clickbait)
Framework-specific output:
Next.js (App Router):
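A sketch of the App Router output, continuing the illustrative values above (the real snippet is generated per page):

```ts
// app/blog/my-post/page.tsx (illustrative)
import type { Metadata } from "next";

export const metadata: Metadata = {
  title: "Real-Time Collaboration for Dev Teams | MyApp",
  description:
    "Ship features 50% faster with real-time collaboration built for developers. Try it free.",
  openGraph: {
    title: "Real-Time Collaboration for Dev Teams | MyApp",
    description:
      "Ship features 50% faster with real-time collaboration built for developers.",
    images: ["/og-image.png"],
  },
};
```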
HTML:
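And the equivalent static HTML (same illustrative values):

```html
<title>Real-Time Collaboration for Dev Teams | MyApp</title>
<meta name="description" content="Ship features 50% faster with real-time collaboration built for developers. Try it free.">
<meta property="og:title" content="Real-Time Collaboration for Dev Teams | MyApp">
<meta property="og:description" content="Ship features 50% faster with real-time collaboration built for developers.">
```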
Pro tips:
Run after fixing content to ensure meta tags match
Test social sharing with Facebook Sharing Debugger
Monitor CTR improvements in Google Search Console
Update meta tags when page content significantly changes
Development
Watch Mode
This will recompile TypeScript on every change.
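Assuming a standard watch script in package.json (check the repo's scripts for the actual name):

```bash
npm run watch
```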
Testing Locally
The MCP server will connect to your local backend at http://localhost:3000.
Debug Logging
Set LOG_LEVEL=debug in your .env file to see detailed logs:
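```bash
# .env
LOG_LEVEL=debug
```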
Architecture
Caching
The MCP server caches responses for 1 hour (configurable via CACHE_TTL) to improve performance.
Cache is cleared automatically when:
Entries expire (TTL reached)
Server restarts
You manually clear (not yet implemented)
Troubleshooting
"No client found for domain"
Solution: Add the site to your dashboard first at http://localhost:3000
"Backend API connection failed"
Checklist:
- Is the backend running? (`npm run dev` in the root directory)
- Is `BACKEND_API_URL` correct in `.env`?
- Check the logs with `LOG_LEVEL=debug`
"MCP server not appearing in Cursor"
Checklist:
- Did you build the server? (`npm run build`)
- Is the path absolute in your Cursor config?
- Restart Cursor after changing the config
- Check the Cursor logs (Help → Toggle Developer Tools → Console)
Empty or missing data
Common causes:
Site not analyzed yet (run analysis in dashboard first)
GSC not connected (connect in dashboard settings)
No URLs in database (trigger site analysis)
"Could not connect to local dev server"
Solution:
- Make sure your dev server is running (`npm run dev`)
- Verify the port (the default is 3000, but yours might be different)
- Use the full domain with port: `localhost:3000` (not just `localhost`)
- Check the dev server logs for CORS or other errors
Example error:
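Something along these lines (the exact wording may differ):

```
Error: Could not connect to local dev server
```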
Local vs Production Confusion
How to tell which source you're using:
Every response includes explicit source and fetched_from fields:
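For example:

```json
{
  "source": "production_database",
  "fetched_from": "https://yoursite.com/blog/post"
}
```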
Pro tip: Set `SEO_CLIENT_DOMAIN` per project to avoid specifying the domain every time:
- For local dev: `SEO_CLIENT_DOMAIN=localhost:3000`
- For production: `SEO_CLIENT_DOMAIN=yoursite.com`
Roadmap
Phase 1: Core Tools (Complete)
- DONE: `get_page_seo` - Get SEO data for a specific page
- DONE: `get_issues` - Get all site issues with health score
- DONE: `get_gsc_insights` - Get GSC performance data with content recommendations (NEW)
- DONE: `generate_meta` - AI-powered title and meta description generation
- DONE: `generate_schema` - Auto-generate structured data (Article, Product, etc.)
- DONE: `crawl_site` - Trigger a fresh site crawl
Phase 2: Workflow & Optimization Tools (Planned)
- PLANNED: `suggest_internal_links` - Internal linking recommendations
- PLANNED: `check_before_deploy` - Pre-deployment SEO validation
- PLANNED: `optimize_blog_post` - Deep optimization for blog content
- PLANNED: `optimize_landing_page` - Conversion-focused SEO
Future (Phase 4+)
Bulk operations across multiple pages
Historical trend analysis
Competitive monitoring
Advanced AI insights and recommendations
Support
Need help?
Documentation - Complete guides and tutorials
GitHub Issues - Report bugs or request features
Rampify Settings - Manage your sites and API keys
Learn More
What is Rampify? - Product overview
MCP Server Guide - Detailed documentation
Blog - SEO tips and product updates
License
MIT