SEO Audit MCP Server

by RichardDillman

Server Configuration

Describes the environment variables required to run the server.

No environment variables are required.

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM so it can take actions

analyze_page

Analyze a single web page for SEO factors including:

  • Meta tags (title, description, canonical, robots)

  • Heading structure (H1-H6)

  • Structured data (JSON-LD, with special focus on JobPosting schema)

  • JavaScript rendering analysis (CSR vs SSR detection)

  • Link analysis (internal, external, nofollow)

  • Image analysis (alt tags, lazy loading)

  • Mixed content detection

  • Basic load time measurement

Use this for detailed analysis of specific pages such as job detail pages, landing pages, or the homepage.
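
Below is a minimal sketch of calling analyze_page from a TypeScript client built on the official @modelcontextprotocol/sdk. The server entry point, the "url" argument name, and the example URL are all assumptions; check the server's tool schema (via tools/list) for the real parameter names.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio; the entry point path is an assumption.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client({ name: "seo-audit-demo", version: "1.0.0" });
await client.connect(transport);

// Analyze one page; "url" is an assumed argument name.
const result = await client.callTool({
  name: "analyze_page",
  arguments: { url: "https://example.com/jobs/12345" },
});

console.log(result.content); // meta tags, headings, structured data, links, images
await client.close();

The sketches for the remaining tools assume a client connected this way.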

crawl_site

Crawl multiple pages of a website starting from a URL. Discovers internal links and analyzes each page.

Returns:

  • Aggregated statistics (pages with titles, meta descriptions, schema, etc.)

  • Page type classification (job detail, category landing, location pages, etc.)

  • Duplicate detection (titles, descriptions)

  • Critical issues and warnings

  • All individual page analyses

Use this for comprehensive site audits. Respects crawl limits and delays.
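
A hedged sketch reusing the connected client from the analyze_page example; maxPages and delayMs are guesses at how the crawl limits and delays might be exposed, not confirmed parameter names.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Crawl politely from a seed URL; all argument names are hypothetical.
async function crawlSample(client: Client) {
  const result = await client.callTool({
    name: "crawl_site",
    arguments: { url: "https://example.com", maxPages: 50, delayMs: 500 },
  });
  console.log(result.content); // aggregated stats, duplicates, critical issues
}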

run_lighthouse

Run a Lighthouse performance audit on a URL.

Returns:

  • Performance, Accessibility, Best Practices, and SEO scores

  • Core Web Vitals (LCP, CLS, TBT/INP proxy, FCP, TTFB)

  • Optimization opportunities with estimated savings

  • Diagnostics (long tasks, layout shifts, etc.)

  • SEO audit results (crawlability, meta tags, etc.)

Use this for performance analysis. Run separately for mobile and desktop if both matter.

Note: Requires Lighthouse CLI to be installed (npm install -g lighthouse).
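
Since mobile and desktop should be audited separately, a small loop is a natural pattern; "formFactor" is an assumed parameter name, and the tool shells out to the Lighthouse CLI, which must already be installed.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Audit both form factors in sequence; "formFactor" is an assumption.
async function lighthouseBoth(client: Client, url: string) {
  for (const formFactor of ["mobile", "desktop"]) {
    const result = await client.callTool({
      name: "run_lighthouse",
      arguments: { url, formFactor },
    });
    console.log(formFactor, result.content); // scores, Core Web Vitals, opportunities
  }
}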

analyze_sitemap

Analyze a site's robots.txt and XML sitemaps.

Returns:

  • robots.txt rules and any blocking issues

  • All discovered sitemaps (from robots.txt and common locations)

  • URL counts and job-specific URL detection

  • Sitemap freshness analysis

  • Recommendations for job boards (Indexing API, job sitemaps)

Use this as a first step to understand site structure before crawling.
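
As a first step, one might point the tool at the site root and let it discover robots.txt and sitemaps; whether it accepts the root URL or a specific sitemap URL is an assumption here.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Inspect robots.txt and sitemaps before any crawling.
async function sitemapFirst(client: Client) {
  const result = await client.callTool({
    name: "analyze_sitemap",
    arguments: { url: "https://example.com" }, // assumed: site root URL
  });
  console.log(result.content); // robots rules, sitemaps, URL counts, freshness
}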

check_urls

Check HTTP status codes for a list of URLs.

Returns status code, redirect destination (if redirected), and response time for each URL.

Use this to:

  • Verify expired job pages are handled correctly

  • Check for broken links

  • Analyze redirect chains
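
For example, to verify that expired job pages return the right status codes, one could pass a batch of known-expired URLs; the "urls" array argument is an assumption about the tool's input shape.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Batch-check status codes for a list of URLs, e.g. expired job pages.
async function checkExpiredJobs(client: Client, urls: string[]) {
  const result = await client.callTool({
    name: "check_urls",
    arguments: { urls }, // assumed shape: a plain array of URL strings
  });
  console.log(result.content); // status, redirect target, response time per URL
}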

plan_audit

RECOMMENDED FIRST STEP: Analyze sitemaps and create an intelligent sampling strategy for large sites.

This tool is essential for job boards and large sites with 100k+ pages. Instead of crawling everything, it:

  1. Discovers and validates all sitemaps (robots.txt + common locations)

  2. Identifies distinct route patterns (job pages, category pages, location pages, etc.)

  3. Estimates total pages per route type

  4. Generates a smart sampling strategy

  5. Recommends which pages to analyze with Lighthouse

Returns:

  • Sitemap validation (URL limits, lastmod coverage, compression)

  • Route pattern classification with estimated counts

  • Sampling strategy (how many pages to sample per type)

  • Issues, warnings, and recommendations

Use this BEFORE crawl_site or sample_pages to understand site structure.
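
A sketch of the recommended first call; as elsewhere, the single "url" argument is an assumption, and the returned plan is treated as opaque content to feed into later steps.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Build a sampling plan before touching individual pages.
async function planFirst(client: Client) {
  const plan = await client.callTool({
    name: "plan_audit",
    arguments: { url: "https://example.com" },
  });
  console.log(plan.content); // route patterns, estimated counts, sampling strategy
  return plan;
}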

sample_pages

Intelligently sample and analyze pages based on an audit plan.

Use this AFTER plan_audit to analyze representative pages from each route type.

For a site with 500k job pages, instead of crawling all of them, this will:

  • Sample 30-50 job detail pages (random + oldest + newest)

  • Sample 10-20 category landing pages

  • Sample 10-20 location pages

  • Sample company pages, static pages, etc.

Returns:

  • Detailed analysis of each sampled page

  • Aggregated issues per route type

  • Cross-cutting findings (% missing titles, schema errors, etc.)

  • Common issues ranked by frequency

This approach finds template-level issues that affect all pages of that type.
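
How the plan is handed to sample_pages is not documented in this listing; the sketch below assumes the plan_audit output (or the same seed URL) is passed back in, which should be verified against the tool's input schema.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Analyze representative pages from each route type in the plan.
async function sampleFromPlan(client: Client, plan: { content: unknown }) {
  const result = await client.callTool({
    name: "sample_pages",
    arguments: { url: "https://example.com", plan: plan.content }, // both assumed
  });
  console.log(result.content); // per-route issues, cross-cutting findings
}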

run_audit

FULL AUDIT: Run a complete SEO audit with automatic sampling, caching, and report generation.

This is the main audit tool that orchestrates the entire workflow:

  1. Discovers and analyzes sitemaps

  2. Identifies route patterns and creates sampling strategy

  3. Captures sample pages (cached; fetched only once)

  4. Analyzes SEO, structured data, technical issues, social graph

  5. Generates prioritized recommendations

  6. Saves everything to the reports/[sitename]/ folder

The audit captures pages ONCE and stores:

  • HTML snapshots for inspection

  • Full analysis data as JSON

  • Final report as JSON + Markdown

Returns comprehensive findings and prioritized fix recommendations.
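
When the full pipeline is all you need, a single call may suffice; the lone "url" argument and the assumption that the report location comes back in the result are both unverified here.

import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// One-shot audit: sitemap discovery, sampling, analysis, report generation.
async function fullAudit(client: Client) {
  const result = await client.callTool({
    name: "run_audit",
    arguments: { url: "https://example.com" },
  });
  console.log(result.content); // findings plus pointers to reports/[sitename]/
}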

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/RichardDillman/seo-audit-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.