
SEO Crawler MCP

by houtini-ai

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| DEBUG | No | Set to "true" to enable verbose debug logging | false |
| OUTPUT_DIR | Yes | Directory where crawl results are saved | (none) |
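As a sketch of how these variables might be supplied in an MCP client configuration — the `command`, `args`, and server key below are assumptions for illustration, not taken from this page:

```json
{
  "mcpServers": {
    "seo-crawler": {
      "command": "npx",
      "args": ["-y", "seo-crawler-mcp"],
      "env": {
        "OUTPUT_DIR": "/path/to/crawl-output",
        "DEBUG": "false"
      }
    }
  }
}
```

`OUTPUT_DIR` is the only required variable; `DEBUG` may be omitted and defaults to `false`.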

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| ---------- | ------- |
| tools | {} |

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| ---- | ----------- |
| run_seo_audit | Crawl a website and extract comprehensive SEO data using Crawlee HttpCrawler. Returns crawl ID and output path. |
| analyze_seo | Analyze SEO data from a completed crawl. Runs 25+ SQL queries to detect critical issues, content problems, technical SEO issues, security vulnerabilities, and optimization opportunities. Returns a structured report with affected URLs and fix recommendations. |
| query_seo_data | Execute a specific SEO analysis query by name. Use list_seo_queries to see available queries. Returns detailed results with affected URLs and context. |
| list_seo_queries | List all available SEO analysis queries with descriptions, priorities, and fix recommendations. Optionally filter by category or priority level. |
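A typical session chains these tools in sequence. The argument names below (`url`, `crawl_id`, `category`, `query_name`) are illustrative assumptions — the actual input schemas are not shown on this page:

```json
[
  { "tool": "run_seo_audit",   "arguments": { "url": "https://example.com" } },
  { "tool": "analyze_seo",     "arguments": { "crawl_id": "<id returned by run_seo_audit>" } },
  { "tool": "list_seo_queries", "arguments": { "category": "technical" } },
  { "tool": "query_seo_data",  "arguments": { "query_name": "<name from list_seo_queries>" } }
]
```

The crawl ID returned by run_seo_audit links the later analysis calls back to the saved crawl data in OUTPUT_DIR.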

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/houtini-ai/seo-crawler-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.