agentaeo-mcp-server
The @agentaeo/mcp-server runs Answer Engine Optimization (AEO) audits and generates AI-optimized content suites for websites, measuring and improving visibility across AI-powered search engines such as ChatGPT, Perplexity, Claude, and Google AI.
Core capabilities:
Run AEO Audits (run_aeo_audit): Start an async AEO audit for any URL, with a free tier (8 queries) or paid tier (40 queries) and an optional primary keyword.
Check Audit Status (check_aeo_audit_status): Poll a running audit by auditId. Free tier audits stop at a preview; paid audits deliver a full report.
Generate AEO Content Suite (generate_aeo_content_suite): Kick off async generation of a full content bundle (HTML, JSON-LD structured data, and llms.txt) from a completed audit. Supports admin bypass mode for internal QA.
Check Content Suite Status (check_aeo_content_suite_status): Poll content suite generation progress by orderId until completed or failed (typically 5–25+ minutes).
Download Content Suite ZIP (download_aeo_content_suite_zip): Download the completed content suite as a ZIP file to a specified directory or the current working directory.
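The five tools above form a simple async pipeline: start a job, poll its status by ID, then fetch the result. A minimal sketch of that polling pattern in Python (the check_status stub and its return values are hypothetical stand-ins, not the server's actual API; a real client would invoke check_aeo_audit_status over the MCP protocol):

```python
import time

# Hypothetical stand-in for an MCP tool call. It pretends the audit
# finishes on the third poll; a real client would call the server.
def check_status(audit_id, _state={"calls": 0}):
    _state["calls"] += 1
    return "completed" if _state["calls"] >= 3 else "running"

def wait_for_audit(audit_id, poll_seconds=0, timeout_polls=100):
    """Poll until the audit reaches a terminal state."""
    for _ in range(timeout_polls):
        status = check_status(audit_id)
        if status in ("completed", "failed"):
            return status
        time.sleep(poll_seconds)  # real audits take minutes, not 0s
    raise TimeoutError(f"audit {audit_id} did not finish")

print(wait_for_audit("audit-123"))  # → completed
```

The same loop applies to check_aeo_content_suite_status, just keyed by orderId and with a longer polling interval.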
AgentAEO MCP Server
Ask Claude: "Why is [competitor] being cited instead of us for [category] queries?" Get the answer. Get the fix.
AgentAEO is a retrieval intelligence layer for AI agents. It tells Claude, Cursor, and any MCP-compatible agent which brands are winning AI citations and exactly why.
Install in 2 minutes
npx @agentaeo/mcp-server@latest

Then in Claude Desktop, ask:
"Run an AI citation audit on [domain.com] for [category] queries. Tell me which competitors are being cited instead and why."
That is it. You now have live selection intelligence inside your AI workflow.
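The npx command above runs the server directly. Claude Desktop typically registers MCP servers through its claude_desktop_config.json; assuming the standard npx wiring (the "agentaeo" key is just a label of your choosing, not a required name), an entry would look like:

```json
{
  "mcpServers": {
    "agentaeo": {
      "command": "npx",
      "args": ["-y", "@agentaeo/mcp-server@latest"]
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server is picked up.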
What this measures
AgentAEO runs real buyer queries across ChatGPT, Perplexity, Claude, and Google AI and returns:
Citation rates per engine per query
Which competitor wins each query
Revenue leakage estimate ($/month)
Exact structural reasons AI skips you
Copy-paste schema fixes
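To make the first metric concrete: a citation rate is simply the share of audited queries in which an engine cites your domain. A toy computation (the result structure here is invented for illustration and is not the tool's actual output format):

```python
# Toy audit results: for each engine, whether each query cited the domain.
# This data shape is invented for illustration only.
results = {
    "chatgpt":    {"best payment api": True,  "stripe alternatives": False},
    "perplexity": {"best payment api": True,  "stripe alternatives": True},
}

def citation_rate(engine_results):
    """Fraction of queries where the domain was cited."""
    return sum(engine_results.values()) / len(engine_results)

rates = {engine: citation_rate(q) for engine, q in results.items()}
print(rates)  # {'chatgpt': 0.5, 'perplexity': 1.0}
```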
Why this exists
60% of brands investing in SEO are invisible in AI-generated answers despite strong Google rankings.
This is the Selection Gap: AI finds your brand but does not consistently choose it. Competitors enter the consideration set before you do.
AgentAEO measures this gap. Quantifies it in dollars. Generates the fix.
Also known as: Generative Engine Optimization (GEO), Answer Engine Optimization (AEO), AI SEO — AgentAEO measures the citation layer that determines whether AI recommends you or your competitor.
Three workflows to try now
1. Competitive intelligence
"Audit stripe.com for payment processing queries. Which fintech brands are being recommended instead of Stripe by ChatGPT and Perplexity?"
2. Own brand monitoring
"Run a citation health check on [your-domain.com] and give me the monthly revenue leakage estimate plus the top 3 fixes."
3. Agency research
"Audit these 5 domains: [list]. Rank them by Retrieval Marketing Score. Which has the biggest citation gap versus its competitors?"
Built on the Retrieval Marketing Framework™
The only AEO platform with a working MCP server on the official Anthropic registry. AAA rated on Glama.
→ agentaeo.com
→ retrieval.marketing
→ @agentaeo/mcp-server on npm