| list_properties | List the user's Search Console properties. |
| add_site | Add a site to your Search Console properties.
Args:
site_url: The URL of the site to add (e.g. https://example.com or sc-domain:example.com)
|
| delete_site | Remove a site from your Search Console properties.
Args:
site_url: The URL of the site to remove
|
| get_search_analytics | Get search analytics data for a specific property.
Args:
site_url: Exact GSC property URL (e.g. "sc-domain:example.com")
days: Number of days to look back (default: 28)
dimensions: Dimensions to group by, comma-separated (query, page, device, country, date, searchAppearance)
row_limit: Number of rows to return (default: 20, max: 500)
search_type: Type of search results (WEB, IMAGE, VIDEO, NEWS, DISCOVER)
|
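The parameters above map onto the Search Analytics query body fairly directly. A minimal sketch of that translation, assuming the standard `searchanalytics.query` request shape (the helper name `build_query_body` is illustrative, not part of the tool):

```python
from datetime import date, timedelta

def build_query_body(days=28, dimensions="query", row_limit=20, search_type="WEB"):
    """Build a Search Analytics request body from the tool's parameters.

    Field names (startDate, endDate, dimensions, rowLimit, type) follow the
    Search Console API's searchanalytics.query method.
    """
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": [d.strip() for d in dimensions.split(",") if d.strip()],
        "rowLimit": min(row_limit, 500),  # the tool caps rows at 500
        "type": search_type.lower(),      # the API uses lowercase values, e.g. "web"
    }
```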
| get_advanced_search_analytics | Get advanced search analytics with sorting, filtering (including regex), and pagination.
Args:
site_url: Exact GSC property URL (e.g. "sc-domain:example.com")
start_date: Start date YYYY-MM-DD (defaults to 28 days ago)
end_date: End date YYYY-MM-DD (defaults to today)
dimensions: Dimensions comma-separated (query,page,device,country,date,searchAppearance)
search_type: WEB, IMAGE, VIDEO, NEWS, DISCOVER
row_limit: Max rows (up to 25000)
start_row: Starting row for pagination
sort_by: Metric to sort by (clicks, impressions, ctr, position)
sort_direction: ascending or descending
filter_dimension: Single filter dimension (query, page, country, device)
filter_operator: contains, equals, notContains, notEquals, includingRegex, excludingRegex
filter_expression: Filter value
filters: JSON array of filter objects for AND logic. Each needs dimension, operator, expression.
data_state: "all" (default) or "final" (confirmed only, 2-3 day lag)
|
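Under AND logic, the `filters` array becomes a single filter group in the API's `dimensionFilterGroups` structure. A hedged sketch of that conversion (`build_filter_groups` is a hypothetical helper, not the tool's internals):

```python
import json

def build_filter_groups(filters_json):
    """Translate the tool's `filters` JSON array into a single
    dimensionFilterGroups entry; one group means the filters are ANDed."""
    filters = json.loads(filters_json)
    return [{
        "groupType": "and",
        "filters": [
            {
                "dimension": f["dimension"],
                "operator": f["operator"],
                "expression": f["expression"],
            }
            for f in filters
        ],
    }]
```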
| get_performance_overview | Get a performance overview with totals and daily trend.
Args:
site_url: Exact GSC property URL
days: Number of days to look back (default: 28)
|
| compare_search_periods | Compare search analytics between two time periods.
Args:
site_url: Exact GSC property URL
period1_start: Start date for period 1 (YYYY-MM-DD)
period1_end: End date for period 1
period2_start: Start date for period 2
period2_end: End date for period 2
dimensions: Dimensions to group by (default: query)
limit: Top N results to compare (default: 20)
|
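A period comparison like this reduces to merging two result sets on their dimension keys and ranking by change. A simplified sketch over click counts only (the helper name and row shape are illustrative; rows mimic the API's `{"keys": [...], "clicks": n}` format):

```python
def compare_periods(rows1, rows2, limit=20):
    """Merge two Search Analytics result sets keyed by dimension value
    and compute per-key click deltas, largest absolute change first."""
    p1 = {tuple(r["keys"]): r["clicks"] for r in rows1}
    p2 = {tuple(r["keys"]): r["clicks"] for r in rows2}
    merged = []
    for key in set(p1) | set(p2):
        c1, c2 = p1.get(key, 0), p2.get(key, 0)
        merged.append({"keys": list(key), "period1": c1, "period2": c2, "delta": c2 - c1})
    merged.sort(key=lambda r: abs(r["delta"]), reverse=True)
    return merged[:limit]
```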
| get_search_by_page_query | Get search queries driving traffic to a specific page.
Args:
site_url: Exact GSC property URL
page_url: The specific page URL to analyze
days: Days to look back (default: 28)
row_limit: Rows to return (default: 20, max: 500)
|
| inspect_url | Inspect a URL for indexing status, rich results, and mobile usability.
Args:
site_url: Exact GSC property URL (e.g. "sc-domain:example.com")
page_url: The specific URL to inspect
|
| batch_inspect_urls | Inspect multiple URLs for indexing status. Handles rate limiting automatically.
API limit: 2000/day, 600/minute. This tool handles up to 50 URLs per call.
Args:
site_url: Exact GSC property URL (e.g. "sc-domain:example.com")
urls: List of URLs to inspect, one per line
|
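Staying under the per-minute quota amounts to chunking the URL list and pacing sequential calls. A minimal sketch under those assumptions (helper names are illustrative, and `inspect_one` stands in for the real URL Inspection API call):

```python
import time

def chunked(urls, size=50):
    """Split a URL list into batches of at most `size`
    (50 is the tool's documented per-call cap)."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def inspect_all(urls, inspect_one, per_minute=600):
    """Call `inspect_one(url)` for every URL, sleeping between calls to
    stay under the documented 600 requests/minute limit."""
    delay = 60.0 / per_minute
    results = []
    for url in urls:
        results.append(inspect_one(url))
        time.sleep(delay)
    return results
```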
| get_sitemaps | List all sitemaps for a property with detailed info.
Args:
site_url: Exact GSC property URL
|
| submit_sitemap | Submit or resubmit a sitemap to Google.
Args:
site_url: Exact GSC property URL
sitemap_url: Full URL of the sitemap to submit
|
| delete_sitemap | Delete (unsubmit) a sitemap from Google Search Console.
Args:
site_url: Exact GSC property URL
sitemap_url: Full URL of the sitemap to delete
|
| request_indexing | Request Google to crawl and index a URL via the Indexing API.
IMPORTANT: Only works for pages with JobPosting or BroadcastEvent structured data.
Default quota: 200 requests/day.
Args:
url: The full URL to request indexing for
|
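The Indexing API's `urlNotifications.publish` method takes a two-field body, and both this tool and `request_removal` below would build the same shape with a different type value. A sketch (the helper name is illustrative):

```python
def build_indexing_request(url, action="URL_UPDATED"):
    """Body for the Indexing API's urlNotifications.publish method.
    URL_UPDATED requests (re)indexing; URL_DELETED requests removal."""
    if action not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError(f"unknown notification type: {action}")
    return {"url": url, "type": action}
```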
| request_removal | Request Google to remove a URL from the index via the Indexing API.
IMPORTANT: Only works for pages with JobPosting or BroadcastEvent structured data.
Args:
url: The full URL to request removal for
|
| batch_request_indexing | Request indexing for multiple URLs. Processes sequentially with rate limiting.
Default quota: 200/day. Only for pages with JobPosting or BroadcastEvent structured data.
Args:
urls: List of URLs to index, one per line (max 100 per batch)
|
| check_indexing_notification | Check the latest indexing notification status for a URL.
Args:
url: The URL to check
|
| get_core_web_vitals | Get Core Web Vitals (LCP, INP, CLS) from the Chrome UX Report (CrUX) API.
Free API, no OAuth needed — just a CRUX_API_KEY env variable.
Args:
url_or_origin: Full URL or origin (e.g. "https://example.com" for origin-level)
form_factor: PHONE, DESKTOP, or TABLET (default: PHONE)
|
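A CrUX `records:queryRecord` request keys on either `origin` (site-level) or `url` (page-level). A sketch of building the body; the path-based origin/URL heuristic here is an assumption of this example, not documented tool behavior:

```python
from urllib.parse import urlparse

def build_crux_request(url_or_origin, form_factor="PHONE"):
    """Request body for the CrUX API's records:queryRecord endpoint.
    Bare origins (no path) query origin-level data via the `origin`
    field; anything with a path uses `url` for page-level data."""
    parsed = urlparse(url_or_origin)
    key = "origin" if parsed.path in ("", "/") else "url"
    return {key: url_or_origin, "formFactor": form_factor}
```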
| get_pagespeed_insights | Run Google's PageSpeed Insights API for a URL.
Returns Lighthouse lab data plus available Chrome UX Report field data.
Args:
url: Full page URL
strategy: mobile or desktop
categories: Comma-separated Lighthouse categories
|
| run_lighthouse_audit | Run a local Lighthouse CLI audit via npx.
Requires Node.js plus a locally available Chrome/Chromium browser.
Args:
url: Full page URL
form_factor: mobile or desktop
categories: Comma-separated Lighthouse categories
|
| inspect_robots_txt | Fetch and summarize the site's robots.txt file.
Args:
url_or_origin: Full URL, origin, or sc-domain property
|
| analyze_sitemap | Fetch and analyze an XML sitemap or sitemap index.
Args:
sitemap_url: Full sitemap URL
sample_urls: Number of sitemap URLs to validate with GET requests
|
| analyze_page_seo | Fetch a page and analyze on-page SEO signals, structured data, and indexability hints.
Args:
url: Full page URL
|
| crawl_site_seo | Crawl a site from a start URL and aggregate common technical/on-page SEO issues.
Args:
start_url: First URL to crawl
max_pages: Maximum number of same-origin HTML pages to crawl
|
| audit_live_site | Run a live SEO audit without requiring Search Console access.
Combines page analysis, robots.txt inspection, sitemap discovery, PSI data, and a small same-origin crawl.
Args:
url: Full site/page URL
crawl_pages: Number of pages to crawl for duplicate/missing-tag issues
include_lighthouse: Whether to also run a local Lighthouse CLI audit
|
| find_striking_distance_keywords | Find "striking distance" keywords — queries ranking at positions 5-20 with decent impressions.
These are quick-win optimization targets that could reach page 1 with small improvements.
Args:
site_url: Exact GSC property URL
days: Days to look back (default: 28)
min_impressions: Minimum impressions to include (default: 10)
row_limit: Max results (default: 50)
|
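Given raw query rows, the striking-distance filter is a straightforward cut on position and impressions. An illustrative sketch (row shape mimics Search Analytics output; the function name is not part of the tool):

```python
def striking_distance(rows, min_impressions=10, lo=5.0, hi=20.0):
    """Keep Search Analytics query rows ranking at positions 5-20 with at
    least `min_impressions` impressions, highest-volume first."""
    hits = [r for r in rows
            if lo <= r["position"] <= hi and r["impressions"] >= min_impressions]
    hits.sort(key=lambda r: r["impressions"], reverse=True)
    return hits
```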
| detect_cannibalization | Detect keyword cannibalization — queries where multiple pages compete for the same keyword.
This dilutes ranking power and confuses Google about which page to show.
Args:
site_url: Exact GSC property URL
days: Days to look back (default: 28)
min_impressions: Minimum impressions per query-page pair (default: 5)
|
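Detection reduces to grouping query+page rows by query and flagging queries served by more than one page. A simplified sketch, assuming rows grouped by the `query,page` dimensions (the helper name is illustrative):

```python
from collections import defaultdict

def detect_cannibalization(rows, min_impressions=5):
    """Group query+page rows by query; a query with two or more competing
    pages is a cannibalization candidate. Expects keys == [query, page]."""
    by_query = defaultdict(list)
    for r in rows:
        if r["impressions"] >= min_impressions:
            query, page = r["keys"]
            by_query[query].append(page)
    return {q: pages for q, pages in by_query.items() if len(pages) > 1}
```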
| split_branded_queries | Split search performance into branded vs non-branded queries.
Shows true organic SEO growth by separating brand searches.
Args:
site_url: Exact GSC property URL
brand_name: Your brand name to filter (e.g. "cdljobscenter")
days: Days to look back (default: 28)
|
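The branded split is essentially a case-insensitive substring partition on the query dimension. A minimal sketch under that assumption (the real tool's brand matching may be more sophisticated):

```python
def split_branded(rows, brand_name):
    """Partition query rows into branded vs non-branded buckets by a
    case-insensitive substring match on the brand name."""
    brand = brand_name.lower()
    branded = [r for r in rows if brand in r["keys"][0].lower()]
    non_branded = [r for r in rows if brand not in r["keys"][0].lower()]
    return branded, non_branded
```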
| site_audit | Run a comprehensive site audit: check sitemap health, inspect URLs for indexing
issues, identify coverage problems, and report findings.
Args:
site_url: Exact GSC property URL (e.g. "sc-domain:example.com")
sitemap_url: Optional sitemap URL. If not provided, auto-detects from GSC.
max_inspect: Max URLs to inspect (default: 30, costs 1 API call each)
|
| reauthenticate | Perform a logout and new login sequence.
Deletes the current OAuth token and triggers a new browser auth flow. |