seo-mcp by patchwindow

gsc_traffic_drop

Compare two date periods to detect pages or queries with significant click drops. Identify traffic losses from algorithm updates, technical issues, or content decay.

Instructions

Compare two date periods and identify pages or queries with significant traffic drops. Useful for diagnosing algorithm updates, technical issues, or content decay.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| site_url | No | Site URL in GSC format, e.g. 'sc-domain:example.com'. Uses config default if omitted. | |
| current_start | Yes | Current period start date (YYYY-MM-DD). | |
| current_end | Yes | Current period end date (YYYY-MM-DD). | |
| previous_start | Yes | Previous period start date (YYYY-MM-DD). | |
| previous_end | Yes | Previous period end date (YYYY-MM-DD). | |
| dimension | No | Group by page or query. | 'page' |
| min_drop_percent | No | Minimum click drop percentage to flag (e.g. 20 = 20% drop). | 20 |
| min_clicks_previous | No | Minimum clicks in previous period to include (filters out low-traffic noise). | 5 |
| row_limit | No | Max dropped items to show. | 25 |
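A call to this tool might look like the following. All values are illustrative: the site URL and dates are made up, and the optional fields show non-default values to make the overrides visible.

```typescript
// Hypothetical arguments for gsc_traffic_drop: compare the most recent
// 28-day window against the preceding 28 days, grouped by query.
const exampleArgs = {
  site_url: "sc-domain:example.com", // optional; falls back to the config default
  current_start: "2024-05-01",
  current_end: "2024-05-28",
  previous_start: "2024-04-03",
  previous_end: "2024-04-30",
  dimension: "query" as const, // default is "page"
  min_drop_percent: 30,        // flag only drops of 30% or more
  min_clicks_previous: 10,     // ignore low-traffic noise
  row_limit: 10,
};

console.log(exampleArgs.dimension);
```

Note that the two windows should usually be the same length, since clicks are compared as raw totals rather than daily averages.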

Implementation Reference

  • The main tool definition including name 'gsc_traffic_drop', schema, and the handler function that queries Google Search Console for two date periods, compares clicks, and returns items with significant traffic drops.
    export const gscTrafficDrop: ToolDefinition<typeof schema> = {
      name: "gsc_traffic_drop",
      description:
        "Compare two date periods and identify pages or queries with significant traffic drops. Useful for diagnosing algorithm updates, technical issues, or content decay.",
      schema,
      handler: async (args, config) => {
        const auth = getOAuth2Client();
        const sc = google.searchconsole({ version: "v1", auth });
    
        const siteUrl = args.site_url ?? config.gsc?.default_site;
        if (!siteUrl) {
          throw new Error(
            "site_url is required. Pass it as an argument or set gsc.default_site in ~/.seo-mcp/config.json"
          );
        }
    
        const dimension = args.dimension ?? "page";
        const minDropPct = args.min_drop_percent ?? 20;
        const minClicksPrev = args.min_clicks_previous ?? 5;
        const rowLimit = args.row_limit ?? 25;
    
        const [currentRes, previousRes] = await Promise.all([
          sc.searchanalytics.query({
            siteUrl,
            requestBody: {
              startDate: args.current_start,
              endDate: args.current_end,
              dimensions: [dimension],
              rowLimit: 5000,
            },
          }),
          sc.searchanalytics.query({
            siteUrl,
            requestBody: {
              startDate: args.previous_start,
              endDate: args.previous_end,
              dimensions: [dimension],
              rowLimit: 5000,
            },
          }),
        ]);
    
        const currentMap = new Map<string, number>();
        for (const row of currentRes.data.rows ?? []) {
          currentMap.set(row.keys?.[0] ?? "", row.clicks ?? 0);
        }
    
        interface DropItem {
          key: string;
          previous: number;
          current: number;
          drop: number;
          dropPct: number;
        }
    
        const drops: DropItem[] = [];
        for (const row of previousRes.data.rows ?? []) {
          const key = row.keys?.[0] ?? "";
          const prev = row.clicks ?? 0;
          if (prev < minClicksPrev) continue;
          const curr = currentMap.get(key) ?? 0;
          const dropPct = prev > 0 ? ((prev - curr) / prev) * 100 : 0;
          if (dropPct >= minDropPct) {
            drops.push({ key, previous: prev, current: curr, drop: prev - curr, dropPct });
          }
        }
    
        drops.sort((a, b) => b.drop - a.drop);
        const topDrops = drops.slice(0, rowLimit);
    
        if (topDrops.length === 0) {
          return {
            content: [
              {
                type: "text",
                text: `No ${dimension}s with ≥${minDropPct}% click drops found between the two periods.`,
              },
            ],
          };
        }
    
        const summary =
          `Traffic drop analysis: ${dimension}s\n` +
          `Current: ${args.current_start} → ${args.current_end}\n` +
          `Previous: ${args.previous_start} → ${args.previous_end}\n` +
          `Flagged ${topDrops.length} ${dimension}s with ≥${minDropPct}% drop.\n\n`;
    
        const header = `${dimension}\tprev_clicks\tcurr_clicks\tdrop\tdrop_pct`;
        const lines = topDrops.map((d) => {
          return `${d.key}\t${d.previous}\t${d.current}\t-${d.drop}\t-${d.dropPct.toFixed(1)}%`;
        });
    
        return { content: [{ type: "text", text: summary + [header, ...lines].join("\n") }] };
      },
    };
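The handler's comparison step is easy to verify in isolation. This sketch reproduces the same filter-and-sort logic against two fabricated result sets; the `keys`/`clicks` row shape mirrors the Search Console response, but the data itself is made up.

```typescript
// Minimal reproduction of the handler's drop detection, on fabricated rows.
interface Row { keys: string[]; clicks: number }

const previousRows: Row[] = [
  { keys: ["/pricing"], clicks: 100 },
  { keys: ["/blog/post"], clicks: 40 },
  { keys: ["/tiny-page"], clicks: 2 }, // below min_clicks_previous, skipped
];
const currentRows: Row[] = [
  { keys: ["/pricing"], clicks: 55 },
  { keys: ["/blog/post"], clicks: 38 }, // only a 5% drop, not flagged
];

const minDropPct = 20;
const minClicksPrev = 5;

// Index current-period clicks by key, as the handler does with currentMap.
const currentMap = new Map(currentRows.map((r) => [r.keys[0], r.clicks]));

const drops = previousRows
  .filter((r) => r.clicks >= minClicksPrev)
  .map((r) => {
    const curr = currentMap.get(r.keys[0]) ?? 0;
    return {
      key: r.keys[0],
      previous: r.clicks,
      current: curr,
      drop: r.clicks - curr,
      dropPct: ((r.clicks - curr) / r.clicks) * 100,
    };
  })
  .filter((d) => d.dropPct >= minDropPct)
  .sort((a, b) => b.drop - a.drop);

console.log(drops); // only /pricing is flagged: 100 → 55, a 45% drop
```

A page that vanished entirely from the current period simply resolves to 0 clicks via the `?? 0` fallback, which yields a 100% drop rather than being missed.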
  • Zod schema defining input parameters: site_url, current_start/end, previous_start/end, dimension (page/query), min_drop_percent, min_clicks_previous, and row_limit.
    const schema = z.object({
      site_url: z.string().optional().describe(
        "Site URL in GSC format, e.g. 'sc-domain:example.com'. Uses config default if omitted."
      ),
      current_start: z.string().describe("Current period start date (YYYY-MM-DD)."),
      current_end: z.string().describe("Current period end date (YYYY-MM-DD)."),
      previous_start: z.string().describe("Previous period start date (YYYY-MM-DD)."),
      previous_end: z.string().describe("Previous period end date (YYYY-MM-DD)."),
      dimension: z
        .enum(["page", "query"])
        .optional()
        .describe("Group by page or query. Default: 'page'."),
      min_drop_percent: z
        .number()
        .optional()
        .describe("Minimum click drop percentage to flag (e.g. 20 = 20% drop). Default: 20."),
      min_clicks_previous: z
        .number()
        .optional()
        .describe("Minimum clicks in previous period to include (filters out low-traffic noise). Default: 5."),
      row_limit: z.number().optional().describe("Max dropped items to show. Default: 25."),
    });
  • Imports gscTrafficDrop and registers it as a ToolDefinition in the gscTools array.
    import { gscSearchPerformance } from "./search-performance.js";
    import { gscStrikingDistance } from "./striking-distance.js";
    import { gscTrafficDrop } from "./traffic-drop.js";
    import { gscUrlInspection } from "./url-inspection.js";
    import { gscSitemapList } from "./sitemap-list.js";
    import { gscBrandNonbrand } from "./brand-nonbrand.js";
    import type { ToolDefinition } from "../../types/tool.js";
    
    export const gscTools: ToolDefinition[] = [
      gscSearchPerformance as unknown as ToolDefinition,
      gscStrikingDistance as unknown as ToolDefinition,
      gscTrafficDrop as unknown as ToolDefinition,
      gscUrlInspection as unknown as ToolDefinition,
      gscSitemapList as unknown as ToolDefinition,
      gscBrandNonbrand as unknown as ToolDefinition,
    ];
  • ToolDefinition interface that defines the structure (name, description, schema, handler) used by gscTrafficDrop.
    export interface ToolDefinition<T extends AnyZodObject = AnyZodObject> {
      name: string;
      description: string;
      schema: T;
      handler: (args: z.infer<T>, config: Config) => Promise<ToolResult>;
    }
  • getOAuth2Client helper function that provides authenticated Google API client used by the handler.
    export function getOAuth2Client() {
      const clientId = process.env.GSC_CLIENT_ID;
      const clientSecret = process.env.GSC_CLIENT_SECRET;
    
      if (!clientId || !clientSecret) {
        throw new Error(
          "GSC_CLIENT_ID and GSC_CLIENT_SECRET must be set.\n" +
          "Run: npx @patchwindow/seo-mcp auth gsc\n" +
          "See README for Google Cloud Console setup instructions."
        );
      }
    
      const oauth2 = new google.auth.OAuth2(clientId, clientSecret, GSC_REDIRECT_URI);
    
      const tokens = readTokens();
      if (!tokens) {
        throw new Error(
          "GSC not authenticated. Run: npx @patchwindow/seo-mcp auth gsc"
        );

Behavior: 3/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations provided, so the description carries the burden. It only states 'identify pages or queries with significant traffic drops' without detailing the comparison mechanism, return format, or potential side effects. The parameter descriptions in the schema partially compensate.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is two sentences with no unnecessary words: the first front-loads the purpose and the second provides context. It is efficient and well-structured.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

No output schema exists, and the description does not explain the return format or which metrics are compared. It mentions 'traffic drops' but does not specify whether clicks, impressions, or another metric is measured. The parameter details help fill the gap, but more context would improve completeness.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%: every parameter is documented in the schema. The tool description itself adds no meaning beyond the schema, which meets the baseline for good coverage but no more.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states that the tool compares two date periods and identifies pages or queries with significant traffic drops, using the verbs 'compare' and 'identify'. It is distinguished from sibling tools like gsc_search_performance by its focus on drops.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides use cases, 'diagnosing algorithm updates, technical issues, or content decay', giving context for when to use the tool. It does not explicitly state when not to use it or mention alternatives, but the use cases are relevant.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
