create_digest_watch

Register a scheduled webhook to receive daily or weekly curated summaries of pricing changes. Set-and-forget monitoring that provides periodic snapshots without requiring real-time subscriptions. Ideal for tracking pricing trends.

Instructions

Register a scheduled digest webhook that fires daily or weekly with a curated summary of pricing changes (regardless of whether anything dramatic happened). Costs 1 credit. Watch lives 90 days. Set-and-forget for agents that want a periodic snapshot without subscribing to realtime transitions.

Input Schema

Name          Required  Description                                 Default
cadence       Yes       How often the digest fires                  —
callback_url  Yes       HTTPS URL to POST to when the digest fires  —
secret        No        Optional HMAC shared secret                 —
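For example, a minimal set of arguments an agent might pass to this tool. The callback URL and secret here are placeholders, not real values:

```python
# Hypothetical example arguments for create_digest_watch.
digest_watch_args = {
    "cadence": "weekly",  # must be 'daily' or 'weekly'
    "callback_url": "https://example.com/hooks/tensorfeed",  # HTTPS endpoint
    "secret": "shared-hmac-secret",  # optional; enables signature verification
}
```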

Implementation Reference

  • The MCP server tool registration and handler for 'create_digest_watch'. Defines the tool with cadence/callback_url/secret schema, calls fetchJSON to POST /premium/watches, and returns a confirmation text response.
    server.tool(
      'create_digest_watch',
      'Register a scheduled digest webhook that fires daily or weekly with a curated summary of pricing changes (regardless of whether anything dramatic happened). Costs 1 credit. Watch lives 90 days. Set-and-forget for agents that want a periodic snapshot without subscribing to realtime transitions.',
      {
        cadence: z.enum(['daily', 'weekly']).describe('How often the digest fires'),
        callback_url: z.string().describe('HTTPS URL to POST to when the digest fires'),
        secret: z.string().optional().describe('Optional HMAC shared secret'),
      },
      async ({ cadence, callback_url, secret }) => {
        const body: Record<string, unknown> = {
          spec: { type: 'digest', cadence },
          callback_url,
        };
        if (secret !== undefined) body.secret = secret;
        const data = (await fetchJSON('/premium/watches', { method: 'POST', body, auth: true })) as {
          watch: { id: string; expires_at: string };
          billing?: { credits_remaining?: number };
        };
        return {
          content: [
            {
              type: 'text' as const,
              text: `Created ${cadence} digest watch ${data.watch.id} (expires ${data.watch.expires_at}). First fire at the next 7am UTC daily cron. Credits remaining: ${data.billing?.credits_remaining ?? '?'}`,
            },
          ],
        };
      },
    );
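  • On the receiving side, the fire payload's match.type is 'digest', so a consumer can route digest fires separately from realtime watch fires. A minimal sketch; only match.type is documented, and any other payload fields are hypothetical:

```python
# Routing sketch for a watch-fire payload. Only match.type == 'digest'
# is documented for this tool; other payload fields are hypothetical.
def route_fire(payload: dict) -> str:
    """Return which handler a watch fire should be dispatched to."""
    match_type = payload.get("match", {}).get("type")
    if match_type == "digest":
        return "digest"    # scheduled daily/weekly summary
    return "realtime"      # e.g. price-transition watches

example_payload = {"match": {"type": "digest"}}
```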
  • The fetchJSON helper function used by the handler to make authenticated API requests to the TensorFeed API.
    async function fetchJSON(path: string, opts: FetchOptions = {}): Promise<unknown> {
      const headers: Record<string, string> = {
        'User-Agent': `TensorFeed-MCP/${SDK_VERSION}`,
      };
      if (opts.body !== undefined) headers['Content-Type'] = 'application/json';
      if (opts.auth) {
        const token = process.env.TENSORFEED_TOKEN;
        if (!token) {
          throw new Error(
            'TENSORFEED_TOKEN env var is not set. Premium MCP tools require a bearer token. ' +
              'Buy credits at https://tensorfeed.ai/developers/agent-payments and pass the returned tf_live_... token via the TENSORFEED_TOKEN env var in your MCP client config.',
          );
        }
        headers['Authorization'] = `Bearer ${token}`;
      }
      const res = await fetch(`${API_BASE}${path}`, {
        method: opts.method ?? 'GET',
        headers,
        ...(opts.body !== undefined ? { body: JSON.stringify(opts.body) } : {}),
      });
      if (!res.ok) {
        let errPayload: unknown;
        try {
          errPayload = await res.json();
        } catch {
          errPayload = await res.text().catch(() => '');
        }
        if (res.status === 402) {
          throw new Error(
            `Payment required (402). Your token may be out of credits. Top up at https://tensorfeed.ai/developers/agent-payments. Detail: ${JSON.stringify(errPayload)}`,
          );
        }
        if (res.status === 401) {
          throw new Error(
            `Token rejected (401). Check that TENSORFEED_TOKEN is set to a valid tf_live_... token. Detail: ${JSON.stringify(errPayload)}`,
          );
        }
        throw new Error(`API error ${res.status}: ${JSON.stringify(errPayload)}`);
      }
      // Parse and return the JSON body on success.
      return res.json();
    }
  • Zod schema for the tool's input parameters: cadence (enum daily/weekly), callback_url (string), and optional secret (string).
    {
      cadence: z.enum(['daily', 'weekly']).describe('How often the digest fires'),
      callback_url: z.string().describe('HTTPS URL to POST to when the digest fires'),
      secret: z.string().optional().describe('Optional HMAC shared secret'),
    },
  • JavaScript SDK method createDigestWatch - delegates to createWatch with spec type 'digest'.
    async createDigestWatch(options: {
      cadence: 'daily' | 'weekly';
      callbackUrl: string;
      secret?: string;
      fireCap?: number;
    }): Promise<WatchCreateResponse> {
      return this.createWatch({
        spec: { type: 'digest', cadence: options.cadence },
        callbackUrl: options.callbackUrl,
        secret: options.secret,
        fireCap: options.fireCap,
      });
    }
  • Python SDK method create_digest_watch - delegates to create_watch with spec type 'digest'.
    def create_digest_watch(
        self,
        *,
        cadence: str,
        callback_url: str,
        secret: str | None = None,
        fire_cap: int | None = None,
    ) -> dict[str, Any]:
        """Register a scheduled digest watch.
    
        Costs 1 credit at registration. Fires on a fixed cadence (daily
        or weekly) with a curated summary of pricing changes,
        added/removed models, and total change count for the period,
        regardless of whether anything dramatic happened. Set-and-forget
        for agents that want a periodic snapshot without subscribing to
        realtime transitions.
    
        First fire happens at the next daily 7am UTC cron after
        registration. Daily watches re-fire roughly every 24 hours;
        weekly watches re-fire roughly every 7 days. A small slack
        window (1h for daily, 12h for weekly) absorbs cron drift.
    
        Same delivery contract as ``create_watch``: HMAC-SHA256 signed
        POST to callback_url with X-TensorFeed-Signature and
        X-TensorFeed-Watch-Id headers. The fire payload's
        ``match.type`` is ``'digest'`` so consumers can route on it.
    
        Args:
            cadence: 'daily' or 'weekly'.
            callback_url: HTTPS URL to POST to on each fire.
            secret: Optional HMAC shared secret.
            fire_cap: Max fires before auto-disable (default 100).
    
        Returns:
            Same shape as ``create_watch``: dict with ``watch`` (full
            record) and ``billing``.
    
        Raises:
            ValueError: if no token is set on the client
            PaymentRequired: if the token has insufficient credits
            TensorFeedError: 400 on invalid cadence or callback URL
        """
        return self.create_watch(
            spec={"type": "digest", "cadence": cadence},
            callback_url=callback_url,
            secret=secret,
            fire_cap=fire_cap,
        )
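  • Deliveries are HMAC-SHA256 signed POSTs carrying an X-TensorFeed-Signature header. A receiver-side verification sketch, assuming the signature is the hex-encoded HMAC-SHA256 of the raw request body under the watch's shared secret (the exact encoding is an assumption; confirm against the API docs):

```python
import hashlib
import hmac

def verify_signature(raw_body: bytes, signature: str, secret: str) -> bool:
    """Check an X-TensorFeed-Signature value against the raw POST body.

    Assumes the signature is the hex HMAC-SHA256 of the body under the
    shared secret registered with the watch.
    """
    expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected, signature)
```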
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations, the description discloses key behaviors: costs 1 credit, watch lasts 90 days, fires with curated summary regardless of dramatic changes. Missing details on whether it's a write operation, but sufficient for a simple registration tool.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Three concise sentences with front-loaded purpose, followed by cost, lifetime, and use case. No redundant information.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Covers main aspects (cadence, webhook nature, cost, lifetime) but lacks mention of return value (e.g., watch ID) or webhook payload structure. Adequate for a tool with no output schema.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema covers all parameters with 100% description coverage. Description adds context like 'curated summary' but does not provide additional semantics beyond schema (e.g., format of callback_url). Meets baseline but no extra value.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

Clearly states the tool registers a scheduled digest webhook for pricing changes with daily/weekly cadence. Distinguishes from realtime alternatives (set-and-forget) and mentions it fires regardless of drama, differing from siblings like create_price_watch.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

Implicitly tells agents to use this for periodic snapshots instead of realtime transitions. Notes cost and lifetime, but does not explicitly name alternative tools or state when not to use.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
