Glama
127,390 tools. Last updated 2026-05-05 15:09

"namespace:io.github.cammac-creator" matching MCP tools:

  • Task management operations. Create and manage task lists and individual tasks within workspaces and shares. Tasks support statuses, priorities, assignees, dependencies, and bulk operations. Responses include `created_by` (user ID of the creator). On shares, requires admin or named member role. Destructive actions: delete-list soft-deletes a task list and all its tasks; delete-task soft-deletes a task.
    Actions & required params:
    - list-lists: profile_type, profile_id (+ optional: sort_by, sort_dir, limit, offset, format)
    - create-list: profile_type, profile_id, name (+ optional: description)
    - list-details: list_id (+ optional: format)
    - update-list: list_id (+ optional: name, description)
    - delete-list: list_id [DESTRUCTIVE]
    - list-tasks: list_id (+ optional: sort_by, sort_dir, status, assignee, limit, offset, format)
    - create-task: list_id, title (+ optional: description, status, priority, assignee_id, dependencies, node_id)
    - task-details: list_id, task_id (+ optional: format)
    - update-task: list_id, task_id (+ optional: title, description, status, priority, assignee_id, dependencies, node_id)
    - delete-task: list_id, task_id [DESTRUCTIVE]
    - change-status: list_id, task_id, status
    - assign-task: list_id, task_id (+ optional: assignee_id)
    - bulk-status: list_id, task_ids, status
    - move-task: list_id, task_id, target_task_list_id — move a task to a different list (+ optional: sort_order)
    - reorder-tasks: list_id, task_ids — reorder tasks within a list
    - reorder-lists: profile_type, profile_id, list_ids — reorder task lists
    - filtered-list: profile_type, profile_id, filter (+ optional: status [required when filter=status], limit, offset, format)
    - summary: profile_type, profile_id (+ optional: format)
    Connector
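The action table above maps naturally onto a client-side dispatch guard. A minimal sketch, assuming the required-param sets transcribed from the description (only a subset of actions is shown; the `validate_call` helper is illustrative and not part of the tool):

```python
# Required-param table for a subset of the task-management actions,
# transcribed from the tool description above. The helper is a
# hypothetical client-side guard, not the server's own validation.
REQUIRED_PARAMS = {
    "list-lists": {"profile_type", "profile_id"},
    "create-list": {"profile_type", "profile_id", "name"},
    "create-task": {"list_id", "title"},
    "change-status": {"list_id", "task_id", "status"},
    "bulk-status": {"list_id", "task_ids", "status"},
    "move-task": {"list_id", "task_id", "target_task_list_id"},
    "delete-list": {"list_id"},             # DESTRUCTIVE
    "delete-task": {"list_id", "task_id"},  # DESTRUCTIVE
}

def validate_call(action: str, params: dict) -> list:
    """Return the sorted list of missing required params (empty = OK)."""
    required = REQUIRED_PARAMS.get(action)
    if required is None:
        raise ValueError(f"unknown action: {action}")
    return sorted(required - params.keys())

missing = validate_call("create-task", {"list_id": "tl_1"})
# missing == ["title"]
```

Checking required params locally avoids burning a round-trip on a call the server would reject anyway.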
  • Step 2 of the MCP donation flow. Required inputs: campaign_id, amount, reasoning, and tx_hash. This tool verifies the on-chain payment by checking the expected network, the USDC token contract, the recipient creator wallet, the declared amount, confirmation status, duplicate tx_hash replay protection, and that the transaction sender matches the calling agent's wallet_address. If verification succeeds, it records the donation, increments campaign funded_amount, and returns donation_id, status 'completed', and tx_hash.
    Connector
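The verification steps the donation tool enumerates read as an ordered checklist. A hedged sketch of that checklist; the field names (`network`, `token_contract`, `confirmed`, and so on) are assumptions for illustration, not the tool's actual schema:

```python
# Illustrative mirror of the documented verification checks:
# network, USDC contract, recipient, amount, confirmation,
# tx_hash replay protection, and sender == agent wallet_address.
def verify_donation(tx: dict, expected: dict, seen_hashes: set) -> list:
    """Return the list of failed checks (empty = verified)."""
    failures = []
    if tx["network"] != expected["network"]:
        failures.append("wrong network")
    if tx["token_contract"] != expected["usdc_contract"]:
        failures.append("not the USDC token contract")
    if tx["recipient"] != expected["creator_wallet"]:
        failures.append("wrong recipient")
    if tx["amount"] != expected["amount"]:
        failures.append("amount mismatch")
    if not tx["confirmed"]:
        failures.append("unconfirmed")
    if tx["hash"] in seen_hashes:
        failures.append("duplicate tx_hash (replay)")
    if tx["sender"] != expected["agent_wallet"]:
        failures.append("sender != agent wallet_address")
    return failures

failures = verify_donation(
    {"network": "base", "token_contract": "0xUSDC", "recipient": "0xCRE",
     "amount": 5, "confirmed": False, "hash": "0xabc", "sender": "0xAGENT"},
    {"network": "base", "usdc_contract": "0xUSDC", "creator_wallet": "0xCRE",
     "amount": 5, "agent_wallet": "0xAGENT"},
    seen_hashes=set())
# failures == ["unconfirmed"]
```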
  • Fetch evidence documents for one campaign. Required input: campaign_id. This tool checks the calling agent's rolling 30-day donation volume against the configured evidence threshold. If the agent is not eligible yet, it returns a structured response with eligibility_status, total_30d, and evidence_threshold. If the agent is eligible and evidence pricing is still inactive (evidence_access_price = 0), it returns evidence_documents directly. If the agent is eligible and evidence pricing is active (evidence_access_price > 0), it returns the canonical x402 handoff shape: status 'payment_required', x402_endpoint, price, and currency. Available documents include document_id, document_type, mime_type, file_size_bytes, submitted_at, status 'available', signed_url, signed_url_expires_at, and file_reference. signed_url is a time-limited URL for fetching file bytes and expires after 15 minutes; agents should use signed_url rather than file_reference. Creator-deleted evidence is returned as a tombstone with document_id, document_type, mime_type, file_size_bytes, submitted_at, status 'removed', deleted_at, signed_url null, signed_url_expires_at null, and file_reference retained for backwards compatibility. zooidfund retains tombstone metadata after file deletion, and agents are responsible for retaining copies of any evidence used in donation decisions.
    Connector
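The evidence tool can return three distinct shapes: not-yet-eligible, direct `evidence_documents`, or an x402 `payment_required` handoff. A minimal dispatcher over those shapes, assuming the field names from the description (the return strings and the function itself are illustrative):

```python
# Hypothetical handler for the three documented response shapes.
def handle_evidence_response(resp: dict) -> str:
    if resp.get("status") == "payment_required":
        # x402 handoff: pay resp["price"] resp["currency"]
        # at resp["x402_endpoint"], then retry.
        return "pay"
    if "evidence_documents" in resp:
        # Prefer signed_url over file_reference; signed URLs expire
        # after 15 minutes. Tombstones carry status "removed".
        live = [d for d in resp["evidence_documents"]
                if d["status"] == "available"]
        return f"fetch:{len(live)}"
    if "eligibility_status" in resp:
        # Below the configured 30-day donation-volume threshold.
        return "ineligible"
    raise ValueError("unrecognized response shape")
```

Since signed URLs expire and deleted evidence becomes a tombstone, an agent that relies on a document for a donation decision should download and retain its own copy immediately.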
  • Spawn a new on-chain $fomox402 round. You become the creator. WHAT IT DOES: invokes the Anchor program's `create_game` instruction, paying the rent for new round-specific PDAs. The calling agent's wallet becomes the round's creator and earns creatorBps of every settled pot for the round's lifetime — including all dividends ratcheting up before settle. WHEN TO USE: when no live round suits your strategy, or when you want to earn a long-term creator share. Each round costs ~0.005 SOL in rent (refunded to the creator on settle).
    DEFAULTS (omit to accept):
    - minBidRaw = '1' (1 raw atomic unit of the chosen token)
    - tokenMint = $fomox402 mint
    - tokenDecimals = 9
    - roundDurationSec = 600 (10 minutes)
    - antiSnipeThresholdSec = 30 (last 30s extends the timer)
    - antiSnipeExtensionSec = 30 (each anti-snipe bid adds 30s)
    - winnerBps = 8000 (80% of pot to last bidder)
    - creatorBps = 500 (5% to creator — that's you)
    - referrerBps = 500 (5% to bidder's referrer if any)
    - devBps = 1000 (10% to staccpad.fun dev wallet)
    Splits MUST sum to 10000 bps.
    RETURNS: { gameId, creator, tx (Solana sig), config: { ...effective defaults } }.
    RELATED: list_games (find existing rounds), place_bid (the first bid is the biggest moat — consider seeding your own round).
    Connector
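The "splits MUST sum to 10000 bps" constraint is worth checking client-side before paying rent on a round. A minimal sketch; the defaults are the documented ones, while the helper itself is an assumed pre-flight check, not part of the program:

```python
# Documented default fee splits for create_game, in basis points.
DEFAULT_SPLITS = {
    "winnerBps": 8000,    # 80% of pot to last bidder
    "creatorBps": 500,    # 5% to creator
    "referrerBps": 500,   # 5% to bidder's referrer, if any
    "devBps": 1000,       # 10% to staccpad.fun dev wallet
}

def check_splits(splits: dict) -> None:
    """Raise if the four splits do not sum to exactly 10000 bps."""
    total = sum(splits.values())
    if total != 10_000:
        raise ValueError(f"splits must sum to 10000 bps, got {total}")

check_splits(DEFAULT_SPLITS)  # 8000 + 500 + 500 + 1000 == 10000
```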
  • Todo checklist operations. List, create, view, update, delete, toggle, and bulk-toggle todos scoped to workspaces and shares. Responses include `created_by` (user ID of the creator). Requires workflow to be enabled on the target entity (workspace action enable-workflow or share action enable-workflow). On shares, requires admin or named member role. Destructive action: delete soft-deletes a todo.
    Actions & required params:
    - list: profile_type, profile_id (+ optional: sort_by, sort_dir, filter_done, limit, offset, format)
    - create: profile_type, profile_id, title (+ optional: assignee_id)
    - details: todo_id (+ optional: format)
    - update: todo_id (+ optional: title, done, assignee_id)
    - delete: todo_id [DESTRUCTIVE]
    - toggle: todo_id
    - bulk-toggle: profile_type, profile_id, todo_ids, done
    - filtered-list: profile_type, profile_id, filter (+ optional: limit, offset, format)
    - summary: profile_type, profile_id (+ optional: format)
    Connector
  • View applications for your listing. Returns each applicant's profile (name, skills, equipment, location, reputation, jobs completed) and their pitch message. Use this to evaluate candidates, then hire with make_listing_offer. Only the listing creator can view applications.
    Connector

Matching MCP Servers

  • A tool that enables AI assistants to conversationally scaffold, build, and publish Python MCP servers to PyPI. It automates the entire development lifecycle, including package naming, tool scaffolding, GitHub repository setup, and package publishing.
    License: MIT (grade A) · Quality: A · Maintenance: C

Matching MCP Connectors

  • IBAN validation, BIC/SWIFT lookup, Swiss BC-Nummer, EMI/vIBAN, SEPA + VoP, compliance scoring.

  • GitHub MCP — wraps the GitHub public REST API (no auth required for public endpoints)

  • [BUY — Agent Step 2] Confirm your USDC payment and claim the listing. Call after sending USDC to the address returned by initiate_agent_purchase. Verifies your on-chain USDC transfer, mints your ERC-1155 NFT, fires ERC-8004 reputation signals for both buyer and seller, distributes revenue to creator and brand, and returns your download URL. Include buyerAgentId (your ERC-8004 agent ID) for an agent-to-agent trust signal on-chain. For physical products you MUST include: shipping_name, shipping_address_line1, shipping_city, shipping_postal_code, shipping_country, shipping_phone, and buyerEmail. shipping_phone is required for delivery confirmation. buyerEmail is required so the buyer receives their order confirmation.
    Connector
  • Search for specific videos within a YouTube channel. Use when the user wants to find a specific video from a known creator, e.g. 'find the video where @mkbhd talks about iPhone', 'did [creator] ever cover [topic]'. Costs 1 credit.
    Connector
  • File upload operations. Chunked uploads via POST /blob sidecar (create session, POST raw binary to /blob, upload chunks with blob_id, finalize), streaming uploads (single-call `stream-upload` that creates a session and streams in one shot — auto-finalizes), web URL imports, batch uploads (many small files in one round-trip via `batch`), and upload configuration. Side effects: finalize/stream/stream-upload/batch create new files that consume storage credits. UPLOAD STRATEGY (default to single-file paths): 1) For files with a URL: use `web-import` (single call). 2) For files with unknown size (generated/piped content): use `stream-upload` — one call creates the session and streams the bytes (auto-finalizes). 3) For files with known size: create-session → POST to /blob → chunk with blob_id → finalize. The POST /blob sidecar is the canonical large-file path — bypasses MCP transport limits, no base64 overhead, up to 100 MB. Specialized: `batch` is for the specific case of uploading multiple small files (≤4 MB each, ≤200 per call, ≤100 MB total) in one round-trip — useful for AI-output bundles, exported CSVs, receipts. Don't reach for `batch` for one-off single uploads or streamed content; use the single-file paths above. BINARY CONTENT IN-BAND: `content` is **text-only** — it is stored verbatim as UTF-8 bytes, so passing a base64 string there will write a base64-encoded text file (NOT the decoded binary). For sandboxed agents that can produce base64 in JSON but cannot reach POST /blob, use `content_base64` on `chunk`/`stream`/`stream-upload`/`batch` entries — the server decodes it before writing. Practical cap is bounded by the MCP transport message size (a few MB); for larger binary files, POST /blob + `blob_id` is still the right path. 
STREAM MODE: When you don't know the file size upfront, prefer the consolidated `stream-upload` action — it accepts profile/parent/filename plus one of content|content_base64|blob_id and handles create-session + stream + auto-finalize internally. The lower-level `create-session` (with stream=true) + `stream` pair is still supported for cases where you need the session ID between calls. MAX_SIZE GUIDANCE: `max_size` is a ceiling on the stream body — exceeding it aborts the upload mid-transfer. **Always overestimate, never undershoot.** There is no penalty for setting it higher than you need. Safest default: omit `max_size` entirely and the server uses your plan's file-size limit. Note: streaming uploads via MCP are also bounded by the `POST /blob` sidecar (100 MB cap per blob) — for larger files, use the chunked flow (`create-session` → `chunk` → `finalize`) instead, and call `upload` action `limits` first to confirm your plan's max file size. POST /blob SIDECAR: The MCP server exposes a `/blob` HTTP endpoint that accepts raw data (no base64, no MCP transport limit, up to 100 MB). The create-session response includes blob_upload with the endpoint URL, your session ID, and a ready-to-use curl command. Blobs expire after 5 minutes and are single-use. OVERWRITE A SPECIFIC NODE: Pass `target_node_id` on create-session or stream-upload to deterministically overwrite a specific node (preserves node_id; new version created). This is the reliable way to update an existing file — don't delete+reupload. When target_node_id is set, parent_node_id is ignored and filename is optional — if omitted, the existing node's current filename is auto-resolved and reused (pass filename only when renaming). BATCH MODE: `batch` uploads up to 200 small files to one target folder in a single round-trip. Hard limits: ≤200 files per call, ≤4 MB per file, ≤100 MB total resolved bytes. 
Any file exceeding 4 MB causes the whole call to be rejected — route those files through the chunked flow (create-session → chunk → finalize). `batch` requires authentication (anonymous callers are rejected with HTTP 401, code 10011); for unauthenticated public-receive/public-exchange share uploads, use the single-file `create-session` path instead. Input: a `files[]` array where each entry has `filename` (required), exactly one of `blob_id`/`content`/`content_base64` (required), and an optional `relative_path` (trailing slash required; auto-normalized; no leading slash, no `.`/`..` segments) to place the file in a sub-folder. The batch endpoint is rate-limited in a bucket independent of single-file `/upload/` — on HTTP 429 the tool surfaces the `x-ve-limit-expires` UTC datetime in the error message along with the remaining `x-ve-limit-avail`/`x-ve-limit-max` quota. SHA-256 is computed client-side by default (set `include_hash: false` to skip). **Partial success is normal**: HTTP 200 with `count_errored > 0` is a successful response — inspect `results[]` per entry and retry only the errored ones. When every entry errors, `all_failed: true` is set on the response. **`node_id` is nullable on success**: workspaces with async storage return `status: "ok"` with `node_id: null` — the storage node is assigned later; this is SUCCESS, not failure. If the final node_id is required, poll `storage` action `list` with the target folder afterward.
    Actions & required params:
    - create-session: profile_type, profile_id, parent_node_id, filename, filesize (+ optional: chunk_size, stream, max_size, target_node_id). When stream=true, filesize is optional. When target_node_id is provided, parent_node_id is ignored and filename is optional (auto-resolved from the existing node).
    - stream-upload: profile_type, profile_id, parent_node_id, filename, content | content_base64 | blob_id (exactly one) (+ optional: max_size, target_node_id, hash, hash_algo). Creates a stream session and uploads in one call. Auto-finalizes. When target_node_id is provided, parent_node_id is ignored and filename is optional (auto-resolved from the existing node).
    - batch: profile_type, profile_id, files[] (1..200 items, each with filename + exactly one of blob_id|content|content_base64, optional relative_path/hash/hash_algo) (+ optional: folder_id, creator, include_hash)
    - chunk: upload_id, chunk_number, content | content_base64 | blob_id (exactly one). Not allowed on stream sessions.
    - stream: upload_id, content | content_base64 | blob_id (exactly one) (+ optional: hash, hash_algo). Only for stream sessions. Auto-finalizes. Prefer `stream-upload` unless you need the session ID between calls.
    - finalize: upload_id. Not needed for stream sessions.
    - status: upload_id (+ optional: wait)
    - cancel: upload_id [DESTRUCTIVE]
    - list-sessions: (none)
    - cancel-all: (none) [DESTRUCTIVE]
    - chunk-status: upload_id (+ optional: chunk_id)
    - chunk-delete: upload_id, chunk_number [DESTRUCTIVE]
    - web-import: profile_type, profile_id, parent_node_id, url (+ optional: filename)
    - web-list: (+ optional: limit, offset, status)
    - web-cancel: upload_id [DESTRUCTIVE]
    - web-status: upload_id
    - limits: (+ optional: action_context, instance_id, file_id, org)
    - extensions: (+ optional: plan)
    - blob-info: (none) — returns POST /blob endpoint URL, session ID, headers, curl example, and workflow for shell-based uploads
    Connector
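The upload strategy described above reduces to a small decision function. A sketch under the documented limits (4 MB per batch file, 200 files per batch, 100 MB per blob); the function name, signature, and return labels are illustrative, not part of the tool:

```python
# Documented limits from the tool description.
BATCH_PER_FILE_CAP = 4 * 1024 * 1024    # 4 MB per file in a batch
BATCH_MAX_FILES = 200                   # files per batch call
BLOB_CAP = 100 * 1024 * 1024            # 100 MB POST /blob cap

def choose_upload_path(*, url=None, size=None, n_files=1):
    """Pick the documented upload path for a given source."""
    if url is not None:
        return "web-import"      # single call; server fetches the URL
    if 1 < n_files <= BATCH_MAX_FILES and (size or 0) <= BATCH_PER_FILE_CAP:
        return "batch"           # many small files, one round-trip
    if size is None:
        return "stream-upload"   # unknown size: one-call auto-finalizing stream
    # Known size: create-session → POST /blob → chunk → finalize.
    # Oversize batch entries and >100 MB files also land here.
    return "chunked"
```

Per the description, `content` is text-only (base64 passed there is stored as base64 text, not decoded bytes), so sandboxed agents that cannot reach POST /blob should use `content_base64` on whichever path this helper selects.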
  • Create a structured prediction market comparing two competitors on a specific metric (volume, trades, price, unique_traders). 0.5 SOL seed, creator fee tiers (NEW 25%, PROVEN 35%, ELITE 50%), automated data resolution. For freeform claim-based markets use create_tmb_battle instead (0.1 SOL seed, any statement, tribunal-resolved).
    Connector
  • Fetch full details for a single MCP server by its slug. Returns description, install commands, security score/risk/findings, ratings, creator info, setup requirements (API keys/credentials the user will need), and the list of MCP tools the server exposes. The `security.critical_findings` array lists every severity=critical|high issue — you MUST show these to the user before recommending they install. Use this as the last step before any install recommendation.
    Connector
  • Resolve a YouTube @handle, channel URL, or video link to get channel information. Use when the user mentions ANY YouTuber, creator, channel name, or @handle. Also use to identify which channel uploaded a specific video. Free, no credits consumed.
    Connector
  • Get Instagram Profile. Fetches a public Instagram profile by username (handle) and returns full name, biography, external link, profile picture URL, followers count, following count, posts count, verified/private flags, and category. Use to enrich CRM/lead records, verify influencer reach before outreach, monitor competitor accounts, or build datasets of creator metadata for vetting and analytics.
    Connector
  • Edit metadata (title, creator, external ID). Use new_external_id to re-link to the correct database entry, then call re_enrich to fetch the correct metadata.
    Connector
  • Retrieve a contract by ID, SHA-256 hash, or UUID. With a valid API key (contract creator): returns the full contract including human-readable text, machine-readable JSON, status, and principal declaration. Without authentication: returns metadata only (contract_id, status, hash, dates).
    Supports three lookup formats:
    - Contract ID: "amb-2026-0042"
    - SHA-256 hash: 64-character hex string
    - UUID: standard UUID format
    Args:
    - id (string, required): Contract ID, SHA-256 hash, or UUID
    Returns: full contract (if authorized) or a metadata-only response. Legibility: retrieval preserves the dual-format pairing — prose and JSON always replay to the same SHA-256.
    Connector
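A caller can tell the three lookup formats apart before issuing the request. A hedged sketch; the regexes are assumptions inferred from the examples in the description (in particular, the contract-ID pattern is generalized from the single example "amb-2026-0042"):

```python
import re

# 64 hex chars → SHA-256; standard 8-4-4-4-12 hex groups → UUID;
# "amb-YYYY-NNNN" pattern inferred from the documented example.
SHA256_RE = re.compile(r"^[0-9a-fA-F]{64}$")
UUID_RE = re.compile(
    r"^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
    r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$")
CONTRACT_ID_RE = re.compile(r"^amb-\d{4}-\d{4}$")  # assumption

def classify_lookup(ident: str) -> str:
    """Label an identifier as contract_id, sha256, uuid, or unknown."""
    if SHA256_RE.match(ident):
        return "sha256"
    if UUID_RE.match(ident):
        return "uuid"
    if CONTRACT_ID_RE.match(ident):
        return "contract_id"
    return "unknown"

classify_lookup("amb-2026-0042")  # → "contract_id"
```

The SHA-256 check runs first because a 64-hex string can never collide with the other two formats, while a malformed ID should fall through to "unknown" rather than be sent to the server.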
  • Search the USER'S COLLECTION by title, creator, genre, or theme. Returns items they own with status, ratings, and for box sets, the list of contained albums. Use this to answer questions about what the user has.
    Connector
  • List all MCP servers by a single creator, plus aggregate trust signals. Use to evaluate a publisher holistically: 'do they ship consistently?', 'what's their security track record?', 'are there other servers by the same author?'. Match is case-insensitive on display name. Returns aggregate stats (total servers, avg security score, grade distribution, critical-finding count) plus the per-server list.
    Connector
  • Retrieves a creator's profile and performance statistics including followers, average views, view rates, engagements, engagement rate, average likes, comments, shares, and saves. Looks up the creator by their social media handle and platform type. If the creator is not yet in the database but exists on a supported platform (Instagram, TikTok, YouTube), it will be discovered and saved automatically.
    Connector