Glama
114,452 tools. Last updated 2026-04-21 12:55
  • Deploy a project to the staging environment. This triggers: (1) Schema validation, (2) Docker image build, (3) GitHub commit, (4) Kubernetes deployment, (5) Database migrations. The operation is ASYNCHRONOUS - it returns immediately with a job_id. Use get_job_status with the job_id to monitor progress. Deployment typically takes 2-5 minutes depending on schema complexity. If deployment fails, check: (1) Schema format is FLAT (no 'fields' nesting), (2) Every field has a 'type' property, (3) Foreign keys reference existing tables, (4) No PostgreSQL reserved words in table/field names. Use get_project_info to see if the deployment succeeded.
    Connector
  • Check the status of a deployment job. STATUS VALUES: pending (job queued), running (deployment in progress), completed (success), failed (deployment failed). TIMELINE: Typical deployment takes 2-5 minutes. If status is 'running' for >10 minutes, check get_project_info for detailed pod status. If status is 'failed', use get_project_info to see deployment errors and check schema format (must be FLAT, no 'fields' nesting).
    Connector
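The deploy/status pair above follows a common asynchronous pattern: start the job, get a `job_id` back immediately, then poll until a terminal status. A minimal sketch in Python — `deploy_project` and `get_job_status` are stand-ins for the MCP tools, stubbed here so the sketch runs on its own:

```python
import time

# Stand-ins for the MCP tools described above; a real client would
# invoke them through its MCP connector.
def deploy_project(project_id):
    return {"job_id": "job-123"}  # asynchronous: returns immediately

_states = iter(["pending", "running", "completed"])  # simulated progress
def get_job_status(job_id):
    return {"job_id": job_id, "status": next(_states)}

def deploy_and_wait(project_id, poll_interval=0.0, timeout=600.0):
    """Start a deployment, then poll until a terminal status or timeout."""
    job_id = deploy_project(project_id)["job_id"]
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_job_status(job_id)["status"]
        if status in ("completed", "failed"):
            return status
        time.sleep(poll_interval)  # a real client would wait 10-30s here
    return "timeout"

result = deploy_and_wait("my-project")
```

On `failed`, the description says to follow up with `get_project_info` to inspect deployment errors and the schema format.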
  • Deploy a graph project to the staging environment. This triggers: (1) Schema validation, (2) Neo4j entity code generation, (3) Docker image build, (4) GitHub commit, (5) Kubernetes deployment with Neo4j instance. The operation is ASYNCHRONOUS — returns immediately with a job_id. Use get_job_status to monitor progress. Deployment typically takes 2-5 minutes. Use get_graph_project_info to verify deployment succeeded.
    Connector
  • Generate a proof-of-funds letter (PDF) for the authenticated user. Requires completed identity verification and at least one verified wallet. Returns a download link valid for 30 days. The response renders an inline widget with a thumbnail preview and download/share controls; you do not need to repeat the download URL in your text response — the widget handles presentation.
    Connector
  • Get the latest full snapshot of all tracked stablecoins with per-issuer breakdown. Returns each stablecoin's total_supply_usd, issuer, and per-chain deployment data.
    Connector
  • Show current deployment status of all groups — which version is live, last deployment type, and deployer.
    Connector

Matching MCP Servers

Matching MCP Connectors

  • GitHub MCP — wraps the GitHub public REST API (no auth required for public endpoints)

  • Connect your video workflows to cloud storage. Organize and access video assets across projects wi…

  • Generate a starter harness from a user's public GitHub profile. Reads their repos, topics, commit style, and pinned projects, synthesizes a first-draft CLAUDE.md / AGENTS.md / rules file for the chosen target. The agent writes the returned files to the user's machine. No OAuth required; uses unauthenticated public-repo data. Optional gh_token arg increases GitHub rate limits.
    Connector
  • Deploy a solution FROM its GitHub repo. Reads .ateam/export.json + connector source from the repo and feeds it into the deploy pipeline. Use this to restore a previous version or deploy from GitHub as the source of truth.
    Connector
  • List all projects in the workspace associated with the API key. Returns a list of projects with id, name, type, image count, and more.
    Connector
  • Send a reply message to a ticket in Teamwork Desk by specifying the ticket ID and message body. Useful for automating ticket responses, integrating external communication systems, or customizing support workflows.
    Connector
  • Get the public profile of a GitHub user. Returns login, name, bio, company, location, public repos count, followers, and more.
    Connector
  • Create a new ticket in Teamwork Desk by specifying subject, description, priority, and status. Useful for automating ticket creation, integrating external systems, or customizing support workflows.
    Connector
  • Look up Node.js package information from NPM registry. Returns latest version, download statistics (weekly/monthly), dependency list, package description, license, and GitHub link. Use for evaluating JavaScript libraries, checking maintenance status, or reviewing package popularity.
    Connector
  • Search GitHub repositories by keyword to discover code, projects, and libraries. Returns matching repositories with star count, description, language, and URL. Use for finding libraries, examples, or competitive projects in specific domains.
    Connector
  • Build and deploy a governed AI Team solution in one step. ⚠️ HEAVIEST OPERATION (60-180s): validates solution+skills → deploys all connectors+skills to A-Team Core (regenerates MCP servers) → health-checks → optionally runs a warm test → auto-pushes to GitHub. AUTO-DETECTS GitHub repo: if you omit mcp_store and a repo exists, connector code is pulled from GitHub automatically. First deploy requires mcp_store. After that, write files via ateam_github_write, then just call build_and_run without mcp_store. For small changes to an already-deployed solution, prefer ateam_patch (faster, incremental). Requires authentication.
    Connector
  • Rollback a project to a previous version. ⚠️ WARNING: This reverts schema AND code to the specified commit. Database data is NOT rolled back. Use get_version_history to find the commit SHA of the version you want to rollback to. After rollback, use get_job_status to monitor the redeployment. Rollback is useful when a schema change breaks deployment.
    Connector
  • Return annual-accounts filings (financial statements) for a company. Convenience wrapper over `list_filings(category='accounts')` that normalizes the fiscal-period shape across registries and pre-computes the download URL so callers don't need a second `get_document_metadata` round-trip. Each item has `period_end` (fiscal-period end date, the primary sort key a user thinks in), optional `period_start` / `registration_date`, a `document_id` that can be passed to `fetch_document`, `document_format` (e.g. XBRL XML, XHTML, PDF — may be empty when the upstream negotiates format on fetch), `source_url` for direct download, and `jurisdiction_data` carrying raw upstream fields verbatim. Results are newest-first. Filters: `year=YYYY` keeps periods ending in that calendar year; `period_end=YYYY-MM-DD` pinpoints a single period (takes precedence over `year`). `limit` caps the post-filter slice — omit to return all matches. The whole accounts history is walked per query because late-filed amendments can land out of order. If the adapter doesn't implement `list_filings` at all, this returns 501. Per-country caveats (ID format, document format availability, whether bodies are paid) — call `list_jurisdictions({jurisdiction:"<code>"})`.
    Connector
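The filter semantics above (`period_end` takes precedence over `year`, results newest-first, `limit` applied after filtering) can be sketched as plain Python. This is an illustrative reimplementation of the described behavior, not the connector's own code:

```python
# Sample filings, newest-first by fiscal-period end date.
filings = [
    {"period_end": "2023-12-31"},
    {"period_end": "2022-12-31"},
    {"period_end": "2022-06-30"},
]

def filter_accounts(items, year=None, period_end=None, limit=None):
    """Apply the documented filter precedence: period_end beats year."""
    if period_end:
        items = [f for f in items if f["period_end"] == period_end]
    elif year:
        items = [f for f in items if f["period_end"].startswith(str(year))]
    # Newest-first, sorted after filtering because late-filed
    # amendments can land out of order upstream.
    items = sorted(items, key=lambda f: f["period_end"], reverse=True)
    return items[:limit] if limit is not None else items

result_2022 = filter_accounts(filings, year=2022)
```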
  • Returns a curated list of example plans with download links for reports and zip bundles. Use this to preview what PlanExe output looks like before creating your own plan. Especially useful when the user asks what the output looks like before committing to a plan. No API key required.
    Connector
  • Fetch the next page of a large tool response. Use the nextCursor from _pagination in a previous response. This tool loads data into the context window — prefer the artifact download URL when available.
    Connector
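Draining a cursor-paginated response like the one above usually looks like the following loop. The page shapes and the `fetch_page` helper are assumptions for illustration; only the `_pagination.nextCursor` field comes from the description:

```python
# Two simulated pages keyed by cursor; None is the first request.
_pages = {
    None: {"items": [1, 2], "_pagination": {"nextCursor": "c1"}},
    "c1": {"items": [3], "_pagination": {"nextCursor": None}},
}

def fetch_page(cursor=None):  # stand-in for the MCP pagination tool
    return _pages[cursor]

def fetch_all():
    """Follow nextCursor until the server stops returning one."""
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)
        items.extend(page["items"])
        cursor = page["_pagination"].get("nextCursor")
        if not cursor:  # no more pages
            return items

all_items = fetch_all()
```

As the description warns, every page lands in the context window, so prefer the artifact download URL for very large responses.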
  • Create a new sncro session. Returns a session key and secret. Args: project_key: The project key from CLAUDE.md (registered at sncro.net) git_user: The current git username (for guest access control). If omitted or empty, the call is treated as a guest session — allowed only when the project owner has "Allow guest access" enabled. brief: If True, skip the first-run briefing (tool list, tips, mobile notes) and return a compact response. Pass this on the second and subsequent create_session calls in the same conversation, once you already know how to use the tools. After calling this, tell the user to paste the enable_url in their browser. Then use the returned session_key and session_secret with all other sncro tools. If no project key is available: tell the user to go to https://www.sncro.net/projects to register their project and get a key. It takes 30 seconds — sign in with GitHub, click "+ Add project", enter the domain, and copy the project key into CLAUDE.md.
    Connector
  • Deletes a stream, specified by the provided resource 'name' parameter. * The resource 'name' parameter is in the form: 'projects/{project name}/locations/{location}/streams/{stream name}', for example: 'projects/my-project/locations/us-central1/streams/my-streams'. * This tool returns a long-running operation. Use the 'get_operation' tool with the returned operation name to poll its status until it completes. Operation may take several minutes; do not check more often than every ten seconds.
    Connector
  • Bulk create subnames under a parent ENS name. Designed for agent fleet deployment — create identities like agent001.company.eth, agent002.company.eth, etc. Each subname can have its own owner and records (addresses, text records). Uses the ENS NameWrapper for subname creation. Returns complete transaction recipes (contract address, encoded calldata, gas estimates) for each subname. Your wallet signs and broadcasts the transactions. Subnames are free to create — only gas costs apply.
    Connector
  • Find working SOURCE CODE examples from 27 indexed Senzing GitHub repositories. Indexes only source code files (.py, .java, .cs, .rs) and READMEs — NOT build files (Cargo.toml, pom.xml), data files (.jsonl, .csv), or project configuration. For sample data, use get_sample_data instead. Covers Python, Java, C#, and Rust SDK usage patterns including initialization, record ingestion, entity search, redo processing, and configuration. Also includes message queue consumers, REST API examples, and performance testing. Supports three modes: (1) Search: query for examples across all repos, (2) File listing: set repo and list_files=true to see all indexed source files in a repo, (3) File retrieval: set repo and file_path to get full source code. Use max_lines to limit large files. Returns GitHub raw URLs for file retrieval — fetch to read the source code.
    Connector
  • List all generated reports with status and summary info. Returns an array of report objects with id, report_type, status, title, and summary. Use the report id with atlas_get_report for details or atlas_download_report to download completed PDFs. Free.
    Connector
  • [BUY — Agent Step 2] Confirm your USDC payment and claim the drop. Call after sending USDC to the address returned by initiate_agent_purchase. Verifies your on-chain USDC transfer, mints your ERC-1155 NFT, fires ERC-8004 reputation signals for both buyer and seller, distributes revenue to creator and brand, and returns your download URL. Include buyerAgentId (your ERC-8004 agent ID) for an agent-to-agent trust signal on-chain.
    Connector
  • Generate a Record of Art Identity (RAI) PDF for a work. Returns a download URL and UWI (Universal Work Identifier). The RAI is the signed, portable, verifiable identity of the work. Use search_natural_language to find the work_id by title — never ask the user for it. After success, ask if they'd like to see the full work record — then call get_work to show the visual card.
    Connector
  • Return specific pages of a PDF in one of three formats: • format='pdf' — pdf-lib page slice, preserves the original text layer and fonts (no re-encoding). This is the ONLY format that gives you byte-exact, citation-grade content. Use this for financial numbers, legal quotes, and any answer requiring precision. • format='text' — raw extracted text from pdfjs. Machine-readable but NOT authoritative — OCR errors on bad-quality text layers can silently garble digits. Use only for summarisation / light reading, and cross-check numbers by re-fetching with format='pdf'. • format='png' — page rasterization via Cloudflare Browser Rendering, for documents with text_layer='none' (scanned PDFs). Phase 6 — may return 'not implemented' in current deployment. The response includes at most 100 pages (Anthropic document-block hard cap). Split larger ranges into multiple calls. Requires the document's bytes to already be cached — call fetch_document on the full document first if this is a new filing.
    Connector
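The format guidance and the 100-page cap above can be condensed into two small helpers. Both names are illustrative, not part of the tool's API:

```python
def choose_format(purpose, text_layer="full"):
    """Pick a fetch format per the documented trade-offs."""
    if text_layer == "none":
        return "png"   # scanned PDFs need rasterization
    if purpose in ("citation", "financial", "legal"):
        return "pdf"   # the only byte-exact, citation-grade format
    return "text"      # light reading only; digits may be garbled

def page_ranges(start, end, cap=100):
    """Split a page range into chunks of at most `cap` pages
    (the per-response document-block hard cap)."""
    return [(a, min(a + cap - 1, end)) for a in range(start, end + 1, cap)]
```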
  • Confirm a narrative lens and generate targeted CV edits with trade-offs (5 credits, takes 20-30s). Returns an array of section edits with before/after text, trade-off notes, and optionally clean + review PDF download URLs. This is step 3 (final step) of the positioning pipeline. Pass confirmed_lens from ceevee_analyze_positioning, and optionally positioning_snapshot, detected_lens_full, recruiter_inference, selected_opportunities from prior steps for richer edits. Use ceevee_explain_change to understand any specific edit.
    Connector
  • List active projects the authenticated user has access to. By default, only projects with an active status (CUSTOMER, PITCH, TRIAL, ONBOARDING, API_PARTNER) are returned. Set include_inactive to true to include ended/paused projects. Returns columnar JSON: {columns, rows, rowCount}. Columns: id, name, status. The id is used as project_id in other tools. Call this first to discover available projects.
    Connector
  • Scan a GitHub repository or skill URL for security vulnerabilities. This tool performs static analysis and AI-powered detection to identify: - Hardcoded credentials and API keys - Remote code execution patterns - Data exfiltration attempts - Privilege escalation risks - OWASP LLM Top 10 vulnerabilities Requires a valid X-API-Key header. Cached results (24h) do not consume credits. Args: skill_url: GitHub repository URL (e.g., https://github.com/owner/repo) or raw file URL to scan Returns: ScanResult with security score (0-100), recommendation, and detected issues. Score >= 80 is SAFE, 50-79 is CAUTION, < 50 is DANGEROUS. Example: scan_skill("https://github.com/anthropics/anthropic-sdk-python")
    Connector
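The score bands above map directly to the three recommendations. A sketch of that mapping — the function name is illustrative; only the thresholds come from the description:

```python
def classify_score(score):
    """Map a 0-100 security score to the documented bands."""
    if score >= 80:
        return "SAFE"
    if score >= 50:
        return "CAUTION"
    return "DANGEROUS"
```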
  • Download a completed report as PDF. Returns base64-encoded PDF content. Confirm report status='completed' via atlas_get_report(report_id) first. report_id from atlas_start_report response or atlas_list_reports. Free.
    Connector
  • Get state-level broadband availability summary. Returns aggregated broadband statistics for the state including provider counts and technology deployment. Useful for BEAD program analysis to identify states with significant unserved/underserved populations. Args: state_fips: 2-digit state FIPS code (e.g. '53' for Washington, '11' for DC). Always a string, never an integer. speed_download: Minimum download speed threshold in Mbps (default 25). speed_upload: Minimum upload speed threshold in Mbps (default 3). as_of_date: BDC filing date in YYYY-MM-DD format (default 2024-06-30).
    Connector
  • Get broadband providers and availability at a specific lat/lon location. Returns a list of broadband providers serving the location with their advertised download/upload speeds and technology types. Includes BEAD classification (unserved/underserved/served) based on max available speeds. NOTE: The FCC Broadband Map API has bot protection and may reject requests. If you get an error, the API endpoint may have changed. The FCC updates this API frequently without notice. Args: latitude: Location latitude (e.g. 38.8977 for Washington DC). longitude: Location longitude (e.g. -77.0365 for Washington DC). technology_code: Filter by technology (0=All, 10=Copper, 40=Cable, 50=Fiber, 60=Satellite, 70=Fixed Wireless). speed_download: Minimum download speed in Mbps (default 25). speed_upload: Minimum upload speed in Mbps (default 3). as_of_date: BDC filing date in YYYY-MM-DD format (default 2024-06-30).
    Connector
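Both broadband tools insist that `state_fips` is a 2-digit string, never an integer. A hypothetical normalizer a caller might run before invoking the tool:

```python
def normalize_fips(value):
    """Coerce a FIPS code into the required 2-digit string form."""
    s = str(value).zfill(2)  # 9 or '9' -> '09'
    if len(s) != 2 or not s.isdigit():
        raise ValueError("state_fips must be a 2-digit code like '53'")
    return s
```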
  • Generate SDK scaffold code for common workflows. Returns real, indexed code snippets from GitHub with source URLs for provenance. Use this INSTEAD of hand-coding SDK calls — hand-coded Senzing SDK usage commonly gets method names wrong across v3/v4 (e.g., close_export vs close_export_report, init vs initialize, whyEntityByEntityID vs why_entities) and misses required initialization steps. Languages: python, java, csharp, rust. Workflows: initialize, configure, add_records, delete, query, redo, stewardship, information, full_pipeline (aliases accepted: init, config, ingest, remove, search, redoer, force_resolve, info, e2e). V3 supports Python and Java only. Returns GitHub raw URLs — fetch each snippet to read the source code.
    Connector
  • [BUY — Step 2] Complete the purchase by submitting the signed EIP-712 permit from initiate_purchase. Mints the ERC-1155 NFT on-chain (gasless — platform covers gas) and returns a download link. For physical products, you MUST include shipping address fields. The response includes revenue split details.
    Connector
  • Upload connector code to Core and restart — WITHOUT redeploying skills. Use this to update connector source code (server.js, UI assets, plugins) quickly. Set github=true to pull files from the solution's GitHub repo, or pass files directly. Much faster than ateam_build_and_run for connector-only changes.
    Connector
  • Check the status of a transcribe or summarize job. Returns the current state and, when completed, presigned download URLs for each output file. Optionally pass `format` (srt, txt, vtt, json) to get the transcript content inline — useful when you need the text directly without fetching a URL. Poll this periodically after calling complete_upload — wait at least 60 seconds between checks. For files under 10 minutes, jobs usually complete within 1-2 minutes. For long files (1hr+), expect 10-30 minutes. Download URLs are presigned and time-limited (1 hour); fetch them when needed rather than caching long-term. Also use this to recover from lost state: if the original challenge was lost, call get_job_status(job_id) to retrieve a fresh challenge (status "awaiting_payment") or the upload URL (status "awaiting_upload").
    Connector
  • Edit a file in the solution's GitHub repo and commit. Two modes: 1. FULL FILE: provide `content` — replaces entire file (good for new files or small files) 2. SEARCH/REPLACE: provide `search` + `replace` — surgical edit without sending full file (preferred for large files like server.js) Always use search/replace for large files (>5KB). Always read the file first with ateam_github_read to get the exact text to search for.
    Connector
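The two edit modes above suggest a simple dispatch rule: full-file `content` for new or small files, `search`/`replace` for anything large. A sketch with `github_write` as a local stand-in for `ateam_github_write`:

```python
calls = []  # records what would be sent to the connector

def github_write(path, **kwargs):  # stand-in for ateam_github_write
    calls.append({"path": path, **kwargs})

def edit_file(path, current_size, search=None, replace=None, content=None):
    """Prefer search/replace for large files (>5KB), per the guidance."""
    if current_size > 5 * 1024 and content is not None:
        raise ValueError("use search/replace for files over 5KB")
    if search is not None:
        github_write(path, search=search, replace=replace)
    else:
        github_write(path, content=content)

edit_file("server.js", 20_000, search="old()", replace="new()")
```

As the description says, read the file first (`ateam_github_read`) so the `search` text matches exactly.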
  • Lists all projects accessible by the user. Call this function first to discover available projects.
    Connector
  • Delete an instance from a project. The request requires the 'name' field to be set in the format 'projects/{project}/instances/{instance}'. Example: { "name": "projects/my-project/instances/my-instance" } Before executing the deletion, you MUST confirm the action with the user by stating the full instance name and asking for "yes/no" confirmation.
    Connector
  • Retrieve a list of all AWS regions. ## Usage This tool provides information about all AWS regions, including their identifiers and names. ## When to Use - When planning global infrastructure deployments - To validate region codes for other API calls - To get a complete AWS regional inventory ## Result Interpretation Each region result includes: - region_id: The unique region code (e.g., 'us-east-1') - region_long_name: The human-friendly name (e.g., 'US East (N. Virginia)') ## Common Use Cases 1. Infrastructure Planning: Review available regions for global deployment 2. Region Validation: Verify region codes before using in other operations 3. Regional Inventory: Get a complete list of AWS's global infrastructure
    Connector
  • Gets the status of a long-running operation. ***Usage*** Some tools (for example, `run_stream`) return a long-running operation. You can use this tool to get the status of the operation. It can be called repeatedly until the operation is complete. **Parameters** * `name`: The name of the operation to get. * `name` should be the name returned by the tool that initiated the operation. * `name` should be in the format of: `projects/{project}/locations/{location}/operations/{operation}`. **Returns** * An `Operation` object that contains the status of the operation. * If the operation is not complete, the response will be empty. Do not check more often than every ten seconds. * If the operation is complete, the response will contain either: * A `response` field that contains the result of the operation and indicates that it was successful. * An `error` field that indicates any errors that occurred during the operation.
    Connector
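Polling `get_operation` until a `response` or `error` field appears looks like the loop below. `get_operation` is stubbed here so the sketch runs standalone; a real client would call the tool and wait at least ten seconds between checks:

```python
import time

# Simulated operation: empty until complete, then carries `response`.
_ops = iter([{}, {}, {"response": {"done": True}}])
def get_operation(name):  # stand-in for the MCP tool
    return next(_ops)

def wait_for_operation(name, min_interval=0.0):
    """Poll until the Operation carries `response` or `error`."""
    while True:
        op = get_operation(name)
        if "response" in op:
            return ("ok", op["response"])
        if "error" in op:
            return ("error", op["error"])
        time.sleep(min_interval)  # use >= 10s in practice

outcome = wait_for_operation("projects/p/locations/us-central1/operations/op-1")
```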
  • Render a Mermaid diagram definition and return the image with metadata. The definition should be valid Mermaid syntax (e.g. flowchart, sequence, class, ER, state, or Gantt diagram). Returns a list of content blocks: the rendered image plus a JSON text block with metadata including a mermaid.live edit link for opening the diagram in a browser editor. Args: definition: Mermaid diagram definition text. filename: Output filename without extension. format: Output format — ``"png"`` (default), ``"svg"``, or ``"pdf"``. download_link: If True, store the image on the server and return a temporary download URL path (/images/{token}) instead of the inline image. The link expires after 15 minutes.
    Connector
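A call to the Mermaid render tool might carry a payload like the one below. The flowchart definition is a made-up minimal example; the argument names come from the description:

```python
definition = """flowchart LR
    A[Client] --> B[Server]
    B --> C[(Database)]
"""

request = {
    "definition": definition,      # valid Mermaid syntax
    "filename": "architecture",    # output filename, no extension
    "format": "svg",               # "png" (default), "svg", or "pdf"
    "download_link": False,        # True returns /images/{token}, 15-min expiry
}
```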
  • Get report status and metadata (without PDF). Returns status (pending/processing/completed/failed), title, type, inputs, and summary. This is the polling tool for ceevee_generate_report — call every 30 seconds, up to 40 times (20 min max). When status='completed', download PDF with ceevee_download_report(report_id). If status='failed', relay error_message. If still processing after 40 polls, stop and give the user the report_id to check later. Free.
    Connector
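The polling contract above (every 30 seconds, at most 40 polls, then hand back the `report_id`) can be sketched as follows, with the status tool stubbed so the loop runs on its own:

```python
import time

_statuses = iter(["pending", "processing", "completed"])  # simulated
def get_report(report_id):  # stand-in for the status tool
    return {"status": next(_statuses)}

def poll_report(report_id, interval=0.0, max_polls=40):
    """Poll up to max_polls times (30s apart in practice, ~20 min total)."""
    for _ in range(max_polls):
        status = get_report(report_id)["status"]
        if status in ("completed", "failed"):
            return status
        time.sleep(interval)
    return "still_processing"  # give the user the report_id to check later

final = poll_report("rpt-1")
```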
  • Generate industry-standard documentation for any project using SUMA graph memory. This tool does NOT fabricate. It retrieves real war stories, architecture rulings, and deployment facts from the K-WIL graph, then uses Gemini to render them as professional documentation. The graph IS the source of truth — suma_doc makes it readable. Why this beats a generic doc generator: Generic: "Here is how to install." (stateless, stale, hallucinated) suma_doc: "We chose REST over MCP because [Architect Ruling Apr 5]. Here is how it works in production: [real deployment from graph]. Avoid X — we tried it and [root cause]." Args: prompt: What documentation to generate. Be specific. Examples: "Write a README for the SUMA MCP Server API" "Generate an ARCHITECTURE.md explaining the ring_search algorithm" "Write a CHANGELOG entry for today's /api/wakeup deployment" "Create an API reference for /api/ingest and /api/search" "Write an onboarding guide for a new backend engineer joining the QMS team" project: Optional filter to narrow graph search to a specific product. Examples: "suma-mcp", "squad-qms", "squad-ghostgate", "squad-companion" doc_type: Optional hint to shape output format. "readme" → GitHub README with badges + sections "architecture" → Design doc with decisions, trade-offs, diagrams description "api_reference" → Endpoint table + request/response examples "changelog" → Conventional Commits format, grouped by type "onboarding" → Step-by-step guide for a new engineer "runbook" → Ops runbook with commands, failure modes, escalation If omitted, Gemini infers the best format from the prompt. Returns: document: The generated documentation (markdown) nodes_used: Number of graph nodes retrieved as source material source_summary: Brief description of what the graph provided doc_type_detected: What format was generated
    Connector
  • Detect a company's technology stack by analyzing HTTP headers, DNS records, and GitHub repositories. Returns frameworks, programming languages, hosting providers, analytics tools, and CDNs. Use this instead of lookup_company when you only need technology information. Requires a domain name — company names are not supported for this tool.
    Connector