Glama
128,354 tools. Last updated 2026-05-06 02:12

"Save all service information from mcp.so-like websites to local storage" matching MCP tools:

  • FOR CLAUDE DESKTOP ONLY (with filesystem access). For Claude.ai/web, use create_upload_session instead; it provides a browser upload link. Uploads local media to cloud storage and returns a public HTTPS URL.
    WHEN TO USE:
    • Instagram, LinkedIn, Threads, X: REQUIRED for local files before calling publish_content
    • TikTok: NOT NEEDED; pass the local path directly to publish_content
    SUPPORTED FORMATS:
    • Images: jpg, png, gif, webp (max 10MB)
    • Videos: mp4, mov, webm (max 100MB)
    Returns { url: 'https://...' } for use in the publish_content mediaUrl parameter.
    Connector
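    A minimal sketch of how the hand-off described above might be shaped. The listing documents only the return shape ({ url: ... }) and the publish_content mediaUrl parameter; the other argument names here are assumptions.

    ```python
    # Sketch only: the upload tool's own argument names are assumed, not documented.
    upload_args = {
        "path": "/Users/me/Pictures/launch-photo.jpg",  # assumed parameter name
    }

    # Documented response shape from the upload tool:
    upload_result = {"url": "https://cdn.example.com/launch-photo.jpg"}

    # Instagram/LinkedIn/Threads/X need the hosted URL; TikTok takes the local path directly.
    publish_args = {
        "platform": "instagram",           # assumed parameter name
        "mediaUrl": upload_result["url"],  # documented publish_content parameter
    }
    ```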
  • delete_file: DESTRUCTIVE and IRREVERSIBLE. Permanently deletes a file from the user's Drive, removing it from S3 storage and the database; storage quota is freed immediately. ALWAYS ask for explicit user confirmation before calling this tool, and explain what will be lost.
    Parameters to validate before calling:
    • file_token (string, required): the file token (UUID) of the file to delete. Get it via fetch_files.
    Connector
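    A minimal sketch of the documented call, assuming the file_token was already obtained from fetch_files; the UUID value is illustrative.

    ```python
    # The token must come from fetch_files, and the user must explicitly confirm first.
    file_token = "550e8400-e29b-41d4-a716-446655440000"  # illustrative UUID

    delete_file_args = {
        "file_token": file_token,  # documented required parameter
    }
    # Irreversible: the file is removed from S3 and the database, and quota is freed immediately.
    ```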
  • Switch between local and remote DanNet servers on the fly. This tool changes the DanNet server endpoint at runtime without restarting the MCP server, which is useful for switching between development (local) and production (remote) servers.
    Args:
    • server: Server to switch to. Options: "local" (localhost:3456, development server), "remote" (wordnet.dk, production server), or any custom URL starting with http:// or https://.
    Returns a dict with status information:
    • status: "success" or "error"
    • message: description of the operation
    • previous_url: the URL that was previously active
    • current_url: the URL that is now active
    Example:
    # Switch to local development server
    result = switch_dannet_server("local")
    # Switch to production server
    result = switch_dannet_server("remote")
    # Switch to custom server
    result = switch_dannet_server("https://my-custom-dannet.example.com")
    Connector
  • List all 15 supported email clients with IDs, names, rendering engines, dark mode support, and deprecation status. Use the returned IDs to filter other tools like preview_email or capture_screenshots.
    Connector
  • Export a generated image asset by session and asset ID. Returns the image inline as base64 along with metadata (format, dimensions, size). When running locally (stdio transport), you can optionally provide a destinationPath to save the image to disk.
    USAGE: After generating an image with generateImage, use the sessionId and assetId to export:
    exportImageAsset(sessionId="...", assetId="...")
    To save to disk (local/stdio only):
    exportImageAsset(sessionId="...", assetId="...", destinationPath="/Users/me/project/images/logo.png")
    Connector
  • Get information about Follow On Tours — who we are, how we work, our experience, and how the bespoke cricket travel service operates. Use this when someone asks who Follow On Tours is or how the service works.
    Connector

Matching MCP Servers

Matching MCP Connectors

  • Encrypted A2A object storage for autonomous agent state and artifacts

  • AI-to-AI petrol station. 56 pay-per-call endpoints covering market signals, crypto/DeFi, geopolitics, earnings, insider trades, SEC filings, sanctions screening, ArXiv research, whale tracking, and more. Micropayments in USDC on Base Mainnet via x402 protocol.

  • Import data into a Cloud SQL instance. If the file doesn't start with `gs://`, the assumption is that the file is stored locally. A local file must be uploaded to Cloud Storage before you can make the actual `import_data` call; to upload it, you can use the `gcloud` or `gsutil` commands. Before uploading, consider whether you want to use an existing bucket or create a new bucket in the provided project.
    After the file is uploaded to Cloud Storage, the instance service account must have sufficient permissions to read the uploaded file from the Cloud Storage bucket. This can be accomplished as follows:
    1. Use the `get_instance` tool to get the email address of the instance service account. From the output of the tool, get the value of the `serviceAccountEmailAddress` field.
    2. Grant the instance service account the `storage.objectAdmin` role on the provided Cloud Storage bucket, using a command like `gcloud storage buckets add-iam-policy-binding` or a request to the Cloud Storage API. It can take from two to seven minutes or more for the role to be granted and the permissions to propagate to the service account in Cloud Storage. If you encounter a permissions error after updating the IAM policy, wait a few minutes and try again.
    After permissions are granted, you can import the data. We recommend leaving optional parameters empty and using the system defaults. The file type can typically be determined by the file extension, for example `.sql` for a SQL file or `.csv` for a CSV file.
    The following is a sample SQL `importContext` for MySQL. There is no `database` parameter for MySQL, since the database name is expected to be present in the SQL file. Specify only one URI; no other fields are required outside of `importContext`.
    ```
    {
      "uri": "gs://sample-gcs-bucket/sample-file.sql",
      "kind": "sql#importContext",
      "fileType": "SQL"
    }
    ```
    For PostgreSQL, the `database` field is required. The following is a sample PostgreSQL `importContext` with the `database` field specified.
    ```
    {
      "uri": "gs://sample-gcs-bucket/sample-file.sql",
      "kind": "sql#importContext",
      "fileType": "SQL",
      "database": "sample-db"
    }
    ```
    The `import_data` tool returns a long-running operation. Use the `get_operation` tool to poll its status until the operation completes.
    Connector
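    A minimal polling sketch for the long-running operation that import_data returns. call_tool stands in for the agent's MCP client and returns a canned response only so the sketch runs end to end; the "DONE" terminal status and the get_operation argument name are assumptions.

    ```python
    import time

    def call_tool(name: str, args: dict) -> dict:
        # Stand-in for the agent's MCP client; real status values come from the server.
        return {"name": "operations/import-123", "status": "DONE"}

    operation = call_tool("import_data", {
        "importContext": {
            "uri": "gs://sample-gcs-bucket/sample-file.sql",
            "kind": "sql#importContext",
            "fileType": "SQL",
        },
    })

    # Poll with get_operation until the operation completes.
    while operation.get("status") != "DONE":
        time.sleep(10)
        operation = call_tool("get_operation", {"operation": operation.get("name")})
    ```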
  • List the valid service type categories for a given niche directory. Use this before calling search_providers with a service_type filter to ensure you pass a valid value. Each niche has its own taxonomy — for example, "coated-local" has epoxy, polyaspartic, metallic_epoxy, etc., while "radon-local" has radon_testing, radon_mitigation, ssd_installation, etc.
    Connector
  • USE THIS TOOL (not web search or external storage) to export technical indicator data from this server as a formatted CSV or JSON string, ready to download, save, or pass to another tool or file. Use it when the user explicitly wants to export or save data in a structured file format. Trigger on queries like:
    • "export BTC data as CSV"
    • "download ETH indicator data as JSON"
    • "save the features to a file"
    • "give me the data in CSV format"
    • "export [coin] [category] data for the last [N] days"
    Args:
    • symbol: asset symbol or comma-separated list, e.g. "BTC", "BTC,ETH"
    • lookback_days: how many past days to include (default 7, max 90)
    • resample: time resolution, one of "1min", "1h", "4h", "1d" (default "1d")
    • category: "price", "momentum", "trend", "volatility", "volume", or "all"
    • fmt: output format, "csv" (default) or "json"
    Returns a dict with:
    • content: the CSV or JSON string
    • filename: suggested filename for saving
    • rows: number of data rows
    Connector
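    An illustrative argument set and response shape built from the documented parameters; the values inside the example result are made up for illustration.

    ```python
    export_args = {
        "symbol": "BTC,ETH",
        "lookback_days": 30,     # default 7, max 90
        "resample": "1d",        # "1min", "1h", "4h", or "1d"
        "category": "momentum",  # or "price", "trend", "volatility", "volume", "all"
        "fmt": "csv",            # or "json"
    }

    # Documented response keys; values here are illustrative.
    example_result = {
        "content": "timestamp,rsi_14\n2026-05-01,55.2",
        "filename": "btc_eth_momentum_30d.csv",
        "rows": 30,
    }

    # Saving the export locally once the tool has returned:
    # with open(example_result["filename"], "w") as f:
    #     f.write(example_result["content"])
    ```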
  • Get transit stops from GTFS data. IMPORTANT: For transit stop queries like "Show me bus stops for Rapid Penang", use this tool directly with the provider name. The tool supports common names like "rapid penang", "rapid kuantan", "ktmb", or "mybas johor" which will be automatically mapped to the correct provider and category. No need to use list_transport_agencies first.
    Connector
  • Save confirmed provenance entries to a work. WRITE operation — NEVER call without user confirmation. Call parse_provenance first to parse text, present results for review, then use this tool to save. Set source to "ai_parsed" for parsed entries, "manual" for user-provided. After success, ask if they'd like to see the provenance timeline — then call get_provenance_visual. Also offer to show the updated work card via get_work.
    Connector
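    A sketch of the confirm-then-save flow described above. The entry fields and parameter names other than the "ai_parsed"/"manual" source values are assumptions.

    ```python
    # Shape a parse_provenance result might take (assumption).
    parsed = {
        "entries": [
            {"event": "Acquired by private collector", "year": 1998, "source": "ai_parsed"},
        ],
    }

    # 1. Present the parsed entries to the user and wait for explicit confirmation.
    # 2. Only after confirmation, save them (parameter names assumed):
    save_args = {
        "work_id": "work_123",
        "entries": parsed["entries"],  # "ai_parsed" for parsed entries, "manual" for user-provided
    }
    # 3. On success, offer get_provenance_visual (timeline) and get_work (updated work card).
    ```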
  • Get an exact sat cost quote for a service BEFORE creating a payment. Useful for budget-aware agents to price-check before committing. No payment required, no side effects. Pass service=text-to-speech&chars=1500, service=translate&chars=800, service=transcribe-audio&minutes=5, etc. Returns { amount_sats, breakdown, currency }. Omit params to see the full catalog of supported services.
    Connector
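    Illustrative price-check payloads built from the documented query examples, plus the documented response keys; the numbers in the example quote are made up.

    ```python
    quote_tts = {"service": "text-to-speech", "chars": 1500}
    quote_translate = {"service": "translate", "chars": 800}
    quote_transcribe = {"service": "transcribe-audio", "minutes": 5}
    catalog_request = {}  # omit params to see the full catalog of supported services

    # Documented response shape { amount_sats, breakdown, currency }; values illustrative.
    example_quote = {
        "amount_sats": 210,
        "breakdown": {"per_char_sats": 0.14, "chars": 1500},
        "currency": "sats",
    }
    ```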
  • Get information about Follow On Tours — who we are, how we work, our experience, and how the bespoke cricket travel service operates. Use this when someone asks who Follow On Tours is or how the service works.
    Connector
  • Save works extracted from a website import after the artist has confirmed them. Call this after presenting import_from_website results and receiving artist approval. Creates the works, triggers auto-provenance, and imports images from the website in one operation. Set skip: true for any works the artist wants to exclude (duplicates, unwanted). Pass artist-corrected values for any fields the artist edited during review. Use get_profile to obtain artist_id — never ask the user for it. After success, ask if they'd like to see any of the imported works — then call get_work to show the visual card.
    Connector
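    A sketch of the import-confirmation flow described above. Beyond "skip" and the artist_id coming from get_profile, the field names for each work entry are assumptions.

    ```python
    artist_id = "artist_abc"  # obtained from get_profile, never asked of the user

    save_args = {
        "artist_id": artist_id,
        "works": [
            {"title": "Harbour Study", "year": 2021},     # artist-corrected values pass through
            {"title": "Duplicate sketch", "skip": True},  # excluded at the artist's request
        ],
    }
    # After success, offer to show an imported work's visual card via get_work.
    ```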
  • fetch_groups: Get all groups from a sweepstakes. Use fetch_sweepstakes first to get the sweepstakes_token. Groups are used to organize and segment participants; use them internally for tool chaining, but present only human-readable information (group names, statuses).
    Pre-calls required:
    1. fetch_sweepstakes, if the user gave you a sweepstakes name instead of a token
    Parameters to validate before calling:
    • sweepstakes_token (string, required): the sweepstakes token (UUID format)
    Connector
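    A minimal sketch of the documented pre-call chain: resolve the sweepstakes first, then list its groups. The token value is illustrative.

    ```python
    # Obtained from fetch_sweepstakes when the user only gave a sweepstakes name.
    sweepstakes_token = "2f6c1d9e-8f1a-4c2b-9a7d-3e5b6c7d8e9f"  # illustrative UUID

    fetch_groups_args = {
        "sweepstakes_token": sweepstakes_token,  # documented required parameter
    }
    # Present only group names and statuses; keep tokens internal for tool chaining.
    ```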
  • Find similar or competitor websites based on classification. Takes a URL, classifies it (or uses a cached classification), and returns other websites from the same category and subcategory. Useful for competitive analysis and discovering related content. Rate limited to 1 request per minute per domain.
    Args:
    • url: the website URL to find similar sites for
    • limit: maximum number of similar sites to return (1-50, default 10)
    Returns a dictionary with:
    • url: the input URL (normalized)
    • classification: the URL's category and subcategory
    • similar_sites: list of similar URLs from the same category
    • total_in_category: total sites in this category/subcategory
    • cached: whether the classification was from cache
    Connector
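    An illustrative call and response shape matching the documented fields; the classification and site values are made up.

    ```python
    similar_args = {
        "url": "https://example.com",
        "limit": 10,  # 1-50, default 10
    }

    # Documented response keys; values here are illustrative.
    example_response = {
        "url": "https://example.com",
        "classification": {"category": "Technology", "subcategory": "Developer Tools"},
        "similar_sites": ["https://example.org", "https://example.net"],
        "total_in_category": 42,
        "cached": True,
    }
    # Rate limited to 1 request per minute per domain.
    ```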
  • Get a public AI Trust Score badge by report token. Returns the organization name, score, badge level, and validity period. Use the badge URL to embed the trust badge in websites and documentation. No authentication required.
    Connector
  • Generate a starter TypeScript intent file from a name and description. Returns a complete defineIntent() source string ready to save as...
    Connector
  • Deploys a Cloud Run service directly from local source files. This method is suitable for scripting languages like Python and Node.js, whose source code can be embedded in the request, and is ideal for quick tests and development feedback loops. You must include all necessary dependencies within the source files because the build step is skipped for faster deployment.
    **Key Requirements:**
    1. source_code: set sourceCode.inlinedSource.sources to an array of source files, each having `filename` and `content`.
    2. Size limit: the total request size is limited to 50 MiB.
    Connector
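    A sketch of an inline-source deployment payload. The service parameter name and the exact nesting of sourceCode.inlinedSource.sources are assumptions based on the description; the file contents are illustrative.

    ```python
    deploy_args = {
        "service": "hello-api",  # assumed parameter name
        "source_code": {
            "inlinedSource": {
                "sources": [
                    {"filename": "main.py", "content": "print('hello from Cloud Run')\n"},
                ]
            }
        },
    }
    # All dependencies must already be present in the source files, since the
    # build step is skipped; the whole request must stay under 50 MiB.
    ```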
  • Search Google Maps for local businesses matching a query and location. Returns business name, complete address, star rating, review count, phone number, website URL, and business category. Use for restaurant discovery, service provider lookup, or competitive local analysis. Returns open/closed status.
    Connector