Submit data processing jobs to Google Cloud Dataproc clusters by specifying project ID, region, cluster name, job type, main file, and optional arguments, JAR files, and properties. Supports Spark, PySpark, Hive, Pig, and Hadoop job types.
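As a hedged illustration of the parameters listed above, the sketch below assembles the kind of request body such a job-submission tool could pass to the Dataproc API (e.g. via `google.cloud.dataproc_v1.JobControllerClient.submit_job`). The field names mirror the Dataproc REST API's PySpark job resource; the helper function itself and the sample values are hypothetical.

```python
# Hypothetical helper: build a submit_job request dict for a PySpark
# workload on a Dataproc cluster. Field names follow the Dataproc REST
# API; no API call is made here.
def build_pyspark_job_request(project_id, region, cluster_name,
                              main_file, args=None, jar_files=None,
                              properties=None):
    job = {
        # Target cluster within the project/region.
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {
            "main_python_file_uri": main_file,   # e.g. a gs:// URI
            "args": args or [],                  # optional job arguments
            "jar_file_uris": jar_files or [],    # optional extra JARs
            "properties": properties or {},      # optional Spark properties
        },
    }
    return {"project_id": project_id, "region": region, "job": job}

request = build_pyspark_job_request(
    "my-project", "us-central1", "my-cluster",
    "gs://my-bucket/jobs/etl.py", args=["--date", "2024-01-01"])
```

Other job types (Spark, Hive, Pig, Hadoop) would swap `pyspark_job` for the corresponding job field in the same request shape.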
Simulate liquidity pool rebalancing based on impermanent loss thresholds to optimize DeFi positions. Provides adjustment recommendations for Solana wallets.
Turns any static website into an MCP-searchable knowledge base by deploying a Cloudflare Worker that provides full-text search tools, enabling AI assistants to search and retrieve content from your site.
Provides access to the Reuters Business and Financial News API to retrieve articles, trending news, and market data, enabling search and filtering of financial content by date, author, category, and keywords.
MCP server that exposes the Alpaca Market Data & Broker APIs as tools, enabling access to financial data such as stock bars, assets, market days, and news through the Model Context Protocol.
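To make the "stock bars" capability concrete, here is a minimal sketch of the kind of request such a server might issue against Alpaca's public v2 Market Data API. The endpoint shape follows Alpaca's documented historical-bars route; the helper function, defaults, and sample values are illustrative assumptions, and authentication headers are omitted.

```python
# Hypothetical helper: build the GET URL for one symbol's historical
# bars on Alpaca's v2 Market Data API. No network request is made.
from urllib.parse import urlencode

DATA_BASE = "https://data.alpaca.markets/v2"  # Alpaca market-data host

def stock_bars_url(symbol, timeframe="1Day", start=None, end=None, limit=100):
    params = {"timeframe": timeframe, "limit": limit}
    if start:
        params["start"] = start  # RFC 3339 / ISO date string
    if end:
        params["end"] = end
    return f"{DATA_BASE}/stocks/{symbol}/bars?{urlencode(params)}"

url = stock_bars_url("AAPL", start="2024-01-02")
```

A real client would send this URL with the `APCA-API-KEY-ID` and `APCA-API-SECRET-KEY` headers and page through results via the returned page token.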