
MCP Tools for Connecting Alpaca to Brokerages, News Sites, and Spark Clusters for Workflow Rebalancing

Production-ready MCP servers that extend AI capabilities through file access, database connections, APIs, and contextual services.

46,232 tools. Last updated 2025-12-21 11:36
  • Set up a new Databricks cluster with custom parameters such as cluster name, Spark version, node type, worker count, and auto-termination settings using the Databricks MCP Server (a REST sketch follows this list).
  • Submit a job to a Dataproc cluster by specifying the project ID, region, cluster name, job type, main file, and optional arguments, JAR files, and properties. Supports Spark, PySpark, Hive, Pig, and Hadoop job types. (MIT License; see the Dataproc sketch below.)
  • Simulate liquidity pool rebalancing when impermanent loss exceeds your threshold, and get position adjustment recommendations to optimize your DeFi portfolio on Solana. (MIT License; the impermanent-loss math is sketched below.)
  • Submit and manage batch jobs on Google Cloud Dataproc. Define job types (Spark, PySpark, Spark SQL), include main files, JARs, and arguments, and configure properties for efficient job execution. (MIT License)
  • Set up a new Databricks cluster by defining its name, Spark version, node type, and worker count, enabling execution of data processing workflows.
  • Retrieve detailed Alpaca account data such as equity, buying power, and day trade status for portfolio management and trading strategy development (see the Alpaca sketch below).
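The Databricks cluster tools above ultimately wrap the public Clusters REST API (POST /api/2.0/clusters/create). A minimal sketch of that call, assuming a workspace URL and personal access token in the environment; the cluster values shown are illustrative, not defaults of any particular server:

```python
import os
import requests

# Placeholder credentials; any real MCP server would manage these itself.
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

cluster_spec = {
    "cluster_name": "etl-nightly",          # cluster name
    "spark_version": "14.3.x-scala2.12",    # Spark runtime version
    "node_type_id": "i3.xlarge",            # node type
    "num_workers": 4,                       # worker count
    "autotermination_minutes": 30,          # auto-termination setting
}

resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json=cluster_spec,
)
resp.raise_for_status()
print("Created cluster:", resp.json()["cluster_id"])
```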
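The Dataproc job tools map onto the official google-cloud-dataproc client. A minimal PySpark submission sketch; the project, region, cluster name, and GCS paths are placeholders:

```python
from google.cloud import dataproc_v1

project_id = "my-project"
region = "us-central1"

# Dataproc clients are regional, so point at the regional endpoint.
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "my-cluster"},
    "pyspark_job": {
        "main_python_file_uri": "gs://my-bucket/jobs/etl.py",  # main file
        "args": ["--date", "2025-12-21"],                      # optional arguments
        "jar_file_uris": ["gs://my-bucket/libs/deps.jar"],     # optional JAR files
        "properties": {"spark.executor.memory": "4g"},         # Spark properties
    },
}

operation = client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
)
result = operation.result()  # block until the job finishes
print("Job state:", result.status.state.name)
```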
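The impermanent-loss trigger behind the Solana rebalancing tool follows from the standard formula for a 50/50 constant-product pool: for a price ratio r = P_now / P_entry, IL = 2*sqrt(r)/(1+r) - 1. A sketch of the threshold test, with an assumed 5% tolerance:

```python
import math

def impermanent_loss(price_ratio: float) -> float:
    """Impermanent loss for a 50/50 constant-product pool.

    price_ratio is current_price / entry_price of one asset versus the other.
    Returns a non-positive fraction: -0.057 means a 5.7% loss vs. holding.
    """
    return 2 * math.sqrt(price_ratio) / (1 + price_ratio) - 1

# Example: a SOL/USDC position where SOL has doubled since entry.
r = 2.0
il = impermanent_loss(r)   # about -0.0572, i.e. -5.72%
THRESHOLD = -0.05          # assumed 5% tolerance, not a server default

if il < THRESHOLD:
    print(f"IL {il:.2%} exceeds threshold; consider rebalancing the position")
```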
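And the Alpaca account tool corresponds to a single call in the official alpaca-py SDK. A sketch against the paper-trading environment, with placeholder keys:

```python
from alpaca.trading.client import TradingClient

# Placeholder credentials; paper=True targets the paper-trading API.
client = TradingClient("API_KEY_ID", "SECRET_KEY", paper=True)

account = client.get_account()
print("Equity:      ", account.equity)
print("Buying power:", account.buying_power)
print("Day trades:  ", account.daytrade_count)       # day trade status
print("PDT flagged: ", account.pattern_day_trader)
```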

