YingDao RPA MCP Server

by AutomaApp

YingDao AI Power: a low-code AI platform for quickly building AI agents and AI workflows, helping users put AI to work effectively.

YingDao RPA: a low-code, user-friendly RPA automation product that frees people from repetitive work.

The YingDao RPA MCP Server is built on the Model Context Protocol (MCP) and acts as a bridge between YingDao AI Power and other tools that can serve as MCP Hosts (such as Claude Desktop and Cursor), enabling AI to use RPA capabilities.

It supports both SSE Server and Stdio Server modes.

Getting Started

The server can connect to YingDao RPA in two modes:

Local Mode

Set environment variables:

RPA_MODEL=local
SHADOWBOT_PATH={your_shadowbot_path}  // Path to the YingDao RPA executable
USER_FOLDER={your_user_folder}        // Path to the YingDao RPA user folder

Path to the YingDao RPA executable

Windows

D:\Program Files\{installation directory}\ShadowBot.exe

Mac

/Applications/影刀.app (影刀 is YingDao's Chinese product name)

Path to the YingDao RPA user folder

The user folder path can be found under the user folder option in YingDao RPA settings.
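
For example, a complete local-mode configuration on Windows might look like the following (both paths are purely illustrative; substitute your own installation directory and user folder):

RPA_MODEL=local
SHADOWBOT_PATH=D:\Program Files\ShadowBot\ShadowBot.exe
USER_FOLDER=C:\Users\me\Documents\ShadowBot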

Open API Mode (Enterprise users only)

Set environment variables:

RPA_MODEL=openApi
ACCESS_KEY_ID={your_access_key_id}
ACCESS_KEY_SECRET={your_access_key_secret}

How to obtain the credentials

Enterprise administrators can obtain the AccessKey ID and AccessKey Secret by logging into the YingDao RPA console. See the YingDao RPA Help Documentation - Authentication.

Stdio Server Startup

Configure in the client:

{ "mcpServers": { "YingDao RPA MCP Server": { "command": "npx", "args": ["-y", "@automa-ai-power/rpa-mcp-servers", "-stdio"], "env":{ "RPA_MODEL":"openApi", "ACCESS_KEY_ID":"{your_access_key_id}", "ACCESS_KEY_SECRET":"{your_access_key_secret}" } } } }

SSE Server Configuration

Build

Clone the repository and build:

git clone https://github.com/AutomaApp/aipower-rpa-mcp-server.git
cd aipower-rpa-mcp-server
npm install
npm run build

Configuration

Add a .env file containing the configuration items described above.
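
For example, a .env for running the SSE server in Open API mode would contain (credential values are placeholders):

RPA_MODEL=openApi
ACCESS_KEY_ID={your_access_key_id}
ACCESS_KEY_SECRET={your_access_key_secret}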

Startup

npm run start:server

Client Configuration

AI Power client configuration:

{ "mcpServers": { "YingDao RPA MCP Server": { "url": "http://localhost:3000/sse", "description": "Yingdao RPA MCP Server" } } }

The default port is 3000.
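
To sanity-check the connection from code, the sketch below uses the official MCP TypeScript SDK (@modelcontextprotocol/sdk) to connect to the SSE endpoint and list the exposed tools. It assumes the server is running locally on the default port; the client name and version are arbitrary.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  // Connect to the SSE endpoint exposed by the server (default port 3000).
  const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));
  const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover which RPA tools the server exposes in the current mode.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch(console.error);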

Capabilities

Local Mode

  1. queryRobotParam: Query RPA application parameters
  2. queryApplist: Query RPA application list
  3. runApp: Run RPA application

Open API Mode

  1. uploadFile: Upload files to the RPA platform
  2. queryRobotParam: Query RPA application parameters
  3. queryApplist: Get paginated RPA application list
  4. startJob: Start an RPA job
  5. queryJob: Query RPA job status
  6. queryClientList: Query the list of RPA robot clients
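
As a rough sketch of how these Open API tools could be chained from an MCP client (again using the MCP TypeScript SDK; all tool argument names shown here are hypothetical, so read the input schemas returned by listTools() for the real field names):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function runJobExample() {
  const client = new Client({ name: "rpa-example", version: "1.0.0" }, { capabilities: {} });
  await client.connect(new SSEClientTransport(new URL("http://localhost:3000/sse")));

  // List available RPA applications (paginated in Open API mode).
  // "page" and "size" are illustrative argument names.
  const apps = await client.callTool({ name: "queryApplist", arguments: { page: 1, size: 10 } });
  console.log("applications:", apps.content);

  // Start a job for one application, then query its status.
  // "robotUuid" and "jobUuid" are likewise illustrative.
  const started = await client.callTool({ name: "startJob", arguments: { robotUuid: "<app-uuid>" } });
  console.log("startJob:", started.content);

  const status = await client.callTool({ name: "queryJob", arguments: { jobUuid: "<job-uuid>" } });
  console.log("queryJob:", status.content);

  await client.close();
}

runJobExample().catch(console.error);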

License

MIT
