# A2A Client MCP Server
An MCP server that acts as a client to the Agent-to-Agent (A2A) protocol, allowing LLMs to interact with A2A agents through the Model Context Protocol (MCP).
## Features
- Connect to any A2A-compatible agent
- Send and receive messages
- Track and manage tasks
- Support for streaming responses
- Query agent capabilities and metadata
## Installation
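No installation steps are listed above; if the package is published to npm, installing (or running it directly) might look like the following, where the package name `a2a-client-mcp-server` is an assumption:

```bash
# Hypothetical package name -- substitute the actual published package
npm install -g a2a-client-mcp-server
# or run it without installing
npx -y a2a-client-mcp-server
```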
## Configuration

### Environment Variables

- `A2A_ENDPOINT_URL`: URL of the A2A agent to connect to (default: `http://localhost:41241`)
## Usage with Claude Desktop

Add this to your `claude_desktop_config.json`:
### NPX
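The original config snippet is not shown here; a minimal sketch, assuming the server is published under the hypothetical package name `a2a-client-mcp-server`:

```json
{
  "mcpServers": {
    "a2a": {
      "command": "npx",
      "args": ["-y", "a2a-client-mcp-server"],
      "env": {
        "A2A_ENDPOINT_URL": "http://localhost:41241"
      }
    }
  }
}
```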
### Docker
Build the Docker image:
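Assuming the repository ships a Dockerfile and you pick the image tag `a2a-client-mcp-server` (both are assumptions):

```bash
# Build from the repository root; the image tag is an assumed placeholder
docker build -t a2a-client-mcp-server .
```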
Configure Claude Desktop:
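A sketch of the corresponding Claude Desktop entry, reusing the image tag assumed above:

```json
{
  "mcpServers": {
    "a2a": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "A2A_ENDPOINT_URL=http://localhost:41241",
        "a2a-client-mcp-server"
      ]
    }
  }
}
```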
## Available Tools
### a2a_send_task

Send a task to an A2A agent.

- `message` (string): Message to send to the agent
- `taskId` (string, optional): Task ID (generated if not provided)
### a2a_get_task

Get the current state of a task.

- `taskId` (string): ID of the task to retrieve
### a2a_cancel_task

Cancel a running task.

- `taskId` (string): ID of the task to cancel
### a2a_send_task_subscribe

Send a task and subscribe to updates (streaming).

- `message` (string): Message to send to the agent
- `taskId` (string, optional): Task ID (generated if not provided)
- `maxUpdates` (number, optional): Maximum updates to receive (default: 10)
### a2a_agent_info

Get information about the connected A2A agent.

- No parameters required
## Resources

The server provides access to two MCP resources:

- `a2a://agent-card`: Information about the connected A2A agent
- `a2a://tasks`: List of recent A2A tasks
## Example Usage

This example shows how to use the A2A Client MCP Server to interact with a Coder Agent:
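The original walkthrough is not reproduced here; as a minimal sketch, an LLM connected to this server might call `a2a_send_task` with arguments like the following (the message text and task ID are illustrative):

```json
{
  "message": "Write a Python function that returns the nth Fibonacci number",
  "taskId": "task-123"
}
```

It can then poll `a2a_get_task` with `{"taskId": "task-123"}` to check the result, or use `a2a_send_task_subscribe` instead to stream the Coder Agent's progress updates.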
## Development
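Development steps are not spelled out here; for a typical TypeScript MCP server the workflow is usually along these lines (the script names are assumptions):

```bash
npm install      # install dependencies
npm run build    # compile the TypeScript sources (assumed script name)
npm start        # run the server locally (assumed script name)
```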
## License
MIT