Generate a new AutoGen agent with a unique name, type, system message, and LLM configuration to enable collaborative multi-agent conversations through the AutoGen MCP Server.
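The server's exact tool name and argument schema are not spelled out here; as a rough sketch using the MCP Python client SDK, creating an agent might look like the following, assuming a hypothetical `create_agent` tool with `name`, `type`, `system_message`, and `llm_config` fields and a server launched via a placeholder `autogen-mcp` command.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch command is an assumption; substitute the actual AutoGen MCP Server entry point.
server = StdioServerParameters(command="npx", args=["-y", "autogen-mcp"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical tool name and argument schema for agent creation.
            result = await session.call_tool("create_agent", arguments={
                "name": "researcher",
                "type": "assistant",
                "system_message": "You research topics and summarize findings.",
                "llm_config": {"model": "gpt-4o", "temperature": 0.2},
            })
            print(result.content)

asyncio.run(main())
```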
Run a specified workflow with optional streaming support through the AutoGen MCP Server, passing input data that drives the multi-agent conversation.
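As an illustration only, a workflow run might be invoked as below; the `execute_workflow` tool name and its `workflow_name`, `input_data`, and `stream` parameters are assumptions rather than a documented schema, and the function continues the client sketch above.

```python
from mcp import ClientSession

async def run_workflow(session: ClientSession) -> None:
    # Hypothetical tool name and arguments; adjust to the server's actual schema.
    result = await session.call_tool("execute_workflow", arguments={
        "workflow_name": "research_and_review",
        "input_data": {"topic": "vector databases"},
        "stream": True,  # optional streaming support
    })
    print(result.content)
```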
Define a real-time streaming workflow by configuring its agents and workflow details through the AutoGen MCP Server's multi-agent conversation framework.
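A sketch of defining such a workflow follows; the `create_workflow` tool name and the shape of the `agents` and `workflow` fields are guesses for illustration, not the server's published interface.

```python
from mcp import ClientSession

async def create_streaming_workflow(session: ClientSession) -> None:
    # Hypothetical schema: a list of agent configs plus workflow-level settings.
    result = await session.call_tool("create_workflow", arguments={
        "name": "support_triage",
        "agents": [
            {"name": "classifier", "type": "assistant",
             "system_message": "Classify incoming tickets."},
            {"name": "responder", "type": "assistant",
             "system_message": "Draft a reply for the ticket."},
        ],
        "workflow": {"type": "sequential", "streaming": True},
    })
    print(result.content)
```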
Search and retrieve the latest documentation for a given query and library. Supports langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai. Provide a query and a library name to extract the relevant text.
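The call below assumes a tool named `get_documentation` taking `query` and `library` strings; both the name and the parameters are inferred from the description above rather than from a published schema.

```python
from mcp import ClientSession

async def lookup_docs(session: ClientSession) -> None:
    # Hypothetical tool name; `library` should be one of the supported values
    # (langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc,
    # camel-ai, crew-ai).
    result = await session.call_tool("get_documentation", arguments={
        "query": "how to define a GroupChat",
        "library": "autogen",
    })
    print(result.content)
```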
Initiate a real-time chat session with an agent, enabling continuous message exchange. Specify the agent name and initial message to begin streaming interactions.
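A minimal sketch of opening such a session, assuming a hypothetical `chat` tool that takes `agent_name` and `message` arguments:

```python
from mcp import ClientSession

async def start_chat(session: ClientSession) -> None:
    # Hypothetical tool name and arguments for beginning a streaming chat.
    result = await session.call_tool("chat", arguments={
        "agent_name": "researcher",
        "message": "Summarize the latest AutoGen release notes.",
    })
    print(result.content)
```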