# outsource_text
Delegate text generation to external AI models for different capabilities or perspectives. Access multiple providers through a unified interface.
## Instructions
Delegate text generation to another AI model. Use this when you need capabilities or perspectives from a different model than yourself.

**Args:**

- `provider`: The AI provider to use (e.g., `"openai"`, `"anthropic"`, `"google"`, `"groq"`)
- `model`: The specific model identifier (e.g., `"gpt-4o"`, `"claude-3-5-sonnet-20241022"`, `"gemini-2.0-flash-exp"`)
- `prompt`: The instruction or query to send to the external model

**Returns:**

The text response from the external model, or an error message if the request fails.

**Example usage:**

- To get a different perspective: `provider="anthropic"`, `model="claude-3-5-sonnet-20241022"`, `prompt="Analyze this problem from a different angle..."`
- To leverage specialized models: `provider="deepseek"`, `model="deepseek-coder"`, `prompt="Write optimized Python code for..."`
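To make the calling side concrete, here is a minimal client sketch using the official `mcp` Python SDK. The launch command (`python server.py`) is an assumption about how this server is started, not something stated in the source:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumed launch command; adjust to however the server is actually run.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "outsource_text",
                {
                    "provider": "anthropic",
                    "model": "claude-3-5-sonnet-20241022",
                    "prompt": "Analyze this problem from a different angle...",
                },
            )
            # result.content is a list of content blocks; text blocks carry .text
            print(result.content)


asyncio.run(main())
```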
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| provider | Yes | The AI provider to use (e.g., `"openai"`, `"anthropic"`, `"google"`, `"groq"`) | — |
| model | Yes | The specific model identifier (e.g., `"gpt-4o"`, `"claude-3-5-sonnet-20241022"`) | — |
| prompt | Yes | The instruction or query to send to the external model | — |
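The page's JSON Schema view did not survive extraction. Inferred from the handler's signature (three required string parameters), the generated schema is approximately:

```json
{
  "type": "object",
  "properties": {
    "provider": { "type": "string" },
    "model": { "type": "string" },
    "prompt": { "type": "string" }
  },
  "required": ["provider", "model", "prompt"]
}
```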
## Implementation Reference
- **server.py:64-108 (handler)** — The core handler function for the `outsource_text` tool. Decorated with `@mcp.tool()` for automatic registration and schema inference from type hints and the docstring. It maps the provider name to a model class, creates an `Agent`, and executes the prompt, returning the generated text.

```python
@mcp.tool()
async def outsource_text(provider: str, model: str, prompt: str) -> str:
    """
    Delegate text generation to another AI model. Use this when you need capabilities
    or perspectives from a different model than yourself.

    Args:
        provider: The AI provider to use (e.g., "openai", "anthropic", "google", "groq")
        model: The specific model identifier (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
        prompt: The instruction or query to send to the external model

    Returns:
        The text response from the external model, or an error message if the request fails

    Example usage:
        To get a different perspective: provider="anthropic", model="claude-3-5-sonnet-20241022", prompt="Analyze this problem from a different angle..."
        To leverage specialized models: provider="deepseek", model="deepseek-coder", prompt="Write optimized Python code for..."
    """
    try:
        # Get the appropriate model class based on provider
        provider_lower = provider.lower()
        if provider_lower not in PROVIDER_MODEL_MAP:
            raise ValueError(f"Unknown provider: {provider}")

        model_class = PROVIDER_MODEL_MAP[provider_lower]

        # Create the agent
        agent = Agent(
            model=model_class(id=model),
            name="Text Generation Agent",
            instructions="You are a helpful AI assistant. Respond to the user's prompt directly and concisely.",
        )

        # Run the agent and get response
        response = await agent.arun(prompt)

        # Extract the text content from the response
        if hasattr(response, "content"):
            return response.content
        else:
            return str(response)

    except Exception as e:
        return f"Error generating text: {str(e)}"
```
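Because FastMCP's `@mcp.tool()` returns the wrapped function unchanged in current SDK versions, the handler can also be exercised directly, for example in a unit test. A minimal sketch, assuming `server.py` is importable and the chosen provider's API key (here `ANTHROPIC_API_KEY`) is set in the environment:

```python
import asyncio

from server import outsource_text  # assumes server.py is on the import path

result = asyncio.run(
    outsource_text(
        provider="anthropic",
        model="claude-3-5-sonnet-20241022",
        prompt="Summarize the tradeoffs of delegating generation to another model.",
    )
)
print(result)  # model text on success, "Error generating text: ..." on failure
```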
- **server.py:32-61 (helper)** — Module-level dictionary mapping lowercase provider names to their corresponding model classes from the `agno` library; the handler uses it to instantiate the correct model for the requested provider.

```python
# Provider to model class mapping
PROVIDER_MODEL_MAP = {
    "openai": OpenAIChat,
    "anthropic": Claude,
    "google": Gemini,
    "groq": Groq,
    "deepseek": DeepSeek,
    "xai": xAI,
    "perplexity": Perplexity,
    "cohere": Cohere,
    "fireworks": Fireworks,
    "huggingface": HuggingFace,
    "mistral": MistralChat,
    "nvidia": Nvidia,
    "ollama": Ollama,
    "openrouter": OpenRouter,
    "sambanova": Sambanova,
    "together": Together,
    "litellm": LiteLLM,
    "vercel": v0,
    "v0": v0,
    "aws": AwsBedrock,
    "bedrock": AwsBedrock,
    "azure": AzureAIFoundry,
    "cerebras": Cerebras,
    "meta": Llama,
    "deepinfra": DeepInfra,
    "ibm": WatsonX,
    "watsonx": WatsonX,
}
```
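Note that several keys alias the same class (`vercel`/`v0`, `aws`/`bedrock`, `ibm`/`watsonx`), so extending the map is a one-line change. A hedged sketch, assuming agno's `agno.models.<provider>` import layout; the `"my-gateway"` key is purely illustrative:

```python
from agno.models.openai import OpenAIChat  # assumed import path in current agno releases

# Hypothetical alias: expose an OpenAI-compatible endpoint under a new provider name.
PROVIDER_MODEL_MAP["my-gateway"] = OpenAIChat
```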
- **server.py:64 (registration)** — The `@mcp.tool()` decorator registers `outsource_text` as an MCP tool, automatically generating its schema from the function signature and docstring.

```python
@mcp.tool()
```
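The decorator implies a FastMCP server object defined elsewhere in `server.py`. For orientation, a minimal skeleton of that pattern; the server name and the `__main__` guard are assumptions, not taken from the source:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("outsource-mcp")  # hypothetical name; the real one is set in server.py


@mcp.tool()
async def outsource_text(provider: str, model: str, prompt: str) -> str:
    ...  # handler body as shown above


if __name__ == "__main__":
    mcp.run()  # FastMCP defaults to the stdio transport
```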