outsource_text

Delegate text generation to external AI models for different capabilities or perspectives. Access multiple providers through a unified interface.

Instructions

Delegate text generation to another AI model. Use this when you need capabilities or perspectives from a different model than yourself.

Args:
    provider: The AI provider to use (e.g., "openai", "anthropic", "google", "groq")
    model: The specific model identifier (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
    prompt: The instruction or query to send to the external model

Returns:
    The text response from the external model, or an error message if the request fails

Example usage:
    To get a different perspective: provider="anthropic", model="claude-3-5-sonnet-20241022", prompt="Analyze this problem from a different angle..."
    To leverage specialized models: provider="deepseek", model="deepseek-coder", prompt="Write optimized Python code for..."

Input Schema

Name      Required  Description                                             Default
provider  Yes       The AI provider to use (e.g., "openai", "anthropic")    (none)
model     Yes       The specific model identifier (e.g., "gpt-4o")          (none)
prompt    Yes       The instruction or query to send to the external model  (none)
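All three fields are required and have no defaults. A minimal sketch of an arguments payload for a tool call, using the example values from the tool's docstring:

```python
# Illustrative arguments payload for an outsource_text tool call.
# Values are taken from the docstring's examples; all three fields are required.
arguments = {
    "provider": "anthropic",
    "model": "claude-3-5-sonnet-20241022",
    "prompt": "Analyze this problem from a different angle...",
}
```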

Implementation Reference

  • The core handler function for the 'outsource_text' tool. Decorated with @mcp.tool() for automatic registration and schema inference from type hints and docstring. It maps the provider to a model class, creates an Agent, and executes the prompt to return the generated text.

    @mcp.tool()
    async def outsource_text(provider: str, model: str, prompt: str) -> str:
        """
        Delegate text generation to another AI model.

        Use this when you need capabilities or perspectives from a different model than yourself.

        Args:
            provider: The AI provider to use (e.g., "openai", "anthropic", "google", "groq")
            model: The specific model identifier (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
            prompt: The instruction or query to send to the external model

        Returns:
            The text response from the external model, or an error message if the request fails

        Example usage:
            To get a different perspective:
                provider="anthropic", model="claude-3-5-sonnet-20241022", prompt="Analyze this problem from a different angle..."
            To leverage specialized models:
                provider="deepseek", model="deepseek-coder", prompt="Write optimized Python code for..."
        """
        try:
            # Get the appropriate model class based on provider
            provider_lower = provider.lower()
            if provider_lower not in PROVIDER_MODEL_MAP:
                raise ValueError(f"Unknown provider: {provider}")
            model_class = PROVIDER_MODEL_MAP[provider_lower]

            # Create the agent
            agent = Agent(
                model=model_class(id=model),
                name="Text Generation Agent",
                instructions="You are a helpful AI assistant. Respond to the user's prompt directly and concisely.",
            )

            # Run the agent and get the response
            response = await agent.arun(prompt)

            # Extract the text content from the response
            if hasattr(response, "content"):
                return response.content
            return str(response)
        except Exception as e:
            return f"Error generating text: {str(e)}"
  • Global dictionary mapping lowercase provider names to their corresponding model classes from the 'agno' library; the outsource_text handler uses it to dynamically instantiate the correct model based on user input.

    # Provider to model class mapping
    PROVIDER_MODEL_MAP = {
        "openai": OpenAIChat,
        "anthropic": Claude,
        "google": Gemini,
        "groq": Groq,
        "deepseek": DeepSeek,
        "xai": xAI,
        "perplexity": Perplexity,
        "cohere": Cohere,
        "fireworks": Fireworks,
        "huggingface": HuggingFace,
        "mistral": MistralChat,
        "nvidia": Nvidia,
        "ollama": Ollama,
        "openrouter": OpenRouter,
        "sambanova": Sambanova,
        "together": Together,
        "litellm": LiteLLM,
        "vercel": v0,
        "v0": v0,
        "aws": AwsBedrock,
        "bedrock": AwsBedrock,
        "azure": AzureAIFoundry,
        "cerebras": Cerebras,
        "meta": Llama,
        "deepinfra": DeepInfra,
        "ibm": WatsonX,
        "watsonx": WatsonX,
    }
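The map makes dispatch case-insensitive (the handler lowercases the provider first) and lets aliases such as "aws"/"bedrock" or "ibm"/"watsonx" share one class. A self-contained sketch of that lookup, using hypothetical stand-in classes in place of the real agno imports:

```python
# Stand-ins for the agno model classes; in server.py these are imported from agno.
class OpenAIChat:
    def __init__(self, id: str):
        self.id = id

class AwsBedrock:
    def __init__(self, id: str):
        self.id = id

# Reduced version of the server's provider map, showing one alias pair.
PROVIDER_MODEL_MAP = {
    "openai": OpenAIChat,
    "aws": AwsBedrock,
    "bedrock": AwsBedrock,
}

def resolve(provider: str, model: str):
    """Mirror the handler's dispatch: lowercase the provider, fail on unknown names."""
    key = provider.lower()
    if key not in PROVIDER_MODEL_MAP:
        raise ValueError(f"Unknown provider: {provider}")
    return PROVIDER_MODEL_MAP[key](id=model)

# Lookup is case-insensitive, and aliases resolve to the same class.
assert type(resolve("AWS", "m")) is type(resolve("bedrock", "m"))
```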
  • server.py:64 (registration)
    The @mcp.tool() decorator registers the outsource_text function as an MCP tool, automatically generating schema from signature and docstring.
    @mcp.tool()

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gwbischof/outsource-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.