Glama

outsource_text

Delegate text generation to external AI models for diverse perspectives or specialized tasks by specifying provider, model, and prompt.

Instructions

Delegate text generation to another AI model. Use this when you need capabilities or perspectives from a different model than yourself.

Args:
  provider: The AI provider to use (e.g., "openai", "anthropic", "google", "groq")
  model: The specific model identifier (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
  prompt: The instruction or query to send to the external model

Returns:
  The text response from the external model, or an error message if the request fails

Example usage:
  To get a different perspective: provider="anthropic", model="claude-3-5-sonnet-20241022", prompt="Analyze this problem from a different angle..."
  To leverage specialized models: provider="deepseek", model="deepseek-coder", prompt="Write optimized Python code for..."

Input Schema

Name      | Required | Description | Default
----------|----------|-------------|--------
model     | Yes      |             |
prompt    | Yes      |             |
provider  | Yes      |             |
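
The table above corresponds to a JSON Schema along these lines. This is a sketch reconstructed from the parameter list, not copied from the server; the actual schema is generated automatically by the MCP framework from the function signature:

```python
# Hypothetical reconstruction of the tool's input schema: all three
# parameters are required strings with no defaults.
input_schema = {
    "type": "object",
    "properties": {
        "provider": {"type": "string"},
        "model": {"type": "string"},
        "prompt": {"type": "string"},
    },
    "required": ["model", "prompt", "provider"],
}

# Every declared property is required, matching the table above.
assert set(input_schema["required"]) == set(input_schema["properties"])
```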

Implementation Reference

  • The core handler function for the 'outsource_text' tool, registered via @mcp.tool() decorator. It maps the provider to a model class, creates an Agent instance, executes the prompt, and returns the response content or an error message.
    @mcp.tool()
    async def outsource_text(provider: str, model: str, prompt: str) -> str:
        """
        Delegate text generation to another AI model. Use this when you need
        capabilities or perspectives from a different model than yourself.

        Args:
            provider: The AI provider to use (e.g., "openai", "anthropic", "google", "groq")
            model: The specific model identifier (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
            prompt: The instruction or query to send to the external model

        Returns:
            The text response from the external model, or an error message if the request fails

        Example usage:
            To get a different perspective: provider="anthropic", model="claude-3-5-sonnet-20241022", prompt="Analyze this problem from a different angle..."
            To leverage specialized models: provider="deepseek", model="deepseek-coder", prompt="Write optimized Python code for..."
        """
        try:
            # Get the appropriate model class based on provider
            provider_lower = provider.lower()
            if provider_lower not in PROVIDER_MODEL_MAP:
                raise ValueError(f"Unknown provider: {provider}")
            model_class = PROVIDER_MODEL_MAP[provider_lower]

            # Create the agent
            agent = Agent(
                model=model_class(id=model),
                name="Text Generation Agent",
                instructions="You are a helpful AI assistant. Respond to the user's prompt directly and concisely.",
            )

            # Run the agent and get response
            response = await agent.arun(prompt)

            # Extract the text content from the response
            if hasattr(response, "content"):
                return response.content
            else:
                return str(response)
        except Exception as e:
            return f"Error generating text: {str(e)}"
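
The error path of the handler can be exercised in isolation. The sketch below substitutes a stub model class for the real provider SDKs (the stub and the trimmed map are assumptions for illustration, not part of the server) to show how an unknown provider is caught and returned as an error string rather than a raised exception:

```python
import asyncio

# Stub stand-in for a real provider model class (assumption for illustration).
class StubModel:
    def __init__(self, id):
        self.id = id

PROVIDER_MODEL_MAP = {"openai": StubModel}

async def outsource_text(provider: str, model: str, prompt: str) -> str:
    try:
        provider_lower = provider.lower()
        if provider_lower not in PROVIDER_MODEL_MAP:
            raise ValueError(f"Unknown provider: {provider}")
        # The real handler would construct an Agent here and await its response.
        return f"(would call {provider_lower}:{model})"
    except Exception as e:
        return f"Error generating text: {str(e)}"

print(asyncio.run(outsource_text("NotAProvider", "x", "hi")))
# → Error generating text: Unknown provider: NotAProvider
```

Because exceptions are converted to strings, the MCP client always receives a text result it can relay to the calling model.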
  • Dictionary mapping provider names to their corresponding model classes, used by the outsource_text handler to instantiate the correct model.
    # Provider to model class mapping
    PROVIDER_MODEL_MAP = {
        "openai": OpenAIChat,
        "anthropic": Claude,
        "google": Gemini,
        "groq": Groq,
        "deepseek": DeepSeek,
        "xai": xAI,
        "perplexity": Perplexity,
        "cohere": Cohere,
        "fireworks": Fireworks,
        "huggingface": HuggingFace,
        "mistral": MistralChat,
        "nvidia": Nvidia,
        "ollama": Ollama,
        "openrouter": OpenRouter,
        "sambanova": Sambanova,
        "together": Together,
        "litellm": LiteLLM,
        "vercel": v0,
        "v0": v0,
        "aws": AwsBedrock,
        "bedrock": AwsBedrock,
        "azure": AzureAIFoundry,
        "cerebras": Cerebras,
        "meta": Llama,
        "deepinfra": DeepInfra,
        "ibm": WatsonX,
        "watsonx": WatsonX,
    }
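
Note that several keys are aliases for the same model class ("aws"/"bedrock", "ibm"/"watsonx", "vercel"/"v0"), and the handler lowercases the provider before lookup, so capitalization does not matter. A minimal sketch with placeholder classes (the class bodies here are stand-ins, not the real imports):

```python
# Placeholder classes standing in for the real provider model classes.
class AwsBedrock: pass
class WatsonX: pass
class v0: pass

PROVIDER_MODEL_MAP = {
    "aws": AwsBedrock, "bedrock": AwsBedrock,
    "ibm": WatsonX, "watsonx": WatsonX,
    "vercel": v0, "v0": v0,
}

# Case-insensitive resolution, as performed by the handler.
def resolve(provider: str):
    return PROVIDER_MODEL_MAP[provider.lower()]

assert resolve("AWS") is resolve("Bedrock")    # aliases resolve to one class
assert resolve("IBM") is resolve("watsonx")
```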


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gwbischof/outsource-mcp'
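
The same request can be issued from Python with the standard library. This is a sketch; the endpoint URL is taken from the curl command above, and the actual network call is left commented out:

```python
import json
import urllib.request

# Endpoint from the curl example above.
url = "https://glama.ai/api/mcp/v1/servers/gwbischof/outsource-mcp"
req = urllib.request.Request(url, method="GET")

# Sending the request would return the server's directory metadata as JSON:
# with urllib.request.urlopen(req) as resp:
#     data = json.load(resp)
```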

If you have feedback or need assistance with the MCP directory API, please join our Discord server.