Outsource MCP

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Schema

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description
outsource_text
Delegate text generation to another AI model. Use this when you need capabilities or perspectives from a different model than yourself.

Args:
  provider: The AI provider to use (e.g., "openai", "anthropic", "google", "groq")
  model: The specific model identifier (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
  prompt: The instruction or query to send to the external model

Returns:
  The text response from the external model, or an error message if the request fails

Example usage:
  To get a different perspective: provider="anthropic", model="claude-3-5-sonnet-20241022", prompt="Analyze this problem from a different angle..."
  To leverage specialized models: provider="deepseek", model="deepseek-coder", prompt="Write optimized Python code for..."
outsource_image
Delegate image generation to an external AI model. Use this when you need to create visual content.

Args:
  provider: The AI provider to use (currently only "openai" is supported)
  model: The image model to use ("dall-e-3" for high quality, "dall-e-2" for faster/cheaper)
  prompt: A detailed description of the image you want to generate

Returns:
  The URL of the generated image, which can be shared with users or used in responses

Example usage:
  For high-quality images: provider="openai", model="dall-e-3", prompt="A photorealistic rendering of..."
  For quick concepts: provider="openai", model="dall-e-2", prompt="A simple sketch showing..."

Note: Only OpenAI currently supports image generation. Other providers will return an error.
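
The sketch below shows one way a client could invoke the outsource_text tool over stdio using the official MCP Python SDK. The launch command ("uvx outsource-mcp") is an assumption about how the server is started locally; the tool name and arguments come from the table above.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Assumed launch command; substitute whatever command starts the server locally.
    params = StdioServerParameters(command="uvx", args=["outsource-mcp"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the outsource_text tool with the arguments documented above.
            result = await session.call_tool(
                "outsource_text",
                arguments={
                    "provider": "anthropic",
                    "model": "claude-3-5-sonnet-20241022",
                    "prompt": "Analyze this problem from a different angle...",
                },
            )
            print(result.content)

asyncio.run(main())

Calling outsource_image works the same way: pass the tool name "outsource_image" with provider, model, and prompt arguments, and read the image URL from the returned content.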

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gwbischof/outsource-mcp'
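
The same lookup can be done from code. Below is a minimal Python sketch using the requests library; the exact shape of the JSON response is not shown here and is not assumed.

import requests

# Fetch this server's directory entry from the Glama MCP API.
resp = requests.get("https://glama.ai/api/mcp/v1/servers/gwbischof/outsource-mcp")
resp.raise_for_status()
print(resp.json())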

If you have feedback or need assistance with the MCP directory API, please join our Discord server.