
MCP-Creator-MCP

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| LOG_LEVEL | No | Logging level for the application | INFO |
| GRADIO_SHARE | No | Whether to create a public share link for the Gradio interface | false |
| OPENAI_API_KEY | No | Your OpenAI API key for AI guidance (at least one AI provider key is required) | |
| OLLAMA_BASE_URL | No | Base URL for Ollama service (at least one AI provider is required) | http://localhost:11434 |
| ANTHROPIC_API_KEY | No | Your Anthropic API key for AI guidance (at least one AI provider key is required) | |
| DEFAULT_OUTPUT_DIR | No | Directory where generated MCP servers will be saved | ./mcp_servers |
| GRADIO_SERVER_PORT | No | Port for the Gradio interface server | 7860 |
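As a rough sketch, the snippet below shows one way a Python process could read these variables with their documented defaults. The load_settings helper and the use of python-dotenv are illustrative assumptions, not part of the published mcp-creator-mcp code.

```python
import os

from dotenv import load_dotenv  # assumption: python-dotenv is installed for local .env files


def load_settings() -> dict:
    """Read the documented environment variables, falling back to their defaults."""
    load_dotenv()  # no-op if no .env file is present
    return {
        "log_level": os.getenv("LOG_LEVEL", "INFO"),
        "gradio_share": os.getenv("GRADIO_SHARE", "false").lower() == "true",
        "openai_api_key": os.getenv("OPENAI_API_KEY"),          # no default; optional
        "anthropic_api_key": os.getenv("ANTHROPIC_API_KEY"),    # no default; optional
        "ollama_base_url": os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
        "default_output_dir": os.getenv("DEFAULT_OUTPUT_DIR", "./mcp_servers"),
        "gradio_server_port": int(os.getenv("GRADIO_SERVER_PORT", "7860")),
    }
    # Note: at least one AI provider (OpenAI, Anthropic, or Ollama) must be reachable.
```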

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

**create_mcp_server**
Create a new MCP server based on specifications (see the example call after this list).

Important notes:
- AI sampling (ctx.sample) is not currently supported in Claude Desktop
- Use modern typing: dict, list, str | None instead of Dict, List, Optional
- Generated servers include proper process cleanup and error handling
- All generated code uses working MCP SDK patterns

Args:
- name: Name of the MCP server (must be a valid Python identifier)
- description: Description of what the server does
- language: Programming language (python, gradio, typescript)
- template_type: Type of template (basic, fastmcp_server)
- features: List of features to include (tools, resources, prompts)
- output_dir: Output directory (defaults to the configured default)

Returns: Status message with creation details and next steps

**list_templates**
List available templates for MCP server creation.

Args:
- language: Filter by language (optional)

Returns: Formatted list of available templates

**get_ai_guidance**
Get structured guidance for MCP server development. This tool provides structured, deterministic guidance instead of AI-generated content; for dynamic AI assistance, use Claude Desktop's built-in capabilities directly.

Important notes:
- AI sampling (ctx.sample) is NOT currently supported in Claude Desktop
- Use modern typing: dict, list, str | None instead of Dict, List, Optional
- Always implement proper process cleanup and signal handling
- Follow MCP SDK patterns for tools, resources, and prompts

Args:
- topic: Topic to get guidance on (best_practices, security, performance, typing, etc.)
- server_type: Type of server for contextualized advice

Returns: Structured guidance and recommendations with working code patterns

**save_workflow**
Save a creation workflow for reuse.

Args:
- name: Workflow name
- description: Workflow description
- steps: List of workflow steps

Returns: Confirmation message
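As an illustration of how a client might invoke create_mcp_server, here is a sketch using the official MCP Python SDK over stdio. The launch command ("uv run mcp-creator"), the environment override, and the argument values are assumptions for demonstration only; substitute whatever command your installation actually uses.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumption: the server starts with "uv run mcp-creator"; adjust to your setup.
server_params = StdioServerParameters(
    command="uv",
    args=["run", "mcp-creator"],
    env={"DEFAULT_OUTPUT_DIR": "./mcp_servers"},
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "create_mcp_server",
                arguments={
                    "name": "weather_server",  # must be a valid Python identifier
                    "description": "Fetches weather forecasts",
                    "language": "python",
                    "template_type": "fastmcp_server",
                    "features": ["tools", "resources"],
                },
            )
            print(result.content)  # status message with creation details and next steps


asyncio.run(main())
```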

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/angrysky56/mcp-creator-mcp'
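A minimal Python equivalent of the curl call, assuming the requests library is available; the response schema is not documented in this section, so the sketch simply prints the raw JSON.

```python
import requests  # assumption: requests is installed; any HTTP client works

url = "https://glama.ai/api/mcp/v1/servers/angrysky56/mcp-creator-mcp"
response = requests.get(url, timeout=10)
response.raise_for_status()
print(response.json())  # exact response fields are not documented here
```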

If you have feedback or need assistance with the MCP directory API, please join our Discord server.