
PromptTuner MCP

by j0hanz

Server Configuration

Describes the environment variables used to configure the server. All variables are optional.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| LLM_MODEL | No | Override the default model for the chosen provider | |
| LLM_PROVIDER | No | Choose the LLM provider (openai, anthropic, or google) | openai |
| GOOGLE_API_KEY | No | Google API key for Gemini models (Gemini 2.0 Flash, Gemini 1.5 Pro) - Get from https://aistudio.google.com/apikey | |
| OPENAI_API_KEY | No | OpenAI API key for GPT models (gpt-4o, gpt-4o-mini, gpt-4-turbo) - Get from https://platform.openai.com/api-keys | |
| ANTHROPIC_API_KEY | No | Anthropic API key for Claude models (Claude 3.5 Sonnet/Haiku) - Get from https://console.anthropic.com | |
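For example, an OpenAI-backed setup might export the variables before starting the server. This is a minimal sketch: the key value and the launch command are illustrative placeholders, and the exact start command depends on how your MCP client runs the server.

```bash
# Illustrative configuration for the OpenAI provider.
export LLM_PROVIDER=openai       # one of: openai, anthropic, google (defaults to openai)
export OPENAI_API_KEY=sk-...     # API key for the chosen provider (placeholder value)
export LLM_MODEL=gpt-4o-mini     # optional: override the provider's default model
# Start the server afterwards with your MCP client's configured command.
```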

Tools

Functions exposed to the LLM to take actions


No tools

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/j0hanz/prompt-tuner-mcp-server'
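For readability, the JSON response can be piped through a formatter such as jq (a sketch; jq must be installed separately):

```bash
# Fetch this server's directory entry and pretty-print the JSON response (assumes jq is installed)
curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/j0hanz/prompt-tuner-mcp-server' | jq .
```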

If you have feedback or need assistance with the MCP directory API, please join our Discord server.