PromptTuner MCP

by j0hanz

Server Configuration

Environment variables used to configure the server; all are optional.

| Name | Required | Description | Default |
|---|---|---|---|
| LLM_MODEL | No | Override the default model for the chosen provider | |
| LLM_PROVIDER | No | LLM provider to use (`openai`, `anthropic`, or `google`) | `openai` |
| GOOGLE_API_KEY | No | Google API key for Gemini models (Gemini 2.0 Flash, Gemini 1.5 Pro); get one from https://aistudio.google.com/apikey | |
| OPENAI_API_KEY | No | OpenAI API key for GPT models (gpt-4o, gpt-4o-mini, gpt-4-turbo); get one from https://platform.openai.com/api-keys | |
| ANTHROPIC_API_KEY | No | Anthropic API key for Claude models (Claude 3.5 Sonnet/Haiku); get one from https://console.anthropic.com | |
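These variables are typically supplied through an MCP client's server configuration rather than a shell profile. A minimal sketch of such an entry, assuming the server is launched with `npx` and using the directory slug `prompt-tuner-mcp-server` as a placeholder package name (check the project's README for the actual launch command):

```json
{
  "mcpServers": {
    "prompt-tuner": {
      "command": "npx",
      "args": ["-y", "prompt-tuner-mcp-server"],
      "env": {
        "LLM_PROVIDER": "openai",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Only the API key matching the chosen `LLM_PROVIDER` needs to be set; `LLM_MODEL` can be added to the `env` block to override the provider's default model.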

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|---|---|
| tools | `{ "listChanged": true }` |
| logging | `{}` |
| prompts | `{ "listChanged": true }` |
| resources | `{ "listChanged": true }` |
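A client can inspect this capabilities object to decide whether to subscribe to list-change notifications. A small sketch of that check, using the capability values from the table above as sample data (the helper function is illustrative, not part of this server):

```python
import json

# Sample capabilities object, matching what the table above advertises.
capabilities = json.loads("""
{
  "tools": {"listChanged": true},
  "logging": {},
  "prompts": {"listChanged": true},
  "resources": {"listChanged": true}
}
""")

def supports_list_changed(caps: dict, name: str) -> bool:
    """Return True if the named capability advertises listChanged notifications."""
    return bool(caps.get(name, {}).get("listChanged", False))

print(supports_list_changed(capabilities, "tools"))    # tools emit list-change events
print(supports_list_changed(capabilities, "logging"))  # logging declares no such flag
```

Capabilities the server does not declare at all are treated the same as ones declared without the flag: the lookup falls through to `False`.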

Tools

Functions exposed to the LLM to take actions

| Name | Description |
|---|---|
| fix_prompt | Polish and refine a prompt for better clarity, readability, and flow. |
| boost_prompt | Transform a prompt using prompt-engineering best practices for maximum clarity and effectiveness. |
| crafting_prompt | Generate a structured, reusable workflow prompt for complex tasks based on a raw request and a few settings. |
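Over MCP, each of these tools is invoked with a standard `tools/call` JSON-RPC request. A sketch of what such a request might look like for `fix_prompt` (the argument name `prompt` is an assumption; consult the server's published tool schema for the actual parameter names):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fix_prompt",
    "arguments": {
      "prompt": "pls fix this promt so its more cleer"
    }
  }
}
```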

Prompts

Interactive templates invoked by user choice

| Name | Description |
|---|---|
| fix-prompt | Template for fixing grammar and clarity |
| boost-prompt | Template for boosting prompt effectiveness |

Resources

Contextual data attached and managed by the client

No resources.


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/j0hanz/prompt-tuner-mcp-server'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.