
Server Configuration

Describes the environment variables used to configure the server.

Name        Required  Description                               Default
LOG_LEVEL   No        Logging level: debug, info, warn, error   info
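The server might resolve this variable at startup roughly as follows — a minimal sketch; the variable name, allowed values, and default come from the table above, but the validation logic is an assumption, not the server's actual code:

```python
import os

# Allowed values and the "info" default per the table above;
# falling back to "info" on an unknown value is an illustrative choice.
VALID_LEVELS = ("debug", "info", "warn", "error")

def resolve_log_level(env=os.environ):
    level = env.get("LOG_LEVEL", "info").lower()
    return level if level in VALID_LEVELS else "info"
```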

Tools

Functions exposed to the LLM to take actions

generate_prompt

Transform a raw idea into a well-structured, actionable prompt optimized for AI assistants.

Use this tool when you need to:
• Create a new prompt from scratch
• Structure a vague idea into a clear request
• Generate role-specific prompts (coding, writing, research, etc.)

Supports templates: coding (for programming tasks), writing (for content creation), research (for investigation), analysis (for data/business analysis), factcheck (for verification), general (versatile).

IMPORTANT: When available, pass workspace context (file structure, package.json, tech stack) to generate prompts that align with the user's project.
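A client invokes the tool through a standard MCP `tools/call` request. The argument names below (`idea`, `template`, `workspace_context`) are guesses inferred from the description above, not a confirmed schema:

```python
import json

# Hypothetical tools/call request; only "generate_prompt" and the template
# names are documented above — the argument field names are assumptions.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "generate_prompt",
        "arguments": {
            "idea": "add caching to the API layer",   # raw idea (assumed field)
            "template": "coding",                     # one of the templates listed above
            "workspace_context": "Node.js + Express"  # assumed field for project context
        },
    },
}
payload = json.dumps(request)
```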

refine_prompt

Iteratively improve an existing prompt based on specific feedback.

Use this tool when you need to:
• Improve a prompt that didn't get good results
• Add missing context or constraints
• Make a prompt more specific or clearer
• Adapt a prompt for a different AI model

The tool preserves the original structure while applying targeted improvements.

IMPORTANT: When available, pass workspace context (file structure, package.json, tech stack) to ensure refined prompts comply with the user's project scope and original request.
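A refinement call would pair the existing prompt with targeted feedback — again a sketch, where the `prompt`, `feedback`, and `workspace_context` field names are assumptions based on the description, not a confirmed schema:

```python
import json

# Hypothetical refine_prompt invocation; argument names are illustrative.
refine_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "refine_prompt",
        "arguments": {
            "prompt": "Write unit tests for the cache module.",
            "feedback": "Specify the test framework and cover eviction edge cases.",
            "workspace_context": "TypeScript project using Jest",  # assumed field
        },
    },
}
refine_payload = json.dumps(refine_request)
```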

analyze_prompt

Evaluate prompt quality and get actionable improvement suggestions.

Use this tool when you need to:
• Assess if a prompt is well-structured
• Identify weaknesses before using a prompt
• Get specific suggestions for improvement
• Compare prompt quality before/after refinement

Returns scores (0-100) for: clarity, specificity, structure, actionability.
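A client might aggregate the four scores to decide whether a prompt needs another refinement pass — a sketch where the score dimensions mirror the list above, but the result shape and the 75-point threshold are arbitrary assumptions:

```python
# Hypothetical analyze_prompt result; field names mirror the dimensions above.
result = {"clarity": 82, "specificity": 64, "structure": 90, "actionability": 71}

# Flag any dimension under an (arbitrary) threshold as needing refinement.
THRESHOLD = 75
weak_dimensions = [name for name, score in result.items() if score < THRESHOLD]
overall = sum(result.values()) / len(result)
```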

get_server_status

Get PromptArchitect server status and performance metrics.

Use this tool to check:
• Whether AI (Gemini) is available
• Cache hit rate and request statistics
• Average response latency

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client

Name                  Description
Template Categories   List of available template categories
Debug Code            Analyze and fix bugs in code
Code Review           Review code for quality, security, and best practices
Blog Post             Generate a blog post outline or draft
Professional Email    Draft a professional email
Research Summary      Summarize research findings
SWOT Analysis         Conduct a SWOT analysis
Comparison Analysis   Compare multiple options or solutions
Fact Check            Verify claims and statements
Coding Templates      All templates in the coding category
Writing Templates     All templates in the writing category
Research Templates    All templates in the research category
Analysis Templates    All templates in the analysis category
Factcheck Templates   All templates in the factcheck category

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/xXMSGXx/promptarchitect-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.