
Just Prompt

by disler

Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| OLLAMA_HOST | No | The host URL for Ollama | http://localhost:11434 |
| GROQ_API_KEY | No | Your Groq API key | |
| GEMINI_API_KEY | No | Your Google Gemini API key | |
| OPENAI_API_KEY | No | Your OpenAI API key | |
| DEEPSEEK_API_KEY | No | Your DeepSeek API key | |
| ANTHROPIC_API_KEY | No | Your Anthropic API key | |
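Since every variable is optional, you only need to set the ones for the providers you plan to use. A minimal sketch of the exports (all key values below are placeholders, not real credentials):

```shell
# Optional: override the Ollama endpoint (this is the documented default).
export OLLAMA_HOST="http://localhost:11434"

# Set only the API keys for the providers you intend to call.
# The values here are placeholders.
export OPENAI_API_KEY="sk-placeholder"
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export GEMINI_API_KEY="placeholder"
export GROQ_API_KEY="gsk-placeholder"
export DEEPSEEK_API_KEY="placeholder"
```

Providers whose keys are unset are simply unavailable to the server; no single key is required.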

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| prompt | Send a prompt to multiple LLM models |
| prompt_from_file | Send a prompt from a file to multiple LLM models |
| prompt_from_file_to_file | Send a prompt from a file to multiple LLM models and save responses to files |
| ceo_and_board | Send a prompt to multiple 'board member' models and have a 'CEO' model make a decision based on their responses |
| list_providers | List all available LLM providers |
| list_models | List all available models for a specific LLM provider |
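MCP clients invoke these tools with JSON-RPC 2.0 `tools/call` messages over the server's transport (typically stdio). A minimal sketch of such a request for the `prompt` tool is below; the argument names (`text`, `models_prefixed_by_provider`) and the model identifier are assumptions, so confirm the actual schema with a `tools/list` request first.

```shell
# Build a JSON-RPC 2.0 tools/call request for the `prompt` tool.
# NOTE: the argument names and the model string are illustrative
# assumptions, not confirmed by this page.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"prompt","arguments":{"text":"Hello, world","models_prefixed_by_provider":["openai:gpt-4o-mini"]}}}'

# Print the message; an MCP client would write this to the
# server's stdin and read the JSON-RPC response from its stdout.
echo "$REQUEST"
```

In practice your MCP client (e.g. an editor integration) constructs these messages for you; the sketch only shows the wire format.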

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/disler/just-prompt'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.