Server Configuration

Describes the environment variables used to configure the server. All variables are optional and fall back to the defaults shown below.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| LLM_PROVIDER | No | The LLM provider to use (ollama or openai). | ollama |
| OLLAMA_MODEL | No | The model to use with Ollama. | llama2 |
| OPENAI_MODEL | No | The OpenAI model to use. | |
| OPENAI_API_KEY | No | The API key for OpenAI. | |
| OLLAMA_BASE_URL | No | The base URL for the local Ollama instance. | http://localhost:11434 |
| UPLOAD_MAX_FILES | No | Maximum number of files allowed for upload. | 10 |
| CHROMA_PERSIST_DIR | No | Absolute or relative path to the directory where Chroma DB data is persisted. | ./data/chroma |
| UPLOAD_MAX_FILE_SIZE_MB | No | Maximum allowed file size for uploads, in MB. | 50 |
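
As an illustrative sketch only (the values below are examples, not requirements), a local Ollama-backed setup could be configured by setting these environment variables before starting the server — typically via shell exports or an .env file, shown here in Python for concreteness:

```python
import os

# Select the LLM backend; "ollama" is the documented default provider.
os.environ["LLM_PROVIDER"] = "ollama"
os.environ["OLLAMA_MODEL"] = "llama2"
os.environ["OLLAMA_BASE_URL"] = "http://localhost:11434"

# Where Chroma DB persists its vector data (relative paths are allowed).
os.environ["CHROMA_PERSIST_DIR"] = "./data/chroma"

# Upload limits (counts and per-file size in MB).
os.environ["UPLOAD_MAX_FILES"] = "10"
os.environ["UPLOAD_MAX_FILE_SIZE_MB"] = "50"
```

To use OpenAI instead, you would set LLM_PROVIDER to openai and supply OPENAI_MODEL and OPENAI_API_KEY.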

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | { "listChanged": false } |
| prompts | { "listChanged": false } |
| resources | { "subscribe": false, "listChanged": false } |
| experimental | {} |

Tools

Functions exposed to the LLM to take actions

assess_document

Assess a security document (PDF, Word, etc.) and return a risk report.

Args:
- file_path: Absolute path to the file to be assessed.
- scenario_id: The assessment scenario ID (default: "default").

Returns: JSON string containing the assessment report (risks, gaps, remediations).
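
As a sketch of how a client might invoke this tool over MCP's JSON-RPC transport (the file path below is a placeholder, not a real document):

```python
import json

# Hypothetical MCP tools/call payload for assess_document.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "assess_document",
        "arguments": {
            "file_path": "/tmp/security-policy.pdf",  # placeholder path
            "scenario_id": "default",
        },
    },
}
print(json.dumps(request, indent=2))
```

The tool's result is itself a JSON string, so a client would parse it a second time to extract the risks, gaps, and remediations fields.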

query_knowledge_base

Query the internal security knowledge base (policies, standards).

Args:
- query: The search query (e.g., "password complexity requirements").
- top_k: Number of results to return.

Returns: JSON string with retrieved document chunks.
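
An analogous tools/call sketch for this tool (the query text and top_k value are illustrative):

```python
import json

# Hypothetical MCP tools/call payload for query_knowledge_base.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query_knowledge_base",
        "arguments": {
            "query": "password complexity requirements",
            "top_k": 5,  # number of chunks to retrieve
        },
    },
}
print(json.dumps(request, indent=2))
```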

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

| Name | Description |
| --- | --- |
| get_kb_stats | Get statistics about the knowledge base (document count, etc.). |

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/arthurpanhku/DocSentinel'
