Server Configuration

Describes the environment variables used to configure the server (all are optional).

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| PROLOG_REASONER_LLM_MODEL | No | LLM model for library mode | gpt-5.4-mini |
| PROLOG_REASONER_LOG_LEVEL | No | Log level for the server | INFO |
| PROLOG_REASONER_SWIPL_PATH | No | Path to SWI-Prolog executable | swipl |
| PROLOG_REASONER_LLM_API_KEY | No | API key for library mode only; leave unset for MCP | |
| PROLOG_REASONER_LLM_PROVIDER | No | LLM provider for library mode (openai or anthropic) | openai |
| PROLOG_REASONER_LLM_TEMPERATURE | No | LLM temperature for library mode | 0.0 |
| PROLOG_REASONER_LLM_TIMEOUT_SECONDS | No | LLM timeout for library mode | 30.0 |
| PROLOG_REASONER_EXECUTION_TIMEOUT_SECONDS | No | Execution timeout for Prolog | 10.0 |
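When running over MCP, only the non-LLM variables apply. A minimal sketch of overriding some defaults in the launching shell; the values shown are illustrative, not recommendations:

```shell
# Hypothetical overrides set before launching the server process.
export PROLOG_REASONER_SWIPL_PATH="swipl"                 # use the swipl found on PATH
export PROLOG_REASONER_LOG_LEVEL="DEBUG"                  # verbose logging while testing
export PROLOG_REASONER_EXECUTION_TIMEOUT_SECONDS="20.0"   # allow longer Prolog runs
```

Variables left unset fall back to the defaults in the table above.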

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|------------|---------|
| tools | `{"listChanged": true}` |
| logging | `{}` |
| prompts | `{"listChanged": false}` |
| resources | `{"subscribe": false, "listChanged": false}` |
| experimental | `{}` |

Tools

Functions exposed to the LLM to take actions

execute_prolog

Execute Prolog code and return reasoning results.

Write Prolog facts and rules, then run a query against them. Supports CLP(FD) constraints, negation-as-failure, and all standard SWI-Prolog features.
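A sketch of the kind of program this tool accepts, assuming the code and query are passed as described above; the predicate names and queries are illustrative, not part of the server's API:

```prolog
% Hypothetical input for execute_prolog: facts, a rule, and CLP(FD).
:- use_module(library(clpfd)).

parent(tom, ann).
parent(ann, joe).
grandparent(G, C) :- parent(G, P), parent(P, C).

% Example queries against this program:
%   ?- grandparent(tom, Who).            % Who = joe
%   ?- X in 1..10, X * X #= 49, label([X]).  % X = 7
%   ?- \+ parent(joe, _).                % negation-as-failure: true
```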

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rikarazome/prolog-reasoner'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.