Glama

Server Configuration

Describes the environment variables used to configure the server. All of them are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| PROLOG_REASONER_LLM_MODEL | No | LLM model for library mode | gpt-5.4-mini |
| PROLOG_REASONER_LOG_LEVEL | No | Log level for the server | INFO |
| PROLOG_REASONER_SWIPL_PATH | No | Path to SWI-Prolog executable | swipl |
| PROLOG_REASONER_LLM_API_KEY | No | API key for library mode only (leave unset for MCP) | (none) |
| PROLOG_REASONER_LLM_PROVIDER | No | LLM provider for library mode (openai or anthropic) | openai |
| PROLOG_REASONER_LLM_TEMPERATURE | No | LLM temperature for library mode | 0.0 |
| PROLOG_REASONER_LLM_TIMEOUT_SECONDS | No | LLM timeout for library mode | 30.0 |
| PROLOG_REASONER_EXECUTION_TIMEOUT_SECONDS | No | Execution timeout for Prolog | 10.0 |

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | {"listChanged": true} |
| logging | {} |
| prompts | {"listChanged": false} |
| resources | {"subscribe": false, "listChanged": false} |
| extensions | {"io.modelcontextprotocol/ui": {}} |
| experimental | {} |

Tools

Functions exposed to the LLM to take actions

NameDescription
execute_prolog

Execute Prolog code and return reasoning results.

Write Prolog facts and rules, then run a query against them. Supports CLP(FD) constraints, negation-as-failure, and all standard SWI-Prolog features.
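To make the facts-plus-query flow concrete, here is a sketch of assembling an execute_prolog call. The `prolog_code` parameter is named in the save_rule_base docs below; the `query` parameter name and the overall argument schema are assumptions. The embedded program demonstrates the two features called out above: a CLP(FD) constraint and negation-as-failure.

```python
import json

# Prolog facts and rules to ship to the tool. Uses CLP(FD) and
# negation-as-failure, both of which execute_prolog supports.
prolog_code = """
:- use_module(library(clpfd)).

% Facts
parent(tom, ann).
parent(ann, joe).

% CLP(FD): X and Y are digits that sum to 10.
digits_sum(X, Y) :- [X, Y] ins 0..9, X + Y #= 10, label([X, Y]).

% Negation-as-failure: a parent with no recorded parent of their own.
founder(P) :- parent(P, _), \\+ parent(_, P).
"""

# Hypothetical argument shape for the MCP tool call.
arguments = {
    "prolog_code": prolog_code,
    "query": "founder(Who).",  # "query" is an assumed parameter name
}
print(json.dumps(arguments, indent=2))
```

Running the query above would bind `Who = tom`, since tom is a parent but has no parent fact of his own.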

list_rule_bases

List all saved rule bases with description and tags.

Returns {"rule_bases": [{"name": str, "description": str, "tags": list[str]}, ...]} sorted by name. Metadata is extracted from the leading % description: / % tags: comments of each rule base file (see §4.10).
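The metadata extraction described above can be sketched as follows: the leading `% description:` and `% tags:` comments of a rule base file are parsed into the `{"name", "description", "tags"}` shape that list_rule_bases returns. This parser is illustrative, not the server's actual implementation, and the example rule base file is invented.

```python
def parse_rule_base_metadata(name: str, source: str) -> dict:
    """Extract description and tags from the leading comment block."""
    description, tags = "", []
    for line in source.splitlines():
        line = line.strip()
        if not line.startswith("%"):
            break  # metadata must appear before the first non-comment line
        body = line.lstrip("%").strip()
        if body.lower().startswith("description:"):
            description = body[len("description:"):].strip()
        elif body.lower().startswith("tags:"):
            tags = [t.strip() for t in body[len("tags:"):].split(",") if t.strip()]
    return {"name": name, "description": description, "tags": tags}


example = """% description: Chess piece movement rules
% tags: chess, games
piece_moves(rook, straight).
"""
print(parse_rule_base_metadata("chess_moves", example))
# {'name': 'chess_moves', 'description': 'Chess piece movement rules', 'tags': ['chess', 'games']}
```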

get_rule_base

Retrieve the Prolog source of a saved rule base.

save_rule_base

Save a named rule base containing Prolog rules that can be reused across execute_prolog calls.

Use this for stable, reusable knowledge (e.g. piece_moves for chess piece movement rules). For one-time facts, include them directly in prolog_code instead.

delete_rule_base

Delete a saved rule base by name.

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rikarazome/prolog-reasoner'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.