Server Configuration

Describes the environment variables required to run the server.

No arguments; the server requires no environment variables.

Tools

Functions exposed to the LLM to take actions.

get_model

Retrieve detailed information about a specific HUMMBL mental model using its code (e.g., P1, IN3, CO5).

list_all_models

Retrieve the complete list of all 120 HUMMBL mental models with basic information.

search_models

Search HUMMBL mental models by keyword across codes, names, and definitions.

recommend_models

Get recommended mental models based on a natural language problem description using HUMMBL REST API.

get_related_models

Get all models related to a specific model, with relationship details.

add_relationship

Add a relationship between two mental models with evidence.

get_methodology

Retrieve the canonical Self-Dialectical AI Systems methodology with HUMMBL Base120 mappings.

audit_model_references

Audit a list of HUMMBL model references for existence, transformation alignment, and duplicates.
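MCP tools are invoked over JSON-RPC 2.0 with the `tools/call` method. As a minimal sketch, the helper below builds such a request for the tools listed above; the argument names (`code`, `query`) are assumptions, not documented here.

```python
import json

def tools_call(req_id: int, name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request line for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Look up one model by its code, then search by keyword.
# The argument keys ("code", "query") are illustrative assumptions.
print(tools_call(1, "get_model", {"code": "P1"}))
print(tools_call(2, "search_models", {"query": "feedback"}))
```

Each line would be written to the server's stdin (or sent over its transport); the server replies with a matching `id`.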

Prompts

Interactive templates invoked by user choice.


No prompts

Resources

Contextual data attached and managed by the client.

all-models

Complete Base120 framework with all 120 mental models.

self-dialectical-methodology

Canonical Self-Dialectical AI Systems methodology with HUMMBL Base120 mappings.

self-dialectical-methodology-markdown

Human-readable markdown overview of the Self-Dialectical AI Systems methodology, derived from the canonical structured definition.
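Clients read these resources with the JSON-RPC `resources/read` method. The sketch below builds such a request; note that only the resource names are documented above, so the `hummbl://` URI scheme used here is an assumption.

```python
import json

def resources_read(req_id: int, uri: str) -> str:
    """Build a JSON-RPC 2.0 `resources/read` request for an MCP resource."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "resources/read",
        "params": {"uri": uri},
    })

# "hummbl://all-models" is a hypothetical URI for the all-models resource.
print(resources_read(1, "hummbl://all-models"))
```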

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hummbl-dev/mcp-server'
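The same endpoint can be queried from code. This is a minimal Python sketch using only the standard library; it assumes the endpoint returns a JSON body, and the shape of that body is not specified here.

```python
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/hummbl-dev/mcp-server"

def fetch_server_info(url: str = API_URL) -> dict:
    """Fetch the server's MCP directory entry and decode the JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Pretty-print whatever the directory API returns for this server.
    print(json.dumps(fetch_server_info(), indent=2))
```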

If you have feedback or need assistance with the MCP directory API, please join our Discord server.