Lucidity MCP

Server Configuration

Describes the environment variables used to configure the server.

Name      | Required | Description                                                      | Default
HOST      | No       | Host to bind the server to (use 0.0.0.0 for all interfaces)      | 127.0.0.1
PORT      | No       | Port to listen on for network connections                        | 6969
DEBUG     | No       | Enable debug logging                                             | false
VERBOSE   | No       | Enable verbose logging for HTTP requests                         | false
LOG_FILE  | No       | Path to log file (required for stdio transport if logs enabled)  |
LOG_LEVEL | No       | Set the logging level                                            | INFO
TRANSPORT | No       | Transport type to use (stdio for terminal, sse for network)      | stdio
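
The snippet below is a minimal sketch of launching the server over stdio from the official mcp Python SDK, passing configuration through the environment variables above. The lucidity-mcp command name and the log path are assumptions; substitute the actual entry point and paths for your installation.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Launch Lucidity MCP as a stdio subprocess, passing configuration through
    # the environment variables from the table above. The "lucidity-mcp" command
    # name and log path are assumptions, not taken from this page.
    server_params = StdioServerParameters(
        command="lucidity-mcp",
        env={
            "TRANSPORT": "stdio",             # terminal transport (the default)
            "LOG_LEVEL": "INFO",              # default logging level
            "LOG_FILE": "/tmp/lucidity.log",  # required for stdio transport if logs are enabled
            "DEBUG": "false",
        },
    )

    async def main() -> None:
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())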

Schema

Prompts

Interactive templates invoked by user choice

analyze_changes

Generate a prompt for analyzing git code changes via MCP. This function creates a structured prompt that will be passed back to the AI model through the Model Context Protocol, guiding it to analyze git changes effectively.

Args:
  code: The changed code to analyze
  language: The programming language of the code
  original_code: The original code before changes (optional)

Returns: A formatted prompt for the AI to analyze git changes
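
For illustration, the prompt could be requested like this from inside the main() coroutine of the connection sketch above. The argument names come from the description; the code strings passed in are placeholders.

    # Inside an initialized ClientSession (see the connection sketch above),
    # request the analyze_changes prompt. The code strings are placeholder examples.
    result = await session.get_prompt(
        "analyze_changes",
        arguments={
            "code": "def add(a, b):\n    return a + b\n",
            "language": "python",
            # original_code is optional; include it to show what changed
            "original_code": "def add(a, b):\n    return a - b\n",
        },
    )
    for message in result.messages:
        print(message.role, message.content)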

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

analyze_changes

Prepare git changes for analysis through MCP.

This tool examines the current git diff, extracts changed code, and prepares structured data with context for the AI to analyze.

The tool doesn't perform analysis itself - it formats the git diff data and provides analysis instructions which get passed back to the AI model through the Model Context Protocol.

Args:
  workspace_root: The root directory of the workspace/git repository
  path: Optional specific file path to analyze

Returns: Structured git diff data with analysis instructions for the AI
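
Continuing inside the same ClientSession as in the sketches above, here is one way the tool could be invoked. The repository path is a placeholder, and the optional path argument is shown commented out.

    # Ask Lucidity to prepare the current git diff for analysis.
    # The repository path below is a placeholder.
    result = await session.call_tool(
        "analyze_changes",
        arguments={
            "workspace_root": "/path/to/your/repo",
            # "path": "src/example.py",  # optionally restrict analysis to one file
        },
    )
    for item in result.content:
        if item.type == "text":
            print(item.text)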

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyperb1iss/lucidity-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.