Houtini-lm

by houtini-ai

custom_prompt

Execute custom prompts for code analysis and generation tasks using a local LLM, with optional file context support for single- or multi-file projects.

Instructions

Universal fallback executor for any custom prompt, with optional file context. It uses dynamic token allocation based on your loaded model, so it can handle everything from quick tasks to comprehensive multi-file analysis. This is the Swiss Army knife for when no other specialized function matches your needs.

WORKFLOW: Flexible analysis and generation for any development task
TIP: Provide clear instructions for any analysis or generation task
SAVES: Claude context for strategic decisions

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| analysisDepth | No | Level of analysis detail | detailed |
| analysisType | No | Type of analysis to perform | general |
| code | No | The code to analyze (for single-file analysis) | |
| context | No | Optional structured context object for the task | |
| filePath | No | Path to single file to analyze | |
| files | No | Array of specific file paths to include as context | |
| language | No | Programming language (if applicable) | text |
| maxDepth | No | Maximum directory depth for multi-file discovery (1-5) | 3 |
| projectPath | No | Path to project root (for multi-file analysis) | |
| prompt | Yes | The custom prompt/task to send to local LLM | |
| working_directory | No | Working directory context (defaults to current working directory) | |
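
For instance, a minimal single-file call could look like the sketch below. Only `prompt` is required; the file path, prompt text, and option values here are purely illustrative:

```json
{
  "prompt": "Review this module for error-handling gaps and suggest fixes",
  "filePath": "./src/server.js",
  "language": "javascript",
  "analysisType": "technical",
  "analysisDepth": "detailed"
}
```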

Input Schema (JSON Schema)

{ "properties": { "analysisDepth": { "default": "detailed", "description": "Level of analysis detail", "enum": [ "basic", "detailed", "comprehensive" ], "type": "string" }, "analysisType": { "default": "general", "description": "Type of analysis to perform", "enum": [ "general", "technical", "creative", "analytical" ], "type": "string" }, "code": { "description": "The code to analyze (for single-file analysis)", "type": "string" }, "context": { "description": "Optional structured context object for the task", "type": "object" }, "filePath": { "description": "Path to single file to analyze", "type": "string" }, "files": { "description": "Array of specific file paths to include as context", "type": "array" }, "language": { "default": "text", "description": "Programming language (if applicable)", "type": "string" }, "maxDepth": { "default": 3, "description": "Maximum directory depth for multi-file discovery (1-5)", "type": "number" }, "projectPath": { "description": "Path to project root (for multi-file analysis)", "type": "string" }, "prompt": { "description": "The custom prompt/task to send to local LLM", "type": "string" }, "working_directory": { "description": "Working directory context (defaults to current working directory)", "type": "string" } }, "required": [ "prompt" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/houtini-ai/lm'
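
For a quick look at the payload, the same request can be piped through jq. This sketch assumes only that the endpoint returns JSON; the response shape isn't documented on this page, so `jq .` simply pretty-prints whatever comes back:

```sh
# Fetch this server's directory entry (URL from the listing above)
# and pretty-print the JSON response without assuming its fields.
curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/houtini-ai/lm' | jq .
```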

If you have feedback or need assistance with the MCP directory API, please join our Discord server.