collect_legacy_project_intake

Collect and persist legacy project context to improve AI guidance quality while reducing repeated prompts and token usage.

Instructions

Collect and persist legacy-project contextual intake to improve AI guidance quality while minimizing repeated prompts and token usage.

Input Schema

| Name | Required |
| --- | --- |
| intakePath | No |
| dryRun | No |
| reason | No |
| maxDiffLines | No |
| askForMissingContext | No |
| projectGoal | No |
| businessDomain | No |
| criticality | No |
| runtimeLandscape | No |
| ui5RuntimeVersion | No |
| allowedRefactorScope | No |
| mustKeepStableAreas | No |
| knownPainPoints | No |
| constraints | No |
| complianceRequirements | No |
| notes | No |
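Every input field is optional, so a client can supply as much or as little context as it has. A minimal sketch of an arguments payload for this tool follows; the field names come from the input schema above, but the values and their types are illustrative assumptions, not documented examples:

```python
import json

# Hypothetical intake arguments. Field names match the tool's input
# schema; the values and types below are illustrative assumptions only.
arguments = {
    "projectGoal": "Stabilize and incrementally modernize a legacy UI5 app",
    "businessDomain": "order management",
    "criticality": "high",
    "ui5RuntimeVersion": "1.71",
    "allowedRefactorScope": "controllers and formatters only",
    "knownPainPoints": "tightly coupled views, no unit tests",
    "dryRun": True,  # preview the persisted intake without writing it
}

# Serialize as it would appear in a tool-call request body.
payload = json.dumps(arguments, indent=2)
print(payload)
```

Since `dryRun` is set, a call like this would be expected to preview the intake rather than persist it, which is a reasonable first step before committing context to disk.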

Output Schema

| Name | Required |
| --- | --- |
| dryRun | Yes |
| changed | Yes |
| preview | Yes |
| project | Yes |
| summary | Yes |
| questions | Yes |
| intakePath | Yes |
| applyResult | Yes |
| missingContext | Yes |
| needsUserInput | Yes |
| qualityPriority | Yes |
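The `missingContext` and `needsUserInput` fields suggest a prompt-then-retry loop on the client side. A hedged sketch of how a client might inspect a result: the field names below come from the output schema, but the value types and the overall shape are assumptions for illustration:

```python
# Hypothetical tool result. Field names are taken from the output
# schema; the value types and contents are assumptions only.
result = {
    "dryRun": True,
    "changed": False,
    "preview": "",
    "project": {},
    "summary": "Dry run: no intake file was written.",
    "questions": [],
    "intakePath": "legacy-intake.json",  # hypothetical path
    "applyResult": None,
    "missingContext": ["businessDomain"],
    "needsUserInput": True,
    "qualityPriority": "maintainability",
}

# If the tool reports missing context, collect it from the user
# before re-invoking the tool with the extra fields filled in.
follow_ups = []
if result["needsUserInput"]:
    follow_ups = result["missingContext"]
print(follow_ups)
```

Under these assumptions, a client would ask the user about each entry in `follow_ups` and then call the tool again with those fields populated, which is consistent with the tool's stated goal of reducing repeated prompts.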

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/santiagosanmartinn/mcpui5server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.