get_context_for_prompt

Retrieves relevant codebase context including file summaries, code snippets, related files, and past session memories to understand features, prepare for changes, and explore unfamiliar code.

Instructions

Get relevant codebase context optimized for prompt enhancement. This is the primary tool for understanding code and gathering context before making changes.

Returns:

  • File summaries and relevance scores

  • Smart-extracted code snippets (most relevant parts)

  • Related file suggestions for dependency awareness

  • Relevant memories from previous sessions (preferences, decisions, facts)

  • Token-aware output (respects context window limits)

Use this tool when you need to:

  • Understand how a feature is implemented

  • Find relevant code before making changes

  • Get context about a specific concept or pattern

  • Explore unfamiliar parts of the codebase

  • Recall user preferences and past decisions
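
For example, an MCP client invokes this tool through the protocol's standard tools/call request. The sketch below is illustrative: the request id and the query string are placeholder values, not part of the server's documented defaults.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_context_for_prompt",
    "arguments": {
      "query": "how user registration works"
    }
  }
}

Most MCP clients construct the surrounding request for you; in practice you only supply the arguments object, whose fields are described in the input schema below.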

Input Schema

  • query (required): Description of what you need context for (e.g., "authentication logic", "database schema", "how user registration works").

  • max_files (optional): Maximum number of files to include (default: 5, max: 20).

  • token_budget (optional): Maximum tokens for the entire context (default: 8000). Adjust based on your context window.

  • include_related (optional): Include related/imported files for better context (default: true).

  • min_relevance (optional): Minimum relevance score (0-1) to include a file (default: 0.3).

  • bypass_cache (optional): Bypass caches and force fresh retrieval; useful for benchmarking or debugging.
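
Putting the parameters together, an arguments object that raises the token budget and tightens relevance filtering might look like the following. All values here are illustrative, not recommendations from the server's authors.

{
  "query": "authentication logic",
  "max_files": 10,
  "token_budget": 12000,
  "include_related": true,
  "min_relevance": 0.5,
  "bypass_cache": false
}

Only query is required; any omitted field falls back to the default listed in the schema above.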

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Kirachon/context-engine'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.