search
by rodhayl

Search local files and code with four modes: intelligent LLM ranking, structured symbol-aware queries, context gathering, or filename matching.

Instructions

Unified search (intelligent|structured|gather|filenames). root defaults to "." when omitted.
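
As an illustration, a minimal invocation in MCP tool-call argument JSON might look like the following (the argument shape follows the input schema below; the exact request envelope depends on your MCP client). Since `root` is omitted, it defaults to `"."`:

```json
{
  "action": "intelligent",
  "query": "where is the retry logic implemented?"
}
```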

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| `action` | Yes | Action: `intelligent` (LLM ranking), `structured` (symbol-aware), `gather` (collect relevant files), or `filenames` (find files/dirs by name/path) | — |
| `query` | Yes | Search query or context description | — |
| `root` | No | Root directory to search. Use `"."` explicitly or a subdirectory path such as `"src/"` to narrow scope. | `"."` (workspace root) |
| `filePattern` | No | Filter by file glob pattern (`intelligent` action) | — |
| `targetType` | No | Filter by symbol type (`structured` action) | — |
| `languages` | No | Languages to search (`structured` action) | — |
| `path` | No | Path for context gathering (`gather` action) | — |
| `scope` | No | Context scope (`gather` action) | — |
| `strategy` | No | Context strategy (`gather` action) | — |
| `maxFiles` | No | Maximum number of files to analyze (`gather` action) | — |
| `maxResults` | No | Maximum number of results | `20` |
| `includeHidden` | No | Include hidden files/directories (`filenames` action) | `false` |
| `includeDirectories` | No | Include directory matches in results (`filenames` action) | `false` |
| `includePatterns` | No | File glob patterns to include (e.g., `["*.py", "src/**"]`); if set, only matching files are searched | — |
| `excludePatterns` | No | File glob patterns to exclude (e.g., `["venv/**", "node_modules/**"]`); smart defaults apply when unspecified | — |
| `format` | No | Output format: `compact` (paths only), `dense` (minimal), `detailed` (full), or `json` (raw) | — |
| `deterministic` | No | If `true`, disable LLM ranking and return exhaustive exact-match results (slower but complete) | — |
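
For example, a structured, deterministic search narrowed to Python sources under `src/` could combine several of these parameters. This is an illustrative sketch: the parameter names come from the schema above, but the specific values (such as `"class"` for `targetType`) are assumptions, not documented enumerations:

```json
{
  "action": "structured",
  "query": "HttpClient",
  "root": "src/",
  "targetType": "class",
  "languages": ["python"],
  "includePatterns": ["*.py"],
  "maxResults": 10,
  "deterministic": true,
  "format": "detailed"
}
```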

MCP directory API

All information about MCP servers is available via our MCP directory API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/rodhayl/mcpLocalHelper'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.