
Server Configuration

Describes the environment variables used to configure the server; all of them are optional.

Name                         Required  Description                       Default
SMART_CODING_VERBOSE         No        Enable detailed logging           false
SMART_CODING_BATCH_SIZE      No        Files to process in parallel      100
SMART_CODING_CHUNK_SIZE      No        Lines of code per chunk           15
SMART_CODING_MAX_RESULTS     No        Max search results                5
SMART_CODING_MAX_FILE_SIZE   No        Max file size in bytes (1 MB)     1048576
SMART_CODING_SMART_INDEXING  No        Enable smart project detection    true
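
For illustration, here is a minimal sketch (TypeScript, using the @modelcontextprotocol/sdk client) that passes these variables to the server process over stdio. The launch command and arguments ("node dist/index.js") and the client name are assumptions, not part of this listing; substitute however you normally start smart-coding-mcp. The values shown are simply the defaults from the table above.

  import { Client } from "@modelcontextprotocol/sdk/client/index.js";
  import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

  // Launch command and args are placeholder assumptions; adjust to your setup.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
    env: {
      SMART_CODING_VERBOSE: "true",          // enable detailed logging
      SMART_CODING_BATCH_SIZE: "100",        // files processed in parallel
      SMART_CODING_CHUNK_SIZE: "15",         // lines of code per chunk
      SMART_CODING_MAX_RESULTS: "5",         // max search results
      SMART_CODING_MAX_FILE_SIZE: "1048576", // max file size in bytes (1 MB)
      SMART_CODING_SMART_INDEXING: "true",   // smart project detection
    },
  });

  const client = new Client({ name: "example-client", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);
  console.log(await client.listTools()); // should list the tools documented below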

Capabilities

Features and capabilities supported by this server

Capability   Details
tools        {}

Tools

Functions exposed to the LLM to take actions

a_semantic_search

Performs intelligent hybrid code search combining semantic understanding with exact text matching. Ideal for finding code by meaning (e.g., 'authentication logic', 'database queries') even with typos or variations. Returns the most relevant code snippets with file locations and line numbers.
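
A hedged call sketch, reusing the connected client from the configuration example above. The argument name "query" is an assumption about the tool's input schema, not documented here; check client.listTools() for the exact shape.

  // `client` is the connected Client from the configuration sketch above.
  // The argument name "query" is an assumption, not taken from this listing.
  const search = await client.callTool({
    name: "a_semantic_search",
    arguments: { query: "authentication logic" },
  });
  console.log(search.content); // relevant snippets with file paths and line numbers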

b_index_codebase

Manually trigger a full reindex of the codebase. This will scan all files and update the embeddings cache. Useful after large code changes or if the index seems out of date.

c_clear_cache

Clears the embeddings cache, forcing a complete reindex on next search or manual index operation. Useful when encountering cache corruption or after major codebase changes.
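
Both maintenance tools read as taking no arguments; a brief sketch with the same client, under that assumption:

  // Assumed to take no arguments; reusing the connected `client` from above.
  await client.callTool({ name: "c_clear_cache", arguments: {} });    // drop the embeddings cache
  await client.callTool({ name: "b_index_codebase", arguments: {} }); // rebuild it with a full scan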

d_check_last_version

Get the latest version of a library/package from its official registry. Supported ecosystems: npm (JS/TS), PyPI (Python), Packagist (PHP), Crates.io (Rust), Maven (Java/Kotlin), Go, RubyGems, NuGet (.NET), Hex (Elixir), CRAN (R), CPAN (Perl), pub.dev (Dart), Homebrew (macOS), Conda (Python/R), Clojars (Clojure), Hackage (Haskell), Julia, Swift PM, Chocolatey (Windows). Returns the version string to help you avoid using outdated dependencies.
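
For example, a lookup sketch; the argument names "name" and "ecosystem" are guesses at the input schema and may differ from the actual tool definition.

  // Argument names below are assumptions, not confirmed by this listing.
  const latest = await client.callTool({
    name: "d_check_last_version",
    arguments: { name: "react", ecosystem: "npm" },
  });
  console.log(latest.content); // the latest published version string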

e_set_workspace

Change the project workspace path at runtime. Use this when you detect the current workspace is incorrect or you need to switch to a different project directory. Creates cache folder automatically and optionally re-indexes the new workspace.
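
A sketch of switching projects at runtime; the argument names "path" and "reindex" are assumptions based on the description above.

  // Argument names are assumptions based on the tool description above.
  await client.callTool({
    name: "e_set_workspace",
    arguments: { path: "/home/user/projects/other-app", reindex: true },
  });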

f_get_status

Get comprehensive status information about the Smart Coding MCP server. Returns version, workspace path, model configuration, indexing status, and cache information. Useful for understanding the current state of the semantic search system.
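
A status check, assumed to take no arguments:

  // Assumed to take no arguments; reusing the connected `client`.
  const status = await client.callTool({ name: "f_get_status", arguments: {} });
  console.log(status.content); // version, workspace path, model, indexing and cache state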

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/omar-haris/smart-coding-mcp'
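
The same request in TypeScript, assuming only that the endpoint returns JSON:

  // The response shape is not documented here, so it is printed as-is.
  const res = await fetch("https://glama.ai/api/mcp/v1/servers/omar-haris/smart-coding-mcp");
  console.log(await res.json());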

If you have feedback or need assistance with the MCP directory API, please join our Discord server.