Server Configuration

Environment variables that configure the server.

CODELLDB_PATH (optional): Path to an existing CodeLLDB installation. Required on macOS and Windows, because the published npm bundle ships only the Linux x64 CodeLLDB runtime.

SKIP_ADAPTER_VENDOR (optional): Set to 'true' in CI environments to skip vendoring debug adapters during installation.
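
A minimal sketch of setting these variables before launching the server. The CodeLLDB path shown is an assumption for illustration (a typical VS Code extension install); point it at your own installation.

```shell
# On macOS or Windows, point the server at a local CodeLLDB installation,
# since the npm bundle only ships the Linux x64 runtime.
# The path below is an example; substitute your own CodeLLDB location.
export CODELLDB_PATH="$HOME/.vscode/extensions/vadimcn.vscode-lldb/adapter/codelldb"

# In CI, skip vendoring debug adapters during installation.
export SKIP_ADAPTER_VENDOR=true
```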

Tools

Functions exposed to the LLM to take actions

create_debug_session: Create a new debugging session.

list_supported_languages: List all supported debugging languages with metadata.

list_debug_sessions: List all active debugging sessions.

set_breakpoint: Set a breakpoint. Setting breakpoints on non-executable lines (structural or declarative lines) may lead to unexpected behavior.

start_debugging: Start debugging a script.

close_debug_session: Close a debugging session.

step_over: Step over the current line.

step_into: Step into the current call.

step_out: Step out of the current function.

continue_execution: Continue execution.

pause_execution: Pause execution (not implemented).

get_variables: Get variables for a scope (scope is a variablesReference number).

get_local_variables: Get local variables for the current stack frame. A convenience tool that returns just the local variables without traversing stack -> scopes -> variables manually.

get_stack_trace: Get the stack trace.

get_scopes: Get scopes for a stack frame.

evaluate_expression: Evaluate an expression in the current debug context. Expressions can read and modify program state.

get_source_context: Get source context around a specific line in a file.
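
To illustrate how these tools chain together, here is a sketch of the JSON-RPC 2.0 "tools/call" messages an MCP client would send the server over stdio for a minimal workflow. The tool names come from the table above; the argument shapes (language, sessionId, file, line, scriptPath) are assumptions for illustration, so query the server's tools/list for the exact schemas.

```shell
# Each line is one JSON-RPC 2.0 request an MCP client would frame to the
# server's stdin. SESSION_ID stands in for the id returned by step 1.
create='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"create_debug_session","arguments":{"language":"python"}}}'
breakpoint='{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"set_breakpoint","arguments":{"sessionId":"SESSION_ID","file":"app.py","line":10}}}'
start='{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"start_debugging","arguments":{"sessionId":"SESSION_ID","scriptPath":"app.py"}}}'
get_locals='{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"get_local_variables","arguments":{"sessionId":"SESSION_ID"}}}'
printf '%s\n' "$create" "$breakpoint" "$start" "$get_locals"
```

Once execution stops at the breakpoint, the same pattern applies to get_stack_trace, get_scopes, and get_variables for manual traversal, or evaluate_expression for ad-hoc inspection.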

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/debugmcpdev/mcp-debugger'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.