Server Configuration

Describes the environment variables used to run the server.

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| PYTHONUNBUFFERED | No | Ensures Python output is sent straight to the terminal without being buffered | 1 |
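
A minimal sketch of passing this variable when launching the server from a client process. The `python server.py` entry point is a hypothetical placeholder, not the documented launch command:

```python
import os
import subprocess

# Hypothetical entry point; substitute the actual command used to start Debug-MCP.
server_cmd = ["python", "server.py"]

# Merge PYTHONUNBUFFERED into the inherited environment so the server's output
# reaches the client immediately instead of sitting in a stdio buffer.
env = {**os.environ, "PYTHONUNBUFFERED": "1"}

subprocess.Popen(server_cmd, env=env)
```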

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| ---- | ----------- |
| sessions_create | Create a new debug session for a Python script |
| sessions_breakpoint | Run to a breakpoint and capture local variables |
| sessions_continue | Continue execution to the next breakpoint |
| sessions_state | Get the current state of a debug session |
| sessions_end | End a debug session and clean up resources |
| sessions_step_in | Step into the next function call (requires active breakpoint) |
| sessions_step_over | Step over the current line (requires active breakpoint) |
| sessions_step_out | Step out of the current function (requires active breakpoint) |
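
These tools map onto a conventional debugging loop: create a session, run to a breakpoint, inspect state, step or continue, and end the session. The sketch below shows one possible sequence using the official MCP Python SDK; the argument names (`script_path`, `file`, `line`) are assumptions, not the server's documented schema, and the launch command is a placeholder:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical launch command; substitute the actual way Debug-MCP is started.
    params = StdioServerParameters(command="python", args=["server.py"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Argument names below are assumptions, not the server's documented schema.
            await session.call_tool("sessions_create", arguments={"script_path": "app.py"})

            # Run to a breakpoint and capture local variables.
            await session.call_tool(
                "sessions_breakpoint", arguments={"file": "app.py", "line": 42}
            )

            # Inspect the paused frame, then step over the current line.
            state = await session.call_tool("sessions_state", arguments={})
            print(state.content)
            await session.call_tool("sessions_step_over", arguments={})

            # Continue to the next breakpoint, then clean up.
            await session.call_tool("sessions_continue", arguments={})
            await session.call_tool("sessions_end", arguments={})


asyncio.run(main())
```

Note that the stepping tools require an active breakpoint, so they are only meaningful after sessions_breakpoint has paused execution.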

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Kaina3/Debug-MCP'
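
The same lookup can be scripted with the Python standard library. The shape of the JSON payload is not documented here, so this sketch simply prints whatever the API returns:

```python
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/Kaina3/Debug-MCP"

# Fetch the directory entry for this server and print the raw JSON response.
with urllib.request.urlopen(url) as response:
    data = json.load(response)

print(json.dumps(data, indent=2))
```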

If you have feedback or need assistance with the MCP directory API, please join our Discord server.