lldb_threads

List all threads with their IDs, names, execution points, and stop reasons to debug multithreaded C/C++ programs using LLDB.

Instructions

List all threads and their current state.

Shows:
- Thread IDs and names
- Current execution point for each thread
- Stop reason (if stopped)
- Optionally: backtrace for each thread

Args:
    params: ThreadsInput with executable and optional breakpoint, core file, and backtrace flag

Returns:
    str: Thread listing with state information

Input Schema

| Name   | Required | Description                                                                  | Default |
| ------ | -------- | ---------------------------------------------------------------------------- | ------- |
| params | Yes      | ThreadsInput object (`executable`, `breakpoint`, `core_file`, `show_backtrace`) |         |
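
For illustration, a tool call might pass a `params` object like the following (field names taken from the `ThreadsInput` model below; the paths are placeholders, not values from the source):

```json
{
  "params": {
    "executable": "./build/myapp",
    "breakpoint": "main",
    "show_backtrace": true
  }
}
```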

Implementation Reference

  • The main handler function that executes LLDB commands to examine threads in a target executable or core dump.
    async def lldb_threads(params: ThreadsInput) -> str:
        """List all threads and their current state.
    
        Shows:
        - Thread IDs and names
        - Current execution point for each thread
        - Stop reason (if stopped)
        - Optionally: backtrace for each thread
    
        Args:
            params: ThreadsInput with executable and optional breakpoint, core file, and backtrace flag
    
        Returns:
            str: Thread listing with state information
        """
        commands = []
    
        if params.core_file:
            commands.append(f"target create {params.executable} --core {params.core_file}")
        else:
            commands.append(f"target create {params.executable}")
            if params.breakpoint:
                commands.append(f"breakpoint set --name {params.breakpoint}")
                commands.append("run")
    
        commands.append("thread list")
    
        if params.show_backtrace:
            commands.append("thread backtrace all")
    
        if not params.core_file:
            commands.append("quit")
    
        result = _run_lldb_script(commands)
    
        return f"## Threads\n\n```\n{result['output'].strip()}\n```"
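  • As a sketch of what the handler assembles, the branching above can be exercised standalone; here `build_commands` reproduces the same logic and `SimpleNamespace` stands in for a validated `ThreadsInput` (both are illustrative names, not part of the source):

    ```python
    from types import SimpleNamespace

    def build_commands(params) -> list[str]:
        # Mirrors the branching in lldb_threads: core dumps attach via
        # --core and never run; live targets optionally set a breakpoint,
        # run to it, and quit when done.
        commands = []
        if params.core_file:
            commands.append(f"target create {params.executable} --core {params.core_file}")
        else:
            commands.append(f"target create {params.executable}")
            if params.breakpoint:
                commands.append(f"breakpoint set --name {params.breakpoint}")
                commands.append("run")
        commands.append("thread list")
        if params.show_backtrace:
            commands.append("thread backtrace all")
        if not params.core_file:
            commands.append("quit")
        return commands

    # Live run, stopping at main, with full backtraces:
    live = SimpleNamespace(executable="./app", breakpoint="main",
                           core_file=None, show_backtrace=True)
    print(build_commands(live))
    # -> ['target create ./app', 'breakpoint set --name main', 'run',
    #     'thread list', 'thread backtrace all', 'quit']
    ```

    Note that the core-dump branch emits neither `run` nor `quit`: the threads are frozen in the dump, so there is nothing to resume.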
  • Pydantic BaseModel defining the input parameters for the lldb_threads tool.
    class ThreadsInput(BaseModel):
        """Input for examining threads."""
    
        model_config = ConfigDict(str_strip_whitespace=True)
    
        executable: str = Field(..., description="Path to the executable", min_length=1)
        breakpoint: str | None = Field(default=None, description="Breakpoint location to stop at")
        core_file: str | None = Field(default=None, description="Path to core dump file")
        show_backtrace: bool = Field(default=False, description="Show backtrace for each thread")
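  • The model can be exercised directly; a quick sketch (assuming pydantic v2 is installed) of how `str_strip_whitespace` interacts with the `min_length=1` constraint on `executable`:

    ```python
    from pydantic import BaseModel, ConfigDict, Field, ValidationError

    class ThreadsInput(BaseModel):
        """Input for examining threads."""

        model_config = ConfigDict(str_strip_whitespace=True)

        executable: str = Field(..., description="Path to the executable", min_length=1)
        breakpoint: str | None = Field(default=None, description="Breakpoint location to stop at")
        core_file: str | None = Field(default=None, description="Path to core dump file")
        show_backtrace: bool = Field(default=False, description="Show backtrace for each thread")

    # Surrounding whitespace is stripped before validation.
    inp = ThreadsInput(executable="  ./build/app  ")
    print(inp.executable)        # -> ./build/app
    print(inp.show_backtrace)    # -> False

    # Because stripping runs first, an all-whitespace path collapses to ""
    # and fails the min_length=1 check.
    try:
        ThreadsInput(executable="   ")
    except ValidationError:
        print("rejected")        # -> rejected
    ```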
  • MCP decorator that registers the lldb_threads tool with the specified name and annotations.
    @mcp.tool(
        name="lldb_threads",
        annotations={
            "title": "Examine Threads",
            "readOnlyHint": True,
            "destructiveHint": False,
            "idempotentHint": True,
            "openWorldHint": False,
        },
    )
