
Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Capabilities

Features and capabilities supported by this server

Capability | Details
tools
{
  "listChanged": false
}
prompts
{
  "listChanged": false
}
resources
{
  "subscribe": false,
  "listChanged": false
}
experimental
{}
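Per the MCP specification, these flags are returned in the `capabilities` field of the server's `initialize` result. A minimal sketch of how a client might inspect them (the dict simply mirrors the table above):

```python
# Capabilities as advertised by this server (mirrors the table above).
capabilities = {
    "tools": {"listChanged": False},
    "prompts": {"listChanged": False},
    "resources": {"subscribe": False, "listChanged": False},
    "experimental": {},
}

def supports_list_changed(caps: dict, feature: str) -> bool:
    """Return True if the server emits list-changed notifications for a feature."""
    return bool(caps.get(feature, {}).get("listChanged", False))

# Neither tools nor resources announce list-changed notifications here.
print(supports_list_changed(capabilities, "tools"))      # False
print(supports_list_changed(capabilities, "resources"))  # False
```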

Tools

Functions exposed to the LLM to take actions

Name | Description

start_project | Initialize ty for a Python project. Must be called first.

search_symbol | Search symbols (classes, functions, variables) across the project.

list_file_symbols | List all symbols defined in a file.

read_code | Read file content, optionally by line range (1-based).

read_context | Read code around a specific line with context.

stop_project | Stop ty and release resources.

get_definition | Go to definition of symbol at position (1-based).

find_usages | Find all references to symbol at position (1-based).

get_type_info | Get type information for symbol at position (1-based).

get_diagnostics | Get type errors and warnings for a file.

get_completions | Get code completion suggestions at position (1-based).

analyze_file | Analyze a Python file: get structure and diagnostics summary.

safe_rename | Rename symbol across project. Set apply=True to execute.

get_code_actions | Get available quick fixes and refactorings at position (1-based).

apply_code_action | Apply a code action by index (from get_code_actions).

get_edit_preview | Preview changes a code action would make.
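A typical session calls start_project first, then queries other tools such as get_diagnostics. The JSON-RPC envelope below follows the standard MCP `tools/call` shape; the argument names (`project_path`, `file_path`) are assumptions for illustration, not documented parameters:

```python
import json

def tools_call(request_id: int, name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 tools/call request as defined by the MCP spec."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# start_project must be called first; the argument names are hypothetical.
start = tools_call(1, "start_project", {"project_path": "/path/to/project"})
diags = tools_call(2, "get_diagnostics", {"file_path": "src/app.py"})

print(start)
print(diags)
```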

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/qinsehm1128/mcp-ty'
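The same lookup from Python, using only the standard library (the endpoint is the one shown in the curl command above; the response shape is not documented here, so it is parsed as generic JSON):

```python
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/qinsehm1128/mcp-ty"

def fetch_server_info(url: str = API_URL) -> dict:
    """GET the server record from the Glama MCP directory API."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    info = fetch_server_info()
    print(json.dumps(info, indent=2))
```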

If you have feedback or need assistance with the MCP directory API, please join our Discord server.