
Server Configuration

Describes the environment variables used to configure the server; all are optional.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| graph | No | Path to a pre-built tool graph JSON file (used with the `serve` command). | |
| config | No | Path to a `backends.json` configuration file for aggregating multiple MCP servers (used with the `proxy` command). | |
| source | No | URL or path to an OpenAPI/Swagger spec or MCP server tool list to ingest (used with the `serve` command). | |
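For the `proxy` command, this page references a `backends.json` file but does not document its schema. The sketch below constructs a hypothetical one purely to illustrate the idea of aggregating multiple backends; every field name in it is an assumption, not the documented format:

```python
import json

# Hypothetical backends.json for the 'proxy' command. The schema is not
# documented on this page, so every field name below is an assumption
# made for illustration only.
backends = {
    "backends": [
        {"name": "github", "source": "https://api.github.com/openapi.json"},
        {"name": "internal", "source": "./internal-openapi.yaml"},
    ]
}

config_text = json.dumps(backends, indent=2)
print(config_text)
```

Check the project's own README for the real schema before writing this file.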

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|------------|---------|
| tools | `{"listChanged": false}` |
| prompts | `{"listChanged": false}` |
| resources | `{"subscribe": false, "listChanged": false}` |
| experimental | `{}` |

Tools

Functions exposed to the LLM to take actions

search_tools

Search for relevant tools by natural language query.

    Returns the most relevant tools for the given query, ranked by
    graph-based hybrid retrieval (BM25 + graph traversal + embedding).
    Previously called tools are automatically deprioritized to surface
    new candidates on repeated searches.

    Args:
        query: Natural language description of what you want to do.
               Examples: "user authentication", "delete a file",
               "manage shopping cart items"
        top_k: Maximum number of tools to return (default: 5)
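MCP tool calls travel as JSON-RPC messages; below is a minimal sketch of the `tools/call` payload a client might send for search_tools, with transport framing omitted (the query string is one of the docstring's own examples):

```python
import json

# Minimal JSON-RPC payload for invoking search_tools over MCP.
# Transport framing (stdio/SSE) is omitted for brevity.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_tools",
        "arguments": {"query": "manage shopping cart items", "top_k": 5},
    },
}

print(json.dumps(request, indent=2))
```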
    
get_tool_schema

Get the full schema of a specific tool by name.

    Use this after search_tools() to get complete parameter details
    for a tool you want to call.

    Args:
        name: Exact tool name (as returned by search_tools)
    
list_categories

List all tool categories in the graph.

Returns categories with their tool counts, useful for understanding the available tool landscape before searching.

graph_info

Show summary statistics about the loaded tool graph.

Returns tool count, node count, edge count, and category breakdown.

execute_tool

Execute an OpenAPI tool via HTTP.

    Sends the actual HTTP request based on the tool's method and path
    from the OpenAPI spec. Use after search_tools() + get_tool_schema()
    to call the API.

    Args:
        tool_name: Exact tool name (as returned by search_tools)
        arguments: JSON string of parameter values (e.g. '{"owner":"me","repo":"test"}')
        base_url: API base URL (e.g. https://api.github.com). Required if not inferrable.
        auth_token: Bearer token for authentication (optional)
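Note that `arguments` here is a JSON string, not a nested object. A small sketch of assembling the call arguments, where the tool name and base URL are illustrative assumptions rather than real values:

```python
import json

# execute_tool takes `arguments` as a JSON *string*, not a nested object.
# "repos_get" is a hypothetical tool name; the base URL is illustrative.
call_args = {
    "tool_name": "repos_get",
    "arguments": json.dumps({"owner": "me", "repo": "test"}),
    "base_url": "https://api.github.com",
    "auth_token": "YOUR_TOKEN",  # optional Bearer token
}

assert isinstance(call_args["arguments"], str)
```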
    
load_source

Load additional tools from an OpenAPI spec URL or file path.

    Supports:
    - Direct spec URLs (JSON/YAML): https://api.example.com/openapi.json
    - Swagger UI URLs: https://api.example.com/swagger-ui/index.html
    - Local file paths: ./openapi.json, /path/to/spec.yaml

    Args:
        source: OpenAPI spec URL or local file path
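Like the other tools, load_source is invoked through a standard `tools/call` request; a sketch with an illustrative (not real) spec URL:

```python
# Sketch of a tools/call payload for load_source. The spec URL is
# illustrative only.
payload = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "load_source",
        "arguments": {"source": "https://api.example.com/openapi.json"},
    },
}
```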
    

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/SonAIengine/graph-tool-call'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.