@git-fabric/chat
Official, by git-fabric

chat_message_list

Retrieve and paginate through chat session messages in chronological order to review conversation history and maintain context.

Instructions

List messages in a session with pagination. Returns messages in chronological order.

Input Schema

Name       Required  Description                                   Default
sessionId  Yes       UUID of the session.                          —
limit      No        Maximum number of messages to return.         50
offset     No        Number of messages to skip (for pagination).  0
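As a minimal sketch of how these parameters combine, the arguments object below requests the third page of a session's history at a page size of 20. The session UUID is a made-up placeholder, and the `ListArgs` interface is illustrative, not part of the server's published types.

```typescript
// Hypothetical shape of the chat_message_list arguments, mirroring the
// input schema above. Field names match the schema; the interface name
// and the UUID value are assumptions for illustration.
interface ListArgs {
  sessionId: string; // required; UUID of the session
  limit?: number;    // defaults to 50 server-side when omitted
  offset?: number;   // defaults to 0 server-side when omitted
}

const args: ListArgs = {
  sessionId: "3f2b6a1e-9c4d-4b7a-8e21-0d5f6c7a8b90", // placeholder UUID
  limit: 20,  // return at most 20 messages
  offset: 40, // skip the first 40 (page 3 at page size 20)
};
```

Omitting `limit` and `offset` entirely returns the first 50 messages of the session.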

Implementation Reference

  • The listMessages handler function that wraps the adapter's getMessages method. This is the main handler called by the tool registration when executing chat_message_list.
    export async function listMessages(
      adapter: ChatAdapter,
      sessionId: string,
      limit = 50,
      offset = 0,
    ): Promise<ChatMessage[]> {
      return adapter.getMessages(sessionId, limit, offset);
    }
  • The actual getMessages implementation in the environment adapter. Reads messages from GitHub storage and applies pagination with offset and limit parameters.
    async getMessages(sessionId, limit, offset) {
      const msgs = await readMessages(githubToken, stateRepo, sessionId);
      if (!msgs) return [];
      return msgs.messages.slice(offset, offset + limit);
    },
  • src/app.ts:207-235 (registration)
    Tool registration for chat_message_list including its name, description, input schema (sessionId, limit, offset), and the execute function that calls layers.messages.listMessages.
      name: "chat_message_list",
      description:
        "List messages in a session with pagination. Returns messages in chronological order.",
      inputSchema: {
        type: "object",
        properties: {
          sessionId: {
            type: "string",
            description: "UUID of the session.",
          },
          limit: {
            type: "number",
            description: "Maximum number of messages to return. Default: 50.",
          },
          offset: {
            type: "number",
            description: "Number of messages to skip (for pagination). Default: 0.",
          },
        },
        required: ["sessionId"],
      },
      execute: async (args) =>
        layers.messages.listMessages(
          adapter,
          args.sessionId as string,
          args.limit as number | undefined,
          args.offset as number | undefined,
        ),
    },
  • ChatMessage interface defining the output schema returned by chat_message_list. Includes id, sessionId, role, content, model, inputTokens, outputTokens, timestamp, and metadata fields.
    export interface ChatMessage {
      id: string;             // UUID v4
      sessionId: string;
      role: "user" | "assistant" | "system";
      content: string;
      model?: ChatModel;      // set on assistant messages
      inputTokens?: number;   // set on assistant messages
      outputTokens?: number;  // set on assistant messages
      timestamp: string;      // ISO-8601
      metadata?: Record<string, unknown>;
    }
  • The readMessages helper function that retrieves the messages.jsonl file from GitHub and parses it into ChatMessage objects. Used by the getMessages adapter method.
    async function readMessages(
      token: string,
      stateRepo: string,
      sessionId: string,
    ): Promise<{ messages: ChatMessage[]; sha: string } | null> {
      const file = await ghGet(token, stateRepo, messagesPath(sessionId));
      if (!file) return null;
      const messages = file.content
        .split("\n")
        .filter((line) => line.trim() !== "")
        .map((line) => JSON.parse(line) as ChatMessage);
      return { messages, sha: file.sha };
    }
