Glama
git-fabric

@git-fabric/chat

Official
by git-fabric

chat_context_inject

Add external context like memory recall, documentation, or runtime state to chat sessions before sending messages. Injects content as messages for AI completions.

Instructions

Inject external context into a session before the next message send. Use this to pipe in Aiana memory recall, documentation snippets, or runtime state. The injected content is stored as a message and included in the next completion call.

Input Schema

| Name | Required | Description | Default |
|---|---|---|---|
| sessionId | Yes | UUID of the session to inject context into. | |
| context | Yes | Context content to inject (markdown, JSON, plain text). | |
| role | No | Role for the injected context message. | system |
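As a concrete illustration, a call to chat_context_inject might carry arguments shaped like the following. This is a sketch matching the schema above; the session UUID and context payload are invented examples.

```typescript
// Hypothetical arguments for a chat_context_inject call, matching the
// input schema above. The sessionId value is a made-up example UUID.
const injectArgs = {
  // UUID of the session to inject context into (required)
  sessionId: "3f8a2c1e-9b47-4d6a-8e21-0c5f7d9a1b34",
  // Context content to inject: markdown, JSON, or plain text (required)
  context: "## Memory recall\n- User prefers concise answers\n- Project uses TypeScript",
  // Role for the injected message (optional; defaults to "system")
  role: "system" as const,
};
```

Omitting `role` entirely is equivalent to passing `"system"`, per the schema default.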

Implementation Reference

  • The injectContext function implements the core logic of the chat_context_inject tool. It creates a new message in the session with the provided context content and role (system or user), adds metadata tracking the injection timestamp, and returns the created message ID.
    export async function injectContext(
      adapter: ChatAdapter,
      sessionId: string,
      context: string,
      role: "system" | "user" = "system",
    ): Promise<{ messageId: string; sessionId: string; role: string }> {
      const msg = await adapter.addMessage({
        sessionId,
        role,
        content: context,
        metadata: { injected: true, injectedAt: new Date().toISOString() },
      });
      return { messageId: msg.id, sessionId, role };
    }
  • src/app.ts:275-305 (registration)
    Registration of the chat_context_inject tool in the tools array. Defines the tool's name, description, and input schema (sessionId, context, role), and delegates execution to the injectContext function from the layers.messages module.
    {
      name: "chat_context_inject",
      description:
        "Inject external context into a session before the next message send. Use this to pipe in Aiana memory recall, documentation snippets, or runtime state. The injected content is stored as a message and included in the next completion call.",
      inputSchema: {
        type: "object",
        properties: {
          sessionId: {
            type: "string",
            description: "UUID of the session to inject context into.",
          },
          context: {
            type: "string",
            description: "Context content to inject (markdown, JSON, plain text).",
          },
          role: {
            type: "string",
            enum: ["system", "user"],
            description: "Role for the injected context message. Default: system.",
          },
        },
        required: ["sessionId", "context"],
      },
      execute: async (args) =>
        layers.messages.injectContext(
          adapter,
          args.sessionId as string,
          args.context as string,
          args.role as "system" | "user" | undefined,
        ),
    },
  • The ChatMessage interface defines the data structure for messages, including the role field, which supports the 'system' and 'user' values used by the context injection feature. The metadata field allows storing injection-tracking information.
    export interface ChatMessage {
      id: string;             // UUID v4
      sessionId: string;
      role: "user" | "assistant" | "system";
      content: string;
      model?: ChatModel;      // set on assistant messages
      inputTokens?: number;   // set on assistant messages
      outputTokens?: number;  // set on assistant messages
      timestamp: string;      // ISO-8601
      metadata?: Record<string, unknown>;
    }
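The pieces above can be exercised end-to-end against an in-memory stand-in for ChatAdapter. This is a minimal sketch: the stub's addMessage shape is inferred from the injectContext call shown earlier, and the real adapter interface may carry additional fields and persistence concerns.

```typescript
// Minimal sketch: exercising the injectContext flow against an in-memory
// stand-in for ChatAdapter. The stub's addMessage shape is inferred from
// the injectContext call shown earlier; the real interface may differ.
import { randomUUID } from "node:crypto";

interface ChatMessage {
  id: string;
  sessionId: string;
  role: "user" | "assistant" | "system";
  content: string;
  timestamp: string;
  metadata?: Record<string, unknown>;
}

// In-memory store standing in for the adapter's backing database.
const store: ChatMessage[] = [];

const stubAdapter = {
  async addMessage(
    input: Omit<ChatMessage, "id" | "timestamp">,
  ): Promise<ChatMessage> {
    const msg: ChatMessage = {
      ...input,
      id: randomUUID(),
      timestamp: new Date().toISOString(),
    };
    store.push(msg);
    return msg;
  },
};

// Same logic as the injectContext function above, typed against the stub.
async function injectContext(
  adapter: typeof stubAdapter,
  sessionId: string,
  context: string,
  role: "system" | "user" = "system",
): Promise<{ messageId: string; sessionId: string; role: string }> {
  const msg = await adapter.addMessage({
    sessionId,
    role,
    content: context,
    metadata: { injected: true, injectedAt: new Date().toISOString() },
  });
  return { messageId: msg.id, sessionId, role };
}

// Inject a documentation snippet; role is omitted, so it defaults to "system".
const result = await injectContext(
  stubAdapter,
  "3f8a2c1e-9b47-4d6a-8e21-0c5f7d9a1b34",
  "Docs: call chat_context_inject before the next completion.",
);
```

The `metadata.injected` flag is what lets downstream code distinguish injected context from ordinary user turns when assembling the next completion call.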
