@git-fabric/chat

Official server by git-fabric

Server Configuration

Environment variables used to configure the server. All but GITHUB_STATE_REPO are required.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| QDRANT_URL | Yes | Qdrant Cloud cluster URL | |
| GITHUB_TOKEN | Yes | GitHub PAT for state repo read/write | |
| OPENAI_API_KEY | Yes | OpenAI API key for text-embedding-3-small | |
| QDRANT_API_KEY | Yes | Qdrant Cloud API key | |
| ANTHROPIC_API_KEY | Yes | Anthropic API key for Claude completions | |
| GITHUB_STATE_REPO | No | State repo | ry-ops/git-steer-state |
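A startup check over these variables might look like the following sketch (the variable names come from the table above; the helper functions themselves are hypothetical, not part of the server's public API):

```python
import os

# Required variables from the table above; GITHUB_STATE_REPO has a default.
REQUIRED = [
    "QDRANT_URL",
    "GITHUB_TOKEN",
    "OPENAI_API_KEY",
    "QDRANT_API_KEY",
    "ANTHROPIC_API_KEY",
]

def missing_vars(env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

def state_repo(env=None):
    """Resolve the state repo, falling back to the documented default."""
    env = os.environ if env is None else env
    return env.get("GITHUB_STATE_REPO") or "ry-ops/git-steer-state"
```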

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|------------|---------|
| tools | {} |

Tools

Functions exposed to the LLM to take actions

chat_session_create

Create a new chat session with Claude. Optionally set a system prompt, project tag, model, and title. Returns the sessionId to use in subsequent calls.

chat_session_list

List recent chat sessions, filtered by project or state and capped by a limit. Sessions are sorted by most recently updated first.

chat_session_get

Get full session details including message history. Use this to inspect or resume a prior conversation.

chat_session_archive

Archive a session. Archived sessions are hidden from the default list but remain searchable and resumable.

chat_session_delete

Permanently delete a session and all its messages. This also removes vectors from Qdrant. Irreversible.

chat_message_send

Send a message in an existing session and get a Claude response. Reconstructs full conversation history for the API call. Stores both user message and assistant response. Returns the assistant reply with token usage.
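The create-then-send flow above could be driven from any MCP client. A sketch of the JSON-RPC `tools/call` payloads follows; the request envelope is the standard MCP shape, but the argument names are assumptions inferred from the tool descriptions, not a confirmed schema:

```python
import json

def tool_call(name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 `tools/call` request, the shape MCP clients send."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Hypothetical argument names inferred from the descriptions above.
create_req = tool_call(
    "chat_session_create",
    {"title": "Refactor plan", "project": "git-fabric"},
)
# chat_session_create returns a sessionId; "sess_123" is a placeholder.
send_req = tool_call(
    "chat_message_send",
    {"sessionId": "sess_123", "message": "Summarize the open work items."},
    request_id=2,
)
print(json.dumps(send_req, indent=2))
```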

chat_message_list

List messages in a session with pagination. Returns messages in chronological order.

chat_search

Semantic search over all stored conversation content using vector similarity. Finds messages relevant to the query even if exact words don't match. Optionally scope to a project or specific session.
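Conceptually, this kind of search ranks stored message embeddings by similarity to the query embedding; the server stores text-embedding-3-small vectors in Qdrant, but the idea can be shown with a minimal in-memory sketch (message shape and function names are illustrative):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_messages(query_vec, messages, top_k=3):
    """Return the top_k messages whose embeddings best match the query."""
    scored = sorted(
        messages,
        key=lambda m: cosine(query_vec, m["vector"]),
        reverse=True,
    )
    return scored[:top_k]
```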

chat_context_inject

Inject external context into a session before the next message send. Use this to pipe in Aiana memory recall, documentation snippets, or runtime state. The injected content is stored as a message and included in the next completion call.
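One plausible reading of "stored as a message and included in the next completion call" is appending the content to the session history; a sketch under that assumption (the role and the tagging convention are assumptions, not documented behavior):

```python
def inject_context(history, content, source="external"):
    """Append injected context as a message so the next completion sees it."""
    history.append({
        "role": "user",
        "content": f"[injected context: {source}]\n{content}",
    })
    return history
```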

chat_status

Return aggregate stats: total sessions, total messages, and tokens consumed today. Useful for quota monitoring and observability.

chat_health

Ping Anthropic and Qdrant services. Returns latency for each. Use to verify the app is operational before sending messages.

chat_thread_fork

Fork a session at a specific message to explore an alternative branch of conversation. Creates a new session with all history up to and including the fork point. The original session is unchanged.
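The fork semantics described above (copy history up to and including the fork point, leave the original untouched) can be sketched as follows; the message shape with an `id` field is an assumption:

```python
def fork_history(messages, fork_message_id):
    """Copy messages up to and including the fork point for the new session.

    Copies each message dict so edits to the fork never touch the original.
    """
    idx = next(i for i, m in enumerate(messages) if m["id"] == fork_message_id)
    return [dict(m) for m in messages[: idx + 1]]
```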

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/git-fabric/chat'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.