# @git-fabric/chat

Chat fabric app — AI conversation sessions, semantic history search, and context threading as a composable MCP layer.

Part of the git-fabric ecosystem.
## What it is
A fabric app for AI conversation management — create and manage chat sessions with Claude, persist conversation history to Qdrant Cloud (semantic search over past conversations), support multi-turn threading, and provide context injection from external memory sources (e.g. Aiana).
This is the "conversation plane" of the fabric. Consumers (cortex agents, Claude Desktop, Claude Code via git-steer) use these tools to interact with Claude across sessions.
## Tools

| Tool | Description |
| --- | --- |
|  | Create a new chat session with optional system prompt, project, model, and title |
|  | List recent sessions, filtered by project and state |
|  | Get full session with message history |
|  | Mark a session as archived |
|  | Permanently delete a session and all its messages |
|  | Send a message and get a Claude response (full multi-turn context) |
|  | List messages in a session with pagination |
|  | Semantic search over all stored conversation content |
|  | Inject external context (e.g. Aiana memory recall) into a session |
|  | Aggregate stats: total sessions, messages, tokens today |
|  | Ping Anthropic and Qdrant; returns latency for each |
|  | Fork a session at a message point to explore an alternative branch |
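As a rough illustration of how a consumer drives these tools, the sketch below builds an MCP `tools/call` request for session creation. The tool name `chat_create_session` and the argument field names are hypothetical (the table above describes the tools but does not list their exact identifiers or schemas); only the `tools/call` method shape comes from the MCP protocol itself.

```typescript
// Hypothetical argument schema for a "create session" tool; the real
// tool name and fields may differ from what this README documents.
interface CreateSessionArgs {
  title?: string;
  project?: string;
  model?: string; // e.g. "claude-sonnet-4-6", the documented default
  systemPrompt?: string;
}

// Build the MCP-style request body an MCP client would send for a tool call.
function buildToolCall(name: string, args: CreateSessionArgs) {
  return {
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Usage: create a titled session pinned to the default model.
const req = buildToolCall("chat_create_session", {
  title: "API migration notes",
  model: "claude-sonnet-4-6",
});
```

A gateway or MCP client library would normally construct this envelope for you; the sketch only shows the payload shape a caller ends up expressing.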
## Architecture

Follows the git-fabric layered pattern:

- Detection / Query → `layers/sessions.ts`, `layers/search.ts` (reads)
- Action → `layers/messages.ts`, `layers/sessions.ts` (effectful)
- Adapter → `adapters/env.ts` (Anthropic + OpenAI + Qdrant + GitHub)
- Surface → `app.ts` (FabricApp factory)

### State storage

- Sessions + messages → GitHub repo `ry-ops/git-steer-state` (same state repo as git-steer)
  - Session metadata: `chat/sessions/{sessionId}.json`
  - Message history: `chat/sessions/{sessionId}/messages.jsonl` (JSONL, one message per line)
  - Fast listing index: `chat/index.json`
- Semantic vectors → Qdrant Cloud collection `chat_fabric__messages__v1` (1536-dim, text-embedding-3-small)
- Completions → Anthropic API (claude-sonnet-4-6 default, configurable per session)
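The JSONL message log described above lends itself to append-only writes: one JSON object per line. A minimal sketch of the path layout and a JSONL round-trip, assuming an illustrative record shape (the actual message schema is not documented here):

```typescript
// Illustrative stored-message record; the real schema may carry more fields.
interface StoredMessage {
  role: "user" | "assistant";
  content: string;
  timestamp: string; // ISO 8601
}

// Path helpers mirroring the state-repo layout documented above.
const sessionMetaPath = (id: string) => `chat/sessions/${id}.json`;
const messagesPath = (id: string) => `chat/sessions/${id}/messages.jsonl`;

// Serialize messages to JSONL (one JSON object per line) ...
function toJsonl(messages: StoredMessage[]): string {
  return messages.map((m) => JSON.stringify(m)).join("\n");
}

// ... and parse them back, skipping blank lines.
function fromJsonl(text: string): StoredMessage[] {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as StoredMessage);
}
```

The appeal of JSONL here is that appending a turn is a single-line write to the GitHub-backed file, with no need to re-serialize the whole history.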
## Usage

### Via gateway (recommended)

```yaml
# gateway.yaml
apps:
  - name: "@git-fabric/chat"
    enabled: true
```

### Standalone MCP server

```bash
ANTHROPIC_API_KEY=sk-ant-... \
OPENAI_API_KEY=sk-... \
QDRANT_URL=https://your-cluster.qdrant.io \
QDRANT_API_KEY=... \
GITHUB_TOKEN=ghp_... \
npx @git-fabric/chat
```

### Programmatic

```ts
import { createApp } from "@git-fabric/chat";

const app = createApp();
// app.tools, app.health(), etc.
```

## Environment Variables
| Variable | Required | Description |
| --- | --- | --- |
| `ANTHROPIC_API_KEY` | Yes | Anthropic API key for Claude completions |
| `OPENAI_API_KEY` | Yes | OpenAI API key for text-embedding-3-small |
| `QDRANT_URL` | Yes | Qdrant Cloud cluster URL |
| `QDRANT_API_KEY` | Yes | Qdrant Cloud API key |
| `GITHUB_TOKEN` | Yes | GitHub PAT for state repo read/write |
|  | No | State repo (default: `ry-ops/git-steer-state`) |
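A host embedding this app might want to fail fast when required variables are absent. A minimal validation sketch using the variable names from the standalone example above (the package's own config loading may work differently):

```typescript
// The five required variables from the standalone invocation example.
const REQUIRED_VARS = [
  "ANTHROPIC_API_KEY",
  "OPENAI_API_KEY",
  "QDRANT_URL",
  "QDRANT_API_KEY",
  "GITHUB_TOKEN",
] as const;

// Return the names of required variables that are unset or empty,
// so the caller can report them all at once instead of one at a time.
function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Usage: check process.env before starting the server.
const missing = missingVars(process.env);
if (missing.length > 0) {
  console.error(`Missing required env vars: ${missing.join(", ")}`);
}
```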
## Models

| Model | ID |
| --- | --- |
| Claude Opus 4.6 |  |
| Claude Sonnet 4.6 (default) | `claude-sonnet-4-6` |
| Claude Haiku 4.5 |  |
## License

MIT