# 🌿 Gitrama MCP Server
AI-powered Git intelligence for your IDE — smart commits, branch names, PR descriptions, diffs, code review, push, and workflow management.
## What is this?
Gitrama MCP Server exposes Gitrama's CLI as 15 MCP tools that any AI assistant can use. Instead of typing `gtr commit` in your terminal, your AI assistant calls the tool directly — analyzing your code changes, generating commit messages, suggesting branch names, reviewing code, and more.
Works with: Cursor · Claude Desktop · Claude Code · Windsurf · VS Code · Zed · any MCP-compatible client
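Under the hood, each of these tool invocations is an ordinary MCP `tools/call` JSON-RPC request. As an illustration (the empty `arguments` object is an assumption; Gitrama's actual input schema is documented per tool below), a client asking for a commit might send:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "gitrama_commit",
    "arguments": {}
  }
}
```

Your MCP client handles this plumbing for you; you only type the prompt.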
## What's new in v1.3.2
- **5 new tools** — `gitrama_scan`, `gitrama_diff`, `gitrama_review`, `gitrama_status`, and `gitrama_push`
- **Version surfaced from health check** — `gitrama_health` now returns the MCP server version so you can confirm exactly what's running
- **Interactive HTML diff panel** — `gitrama_diff` launches a browser panel with risk-annotated diffs, churn rates, coupling context, and contributor info overlaid on every changed file
- **Push from chat** — `gitrama_push` supports upstream tracking and force-with-lease, and auto-resolves the current branch
- **Tool count** updated from 10 → 15
## Install (< 60 seconds)
### Step 1: Install the package
```bash
pip install gitrama-mcp
```

Or with uv:

```bash
uv pip install gitrama-mcp
```

This installs both the MCP server and the `gitrama` CLI.
### Step 2: Connect to your IDE

**Cursor**

Add to `.cursor/mcp.json` in your project (or global settings):
```json
{
  "mcpServers": {
    "gitrama": {
      "command": "gitrama-mcp"
    }
  }
}
```

**Claude Desktop**

Add to `claude_desktop_config.json`:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "gitrama": {
      "command": "gitrama-mcp"
    }
  }
}
```

**Claude Code**

```bash
claude mcp add gitrama gitrama-mcp
```

**VS Code**

Add to `.vscode/mcp.json`:
```json
{
  "mcpServers": {
    "gitrama": {
      "command": "gitrama-mcp"
    }
  }
}
```

**Windsurf**

Add to `~/.codeium/windsurf/mcp_config.json`:
```json
{
  "mcpServers": {
    "gitrama": {
      "command": "gitrama-mcp"
    }
  }
}
```

**Zed**

Add to Zed settings (⌘,):
```json
{
  "context_servers": {
    "gitrama": {
      "command": {
        "path": "gitrama-mcp"
      }
    }
  }
}
```

### Step 3: Done.
Ask your AI: "Commit my staged changes" — and watch it call `gitrama_commit`.
## Tools (15)
### Health & Diagnostics

| Tool | Description |
|------|-------------|
| `gitrama_health` | Check AI server health and confirm MCP server version |
### Repository Intelligence

| Tool | Description |
|------|-------------|
| `gitrama_ask` | Ask any question about your repo — ownership, history, risk, changes |
| `gitrama_scan` | Full structural health scan — continuity risk, boundary entropy, recurrence patterns |
| `gitrama_status` | Show working tree status with AI interpretation |
### Code Review & Diff

| Tool | Description |
|------|-------------|
| `gitrama_diff` | Risk-annotated diff with interactive HTML panel — churn, coupling, contributor context |
| `gitrama_review` | AI code review before you push — security, correctness, risk, coupling |
### Commit Intelligence

| Tool | Description |
|------|-------------|
| `gitrama_commit` | Generate an AI commit message for staged changes |
| `gitrama_stage_and_commit` | Stage files + commit in one step |
| `gitrama_unstage` | Remove files from staging without discarding changes |
### Branch & Push Management

| Tool | Description |
|------|-------------|
| `gitrama_branch` | Create a branch from a natural language description |
| `gitrama_push` | Push current branch to remote with upstream and force-with-lease support |
### PR & Changelog

| Tool | Description |
|------|-------------|
| `gitrama_pr` | Generate a PR description from branch diff |
| `gitrama_changelog` | Generate a changelog between refs |
### Stream (Workflow) Management

| Tool | Description |
|------|-------------|
| | Show current workflow stream |
| | Switch to a different stream |
| | List all streams in the repo |
## Tool Details
### gitrama_health
Check AI server connectivity and confirm the running MCP server version.
Example prompt: "Run a gitrama health check"
Example output:

```
✅ AI server is healthy!
🤖 Model: grok-4.20-reasoning
🌐 Connected to: https://api.x.ai/v1
🔖 Gitrama MCP Server: v1.3.2
```

### gitrama_scan
Run a full structural health scan of the repository. Scores every file for continuity risk, boundary entropy, and recurrence patterns. Results are cached in `last_scan.json` for use by `gtr diff` and `gtr review`.
Example prompt: "Run a full gitrama scan of my repo"
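Because scan results are cached in `last_scan.json`, other tooling can reuse them without re-scanning. A minimal sketch, assuming a hypothetical file layout of `{"files": {path: {"risk": float}}}` (the real schema may differ):

```python
import json
from pathlib import Path


def riskiest_files(scan_path: str, top: int = 3) -> list[tuple[str, float]]:
    """Return the `top` files with the highest risk score from a cached scan.

    Assumes a hypothetical last_scan.json layout: {"files": {path: {"risk": float}}}.
    """
    data = json.loads(Path(scan_path).read_text())
    scored = [(path, info["risk"]) for path, info in data["files"].items()]
    # Sort descending by score and keep the worst offenders.
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top]
```

Passing the cache through a helper like this keeps repeated queries cheap: the scan runs once, and follow-up questions read the JSON.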
### gitrama_diff
Show a risk-annotated diff of current changes. Opens an interactive HTML browser panel with Gitrama's structural intelligence overlaid — risk scores, churn rates, coupling gaps, and contributor context for every changed file.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Branch or commit to diff against |
| | bool | | Diff staged changes only |
Example prompts:

- "Show me a diff of my staged changes"
- "Diff my branch against main"
### gitrama_review
Run an AI code review on current changes before committing. Returns severity-graded findings — security, correctness, risk, coupling — plus a verdict and suggested commit message.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | |
Example prompts:

- "Review my staged changes"
- "Do a full review of all my uncommitted changes"
### gitrama_push
Push the current branch to a remote repository. Uses --force-with-lease for safe force pushes and auto-resolves the current branch if none is specified.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Remote to push to |
| | string | | Branch to push (default: current branch) |
| | bool | | Force push with `--force-with-lease` |
| | bool | | Set upstream tracking branch |
Example prompts:

- "Push my changes"
- "Push this branch and set upstream"
- "Force push the current branch"
### gitrama_status
Show the working tree status with AI interpretation of staged, unstaged, and untracked files.
Example prompt: "What's my current git status?"
### gitrama_ask
Ask a natural language question about your repository. Gitrama analyzes commit history, file structure, blame data, and diffs to answer.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | required | Any question about your repo |
| | string | | Optional stream context override |
| | bool | | Enable full repo history access |
Example prompts:

- "Who owns the auth module?"
- "What's the riskiest file in this repo?"
- "What changed in the last 3 days?"
- "Explain the purpose of src/utils/retry.py"
### gitrama_commit
Generate an AI-powered commit message for staged changes.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Optional custom message (skips AI generation) |
Example prompt: "Commit my staged changes"
### gitrama_stage_and_commit
Stage files and commit in one step.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Files to stage |
| | string | | Optional custom message |
Example prompt: "Stage and commit all my changes"
### gitrama_unstage
Remove files from the staging area without discarding changes.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Space-separated file paths to unstage |
| | bool | | Unstage everything currently staged |
Example prompt: "Unstage src/auth.py"
### gitrama_branch
Generate an AI-powered branch name from a description and optionally create it.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | required | What you're working on |
| | bool | | Create and switch to the branch |
Example prompts:

- "Create a branch for adding OAuth2 support"
- "Suggest a branch name for fixing the payment timeout, don't create it"
### gitrama_pr
Generate a PR description from the diff between the current branch and base.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Target branch (default: main/master) |
Example prompt: "Write a PR description for this branch"
### gitrama_changelog
Generate a changelog between refs.
Parameters:

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| | string | | Start ref (tag, branch, hash) |
| | string | | End ref (default: HEAD) |
| | string | | |
Example prompt: "Generate a changelog since v1.1.4"
## Stream Tools

| Tool | Parameters | Example prompt |
|------|------------|----------------|
| | none | "What stream am I on?" |
| | | "Switch to the hotfix stream" |
| | none | "List all my gitrama streams" |
## The v1.3.2 Workflow
With all 15 tools connected, your full dev loop runs from chat:
1. describe intent → stream switch
2. write code
3. ask gitrama what changed → diff (HTML panel)
4. review before push
5. commit with AI message
6. push
7. PR description generated

No terminal. No manual git commands.
## Configuration
### Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| | | Working directory for git operations |
| `GTR_MCP_TRANSPORT` | | Transport: `stdio` or `streamable-http` |
| | | HTTP host (when using streamable-http) |
| `GTR_MCP_PORT` | | HTTP port (when using streamable-http) |
### HTTP Transport (for CI/CD)

```bash
GTR_MCP_TRANSPORT=streamable-http GTR_MCP_PORT=8765 gitrama-mcp
```

Then connect your client to `http://localhost:8765/mcp`.
## Requirements
- Python 3.10+
- Git installed and in PATH
- A Gitrama API key or local Ollama instance
Set your API key:
```bash
gtr config --key YOUR_API_KEY
```

Or use a local model:

```bash
gtr config --provider ollama --model llama3
```

## Development
```bash
git clone https://github.com/ahmaxdev/gitrama-mcp.git
cd gitrama-mcp
pip install -e ".[dev]"

# Test with MCP Inspector
mcp dev src/gitrama_mcp/server.py
```

## License
Proprietary — see LICENSE.
Built by Alfonso Harding · gitrama.ai
🌿