Legal Contract Review Agent
Enables OCR text extraction from scanned PDFs and images via Google Cloud Vision API.
Provides AI-powered contract analysis using OpenAI's language models for clause risk assessment and suggestion generation.
Stores orders, reports, and cost data, and supports pgvector-based RAG for retrieval of Japanese legal articles.
Collects product analytics and tracks user behavior to improve the application.
Caches analysis reports for 72 hours and implements rate limiting for API requests.
Sends transactional emails such as order confirmations and report notifications.
Monitors application errors and performance issues in production.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Legal Contract Review Agent Analyze this Japanese NDA for potential liability risks and suggest improvements."
That's it! The server will respond to your query, and you can continue using it as needed.
ContractGuard
Japanese contract risk analysis built as an AI engineering case study — LangGraph workflow + pgvector RAG + multi-modal ingestion + recoverable streaming UX.
⚠️ Not a legal service. This repository has never been operated commercially — Japan Attorney Act §72 (弁護士法第72条) reserves paid legal advice for licensed attorneys. The codebase is published as an open-source technical artifact only. Outputs are not legal opinions.
Status
Production-ready open-source reference implementation. The full stack — frontend, backend, OCR, payment, email, Postgres, Redis, error tracking — is wired with real integrations and ready to deploy. It has simply never been launched, by design (Attorney Act §72).
A synthetic Japanese contract sits in docs/samples/ so the local flow can be exercised end-to-end immediately after clone.
Architecture
```mermaid
flowchart LR
    U[React/Vite UI<br/>text, PDF, image upload] --> API[FastAPI routers]
    API --> Q[Quote + PII + OCR budget guards]
    Q --> PAY[KOMOJU checkout<br/>reference implementation]
    PAY --> JOB[Persistent analysis job]
    JOB --> SSE[Recoverable SSE stream<br/>status + events + after_seq]
    JOB --> LG[LangGraph pipeline]
    LG --> P[parse_contract]
    P --> A[clause-by-clause risk analysis]
    A --> T[tool call: analyze_clause_risk]
    T --> RAG[(PostgreSQL pgvector<br/>331 Japanese legal articles)]
    A --> S[tool call: generate_suggestion<br/>medium/high risks only]
    S --> REP[report generation + translation]
    REP --> CACHE[(Redis 72h report cache)]
    REP --> DB[(PostgreSQL orders/reports/costs)]
```

Tech Stack
| Layer | Stack |
|---|---|
| Frontend | React, Vite, TypeScript, i18next (9 languages) |
| Backend | FastAPI, SQLAlchemy async, Alembic, APScheduler |
| AI workflow | LangGraph + OpenAI tool calling, MCP server |
| RAG | PostgreSQL + pgvector |
| OCR | Google Cloud Vision |
| Storage | PostgreSQL (orders / reports / events), Redis (72h cache + rate limiting) |
| Payment | KOMOJU checkout |
| Email | Resend |
| Observability | Sentry + PostHog |
| Infra | Docker Compose (local), Fly.io + Vercel (deployment reference) |
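The retrieval step behind the RAG layer can be illustrated without a database: pgvector ranks stored embeddings by distance to the query embedding, which is equivalent to the in-memory cosine ranking sketched below. The toy 3-dimensional vectors stand in for real model embeddings, and the article entries are examples, not the actual 331-article corpus.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity, the metric behind pgvector's cosine-distance operator."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k_articles(query_vec, articles, k=3):
    """Rank stored legal-article embeddings against the query embedding."""
    scored = [(cosine_similarity(query_vec, vec), text) for text, vec in articles]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for _, text in scored[:k]]

# Toy embeddings standing in for real model output.
articles = [
    ("民法第415条 (damages for non-performance)", [0.9, 0.1, 0.0]),
    ("民法第90条 (public policy)", [0.1, 0.9, 0.0]),
    ("労働基準法第20条 (notice of dismissal)", [0.0, 0.2, 0.9]),
]
print(top_k_articles([0.8, 0.2, 0.0], articles, k=1))  # highest-similarity article first
```

In the real pipeline this ranking happens inside PostgreSQL via a pgvector index, so only the top-k article texts travel back to the tool call.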
Quick Start (local)
A local run requires only an OpenAI API key.

```bash
cp .env.example .env
# Edit .env: set OPENAI_API_KEY
docker compose up --build
```

Then open http://localhost:5173 and upload `docs/samples/sample-contract-ja.txt`.
In this minimal mode:
✅ Plain-text contracts and text-based PDFs (selectable text) work end-to-end.
❌ Image / scanned-PDF OCR is disabled. To enable it, add `GOOGLE_APPLICATION_CREDENTIALS_JSON` and `GOOGLE_VISION_PROJECT_ID`.

KOMOJU / Resend auto-bypass in dev — no real payment, no real email.
Production Setup
The repository is shaped to deploy to production by setting APP_ENV=production and supplying credentials for each external service:
| Service | Required env vars |
|---|---|
| OpenAI | `OPENAI_API_KEY` |
| Google Cloud Vision (OCR) | `GOOGLE_APPLICATION_CREDENTIALS_JSON`, `GOOGLE_VISION_PROJECT_ID` |
| KOMOJU (payment) | |
| Resend (email) | |
| Sentry | |
| PostHog | |
| Database / Cache | |
| App | `APP_ENV`, `FRONTEND_URL` |
When APP_ENV=production, the app refuses to boot if any of the above is missing or FRONTEND_URL still points at localhost. Strict-validation logic lives in backend/config.py (validate_runtime()).
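The boot-time guard can be sketched as follows. This is an illustrative reconstruction, not the code in `backend/config.py`: the required-variable list is abbreviated, and the real `validate_runtime()` may differ in shape.

```python
# Abbreviated for illustration; the real list covers every external service.
REQUIRED_PRODUCTION_VARS = [
    "OPENAI_API_KEY",
    "GOOGLE_APPLICATION_CREDENTIALS_JSON",
    "GOOGLE_VISION_PROJECT_ID",
    "FRONTEND_URL",
]

def validate_runtime(env: dict) -> list[str]:
    """Return fatal config errors; a production boot aborts if any are found."""
    if env.get("APP_ENV") != "production":
        return []  # dev mode: missing integrations are bypassed, not fatal
    errors = [f"missing {name}" for name in REQUIRED_PRODUCTION_VARS if not env.get(name)]
    if "localhost" in env.get("FRONTEND_URL", ""):
        errors.append("FRONTEND_URL still points at localhost")
    return errors
```

Failing fast at startup keeps a half-configured production deploy from silently falling back to the dev bypasses.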
fly.toml and vercel.json describe the deployment topology used during development. The service is not currently hosted.
Flow
1. Upload a contract (text, PDF, or image). The upload route runs text extraction, PII checks, token estimation, non-contract detection, and OCR budget guards.
2. The checkout reference path creates an order. Empty KOMOJU credentials trigger a local bypass in dev.
3. `/review/:orderId` starts or resumes the persistent analysis job and streams progress events that survive page refresh.
4. LangGraph parses clauses, analyzes each clause with RAG-grounded tool calls, and generates suggestions only where the risk warrants it.
5. `/report/:orderId` shows the saved report, clause excerpts, risk filters, and PDF export — retained for 72 hours.

User contract text is deleted after analysis. The vector store contains only public e-Gov statutes; user contracts are never embedded.
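The recoverable stream above can be sketched as an append-only event log keyed by a sequence number: a client that reconnects sends the last `after_seq` it saw and replays only the newer events. Names here are illustrative; the real event sourcing lives in `backend/services/analysis_executor.py`.

```python
from dataclasses import dataclass, field

@dataclass
class AnalysisEventLog:
    """Append-only log of analysis progress events for one job."""
    events: list[tuple[int, str]] = field(default_factory=list)

    def append(self, payload: str) -> int:
        """Record an event under the next sequence number and return it."""
        seq = len(self.events) + 1
        self.events.append((seq, payload))
        return seq

    def replay(self, after_seq: int = 0) -> list[tuple[int, str]]:
        """Return events newer than after_seq: the resume path after a page refresh."""
        return [(s, p) for s, p in self.events if s > after_seq]

log = AnalysisEventLog()
log.append("clause_parsed")
log.append("risk:high clause 7")
log.append("report_ready")
print(log.replay(after_seq=1))  # a client that saw seq 1 gets only the later events
```

Because the log is persistent, a refreshed page never loses progress: the SSE endpoint just replays from the client's last acknowledged sequence number.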
Demo

Repository Map
- `backend/agent/graph.py` — LangGraph pipeline.
- `backend/agent/tools.py` — RAG-grounded tool calls.
- `backend/services/analysis_executor.py` — persistent analysis job + event sourcing.
- `backend/rag/store.py` — pgvector storage and search.
- `backend/config.py` — runtime configuration and strict validation.
- `frontend/src/pages/ReviewPage.tsx` — recoverable analysis progress UI.
- `frontend/src/pages/ReportPage.tsx` — report UI with risk filters and PDF export.
- `tests/` — backend pytest suites.
- `scripts/smoke_local_flow.sh` — end-to-end local smoke test.