# Bi-Temporal Knowledge Graph MCP Server
A production-ready MCP (Model Context Protocol) server that gives your AI agents persistent memory with full temporal tracking. Save facts, extract entities using AI, and query historical data with time-travel capabilities.
**Build intelligent AI agents with persistent memory that understands time and context**
## Architecture
This server uses a **single-file "Database-Blind" architecture**:
- **main.py** - Everything in one file: FalkorDB driver, session management, entity extraction, memory tools, and your custom automation tools
**Structure:**
1. Configuration & Database Driver
2. Session Store & Entity Extractor
3. Graphiti Memory Core
4. Core MCP Memory Tools
5. **CUSTOM AUTOMATION TOOLS** section (add your webhook tools here!)
6. Server Startup
**Note:** This server focuses solely on memory operations. For advanced workflow orchestration, see the optional [Automation Engine OS](#-optional-automation-engine-os) section.
---
## ⭐ Star This Repo
If you find this project useful, please give it a star! It helps others discover the project and motivates continued development.

---
## 🔗 Links
- 🚀 **[Get Started](#-quick-start)** - Ready in 5 minutes
- 🎥 **[Video Tutorial](YOUR_YOUTUBE_VIDEO_LINK)** - Watch how to set it up
- ❓ **[FAQs](#-frequently-asked-questions)** - Common questions answered
- 🐛 **[Report Bugs](https://github.com/YOUR_USERNAME/bitemporal-mcp-server/issues)** - Found an issue?
- 💡 **[Request Features](https://github.com/YOUR_USERNAME/bitemporal-mcp-server/issues)** - Have an idea?
### Resources
- 💬 **[Community](https://www.skool.com/knowledge-engineering-hub-9993)** - High Ticket AI Builders community
- 📖 **[Full Documentation](./README.md)** - Complete guide
- 🚀 **[Deployment Guide](./DEPLOYMENT.md)** - Deploy anywhere
- 🧪 **[Examples](./examples.py)** - Interactive scenarios
---
## 📋 Table of Contents
- [Features](#-features)
- [How It Works](#-how-it-works)
- [Screenshots](#-screenshots)
- [Video Tutorial](#-video-tutorial)
- [Quick Start](#-quick-start)
- [Adding Custom Tools](#-adding-custom-automation-tools)
- [API Reference - Memory Tools](#-api-reference---memory-tools)
- [Use Cases](#-use-cases)
- [FAQ](#-frequently-asked-questions)
- [Changelog](#-changelog)
- [Support](#-support)
- [Optional: Automation Engine OS](#-optional-automation-engine-os)
- [License](#-license)
---
## ✨ Features
### 🧠 Bi-Temporal Knowledge Graph
- **Smart Memory**: Automatically tracks when facts were created AND when they became true in reality
- **Conflict Resolution**: When you move locations or change jobs, old facts are automatically invalidated
- **Time Travel Queries**: Ask "Where did John live in March 2024?" and get accurate historical answers
- **Session Tracking**: Maintains context across conversations with automatic cleanup
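
Conceptually, each fact carries both a system timestamp and a real-world validity window. A minimal sketch of that record shape and its validity check (illustrative only; these are not the actual types in `main.py`, just the fields the tool parameters imply):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class TemporalFact:
    """One bi-temporal fact: system time plus real-world validity window."""
    source: str
    relation: str
    target: str
    created_at: datetime                   # when the system recorded the fact
    valid_at: datetime                     # when the fact became true in reality
    invalid_at: Optional[datetime] = None  # None while the fact is still true

    def is_valid_at(self, t: datetime) -> bool:
        """True if the fact held in the real world at time t."""
        return self.valid_at <= t and (self.invalid_at is None or t < self.invalid_at)

now = datetime(2024, 12, 19, tzinfo=timezone.utc)
fact = TemporalFact("Bob", "works at", "Google", created_at=now, valid_at=now)
```

This is what makes time-travel queries possible: a query for time `t` simply filters facts whose validity window contains `t`.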
### 🤖 AI-Powered Entity Extraction
- **Natural Language Understanding**: Just tell it in plain English - "Alice moved to San Francisco and started working at Google"
- **Automatic Relationship Discovery**: AI extracts entities and relationships without manual input
- **OpenAI Integration**: Uses GPT-4 for intelligent entity extraction
- **Graceful Degradation**: Works without AI - just add facts manually
### 🛠️ Simple Tool Extension
- **Single-File Architecture**: Everything in one `main.py` file for easy customization
- **Direct @mcp.tool() Pattern**: Add tools with a simple decorator - no config files needed
- **Single & Multi-Webhook**: Execute one webhook or fire multiple in parallel
- **Clear Custom Section**: Marked section in main.py shows exactly where to add your tools
### 🚀 Production Ready
- **Docker Support**: Complete docker-compose setup included
- **Replit Optimized**: Built specifically for Replit Autoscale environments
- **Resource Management**: Automatic session cleanup and connection pooling
- **Health Checks**: Built-in monitoring and status endpoints
- **100% Privacy-Friendly**: Your data stays in your database
---
## 🎬 How It Works
```
┌────────────────────────────────────────────────────────┐
│  1. Natural Language Input                             │
│  "Bob moved to NYC and joined Google as a PM"          │
└────────────────────────────┬───────────────────────────┘
                             │
                             ▼
┌────────────────────────────────────────────────────────┐
│  2. AI Entity Extraction (OpenAI)                      │
│     • Bob -> lives in -> NYC                           │
│     • Bob -> works at -> Google                        │
│     • Bob -> has role -> PM                            │
└────────────────────────────┬───────────────────────────┘
                             │
                             ▼
┌────────────────────────────────────────────────────────┐
│  3. Bi-Temporal Storage (FalkorDB)                     │
│     • Fact: Bob works at Google                        │
│     • created_at: 2024-12-19T10:00:00Z                 │
│     • valid_at: 2024-12-19T10:00:00Z                   │
│     • invalid_at: null (still true)                    │
└────────────────────────────┬───────────────────────────┘
                             │
                             ▼
┌────────────────────────────────────────────────────────┐
│  4. Query Anytime                                      │
│     • "Where does Bob work now?" → Google              │
│     • "What was Bob's job history?" → All past jobs    │
│     • "Where did Bob live in 2023?" → Historical data  │
└────────────────────────────────────────────────────────┘
```
---
## 📸 Screenshots
### Memory in Action

### AI Entity Extraction

### Dynamic Tool Generation

### Temporal Queries

---
## 🎥 Video Tutorial
**Watch the complete setup and usage guide:**
[Watch on YouTube](YOUR_YOUTUBE_VIDEO_LINK)
**Topics covered:**
- Installation & setup (0:00)
- Adding your first facts (2:30)
- Using AI entity extraction (5:15)
- Creating automation tools (8:45)
- Temporal queries (12:20)
- Deployment to production (15:00)
---
## 🚀 Quick Start
### Option 1: Docker Compose (Recommended)
```bash
# 1. Download and extract
wget https://github.com/YOUR_USERNAME/bitemporal-mcp-server/archive/main.zip
unzip main.zip
cd bitemporal-mcp-server-main
# 2. Configure
echo "OPENAI_API_KEY=sk-your-key" > .env
# 3. Start everything (FalkorDB + MCP Server)
docker-compose up -d
# 4. Verify it's running
curl http://localhost:8080/health
```
**That's it! 🎉** Your server is now running at `http://localhost:8080/sse`
### Option 2: Python (Local Development)
```bash
# 1. Install dependencies
pip install -r requirements.txt
# 2. Configure
cp .env.example .env
# Edit .env with your settings
# 3. Start FalkorDB (Docker)
docker run -d -p 6379:6379 falkordb/falkordb:latest
# 4. Run the server
python main.py
```
### Option 3: One-Click Deploy
[Run on Replit](https://replit.com/github/YOUR_USERNAME/bitemporal-mcp-server)
---
## 🛠️ Adding Custom Automation Tools
Add your custom automation tools directly in `main.py` in the **CUSTOM AUTOMATION TOOLS** section.
### Step 1: Find the Custom Tools Section
Open `main.py` and scroll to around **line 800** - look for this clearly marked section:
```python
# =============================================================================
#
#  ██████╗██╗   ██╗███████╗████████╗ ██████╗ ███╗   ███╗
# ██╔════╝██║   ██║██╔════╝╚══██╔══╝██╔═══██╗████╗ ████║
# ██║     ██║   ██║███████╗   ██║   ██║   ██║██╔████╔██║
# ██║     ██║   ██║╚════██║   ██║   ██║   ██║██║╚██╔╝██║
# ╚██████╗╚██████╔╝███████║   ██║   ╚██████╔╝██║ ╚═╝ ██║
#  ╚═════╝ ╚═════╝ ╚══════╝   ╚═╝    ╚═════╝ ╚═╝     ╚═╝
#
# AUTOMATION TOOLS
#
# ADD YOUR CUSTOM AUTOMATION TOOLS BELOW
#
# =============================================================================
```
This is where you'll add your webhook tools using the `@mcp.tool()` decorator.
### Step 2: Add Your Tool
Add a decorated async function with `@mcp.tool()`:
```python
@mcp.tool()
async def send_slack_notification(message: str, channel: str = "#general") -> str:
    """Send a notification to Slack."""
    import httpx

    payload = {"text": message, "channel": channel}
    url = "https://hooks.slack.com/services/YOUR/WEBHOOK/URL"
    async with httpx.AsyncClient() as client:
        try:
            resp = await client.post(url, json=payload)
            return f"Success: Slack notification sent ({resp.status_code})"
        except Exception as e:
            return f"Error: {str(e)}"
```
The function's **docstring becomes the tool description** that the AI sees.
### Step 3: Restart the Server
Restart the MCP server to load your new tools.
### Example: LinkedIn Poster Tools
The Automation Engine App generates tools like this:
```python
@mcp.tool()
async def linkedin_post_image(caption: str, imageurl: str) -> str:
    """Posts an image with a caption to your LinkedIn page."""
    import httpx

    payload = {"caption": caption, "imageUrl": imageurl}
    url = "https://webhook.latenode.com/YOUR/WEBHOOK/URL"
    async with httpx.AsyncClient() as client:
        try:
            resp = await client.post(url, json=payload)
            return f"Success: LinkedIn image posted ({resp.status_code})"
        except Exception as e:
            return f"Error: {str(e)}"
```
### Example: Multi-Webhook Broadcast
Fire multiple webhooks in parallel:
```python
@mcp.tool()
async def broadcast_alert(message: str) -> str:
    """Send alerts to multiple platforms in parallel."""
    import asyncio
    import httpx

    webhooks = [
        ("https://hooks.slack.com/...", {"text": message}),
        ("https://discord.com/api/webhooks/...", {"content": message}),
    ]

    async def send(url, data):
        async with httpx.AsyncClient() as client:
            return await client.post(url, json=data)

    results = await asyncio.gather(*[send(url, data) for url, data in webhooks])
    return f"Broadcast complete: {len(results)} webhooks fired"
```
---
## 📚 API Reference - Memory Tools
All available MCP tools for managing your knowledge graph:
### Core Memory Operations
#### `add_fact`
Add a new fact to the knowledge graph with bi-temporal tracking.
```python
await add_fact(
    source_entity="John",
    relation="works at",
    target_entity="Google",
    group_id="my_org",               # Optional
    session_id="session_123",        # Optional
    valid_at="2024-01-15T00:00:00Z"  # Optional - when fact became true
)
```
**Smart Conflict Resolution:** When adding location or employment facts, previous facts of the same type are automatically invalidated.
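
The invalidation rule above can be sketched with a toy in-memory store (this is not the FalkorDB-backed logic in `main.py`; the set of "exclusive" relation names is an assumption for illustration):

```python
from datetime import datetime, timezone

# Relations where a new fact supersedes any previous one for the same entity
EXCLUSIVE_RELATIONS = {"lives in", "works at"}

def add_fact(store: list, source: str, relation: str, target: str, now: datetime) -> dict:
    """Append a fact, invalidating prior facts of the same exclusive type."""
    if relation in EXCLUSIVE_RELATIONS:
        for fact in store:
            if (fact["source"] == source and fact["relation"] == relation
                    and fact["invalid_at"] is None):
                fact["invalid_at"] = now  # the old fact stops being true now
    new = {"source": source, "relation": relation, "target": target,
           "valid_at": now, "invalid_at": None}
    store.append(new)
    return new

store = []
t1 = datetime(2024, 1, 1, tzinfo=timezone.utc)
t2 = datetime(2024, 6, 1, tzinfo=timezone.utc)
add_fact(store, "Bob", "works at", "Google", t1)
add_fact(store, "Bob", "works at", "OpenAI", t2)
# The Google fact is now closed at t2; the OpenAI fact remains open
```

Note that the superseded fact is never deleted: it stays queryable as history, which is what `query_at_time` relies on.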
#### `add_message`
Add a natural language message and automatically extract entities using AI.
```python
await add_message(
    content="Alice moved to San Francisco and started working at OpenAI",
    session_id="session_123",
    group_id="my_org",     # Optional
    extract_entities=True  # Uses OpenAI for extraction
)
```
**Returns:** Extracted entities and relationships as facts.
#### `query_facts`
Query facts from the knowledge graph.
```python
await query_facts(
    entity_name="John",     # Optional - filter by entity
    group_id="my_org",      # Optional
    include_invalid=False,  # Include invalidated facts
    max_facts=20
)
```
#### `query_at_time`
Time-travel query - get facts valid at a specific point in time.
```python
await query_at_time(
    timestamp="2024-01-15T00:00:00Z",
    entity_name="John",  # Optional
    group_id="my_org",   # Optional
    max_facts=20
)
```
**Use Case:** "Where did John work in January 2024?"
#### `get_episodes`
Get recent conversation sessions/episodes.
```python
await get_episodes(
    group_ids=["my_org"],  # Optional
    max_episodes=10
)
```
#### `clear_graph`
Clear all data for specified groups. **Warning: Permanent deletion!**
```python
await clear_graph(
    group_ids=["my_org"]  # Optional - defaults to DEFAULT_GROUP_ID
)
```
### Server Management
#### `get_status`
Get comprehensive server status and statistics.
```python
await get_status()
# Returns: node counts, relationship types, session stats, connection status
```
#### `force_cleanup`
Manually trigger cleanup of expired sessions and idle connections.
```python
await force_cleanup()
# Returns: cleanup statistics
```
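
Under the hood, session cleanup amounts to evicting sessions that have been idle longer than the TTL (30 minutes, per the changelog). A rough in-memory sketch of that mechanism (illustrative only; the real cleanup manager lives in `main.py` and may differ):

```python
import time
from typing import Dict, Optional

SESSION_TTL_SECONDS = 30 * 60  # 30-minute idle timeout

class SessionStore:
    """Tracks last-touch time per session and evicts expired ones."""

    def __init__(self) -> None:
        self._last_seen: Dict[str, float] = {}

    def touch(self, session_id: str, now: Optional[float] = None) -> None:
        """Record activity for a session."""
        self._last_seen[session_id] = time.time() if now is None else now

    def cleanup(self, now: Optional[float] = None) -> int:
        """Remove sessions idle longer than the TTL; return eviction count."""
        now = time.time() if now is None else now
        expired = [sid for sid, seen in self._last_seen.items()
                   if now - seen > SESSION_TTL_SECONDS]
        for sid in expired:
            del self._last_seen[sid]
        return len(expired)

store = SessionStore()
store.touch("fresh", now=1000.0)   # active recently
store.touch("stale", now=0.0)      # idle well past the TTL
evicted = store.cleanup(now=1900.0)
```

Calling `force_cleanup` is simply an on-demand trigger for this kind of sweep, rather than waiting for the background manager.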
---
## 💡 Use Cases
### Personal Knowledge Management
Track your life events, relationships, and locations with full history:
```python
await add_message(
    "I met Sarah at the tech conference. She works at OpenAI.",
    session_id="my_life"
)
# Later: "Where did I meet Sarah?" → "At the tech conference"
```
### Customer Relationship Management
Monitor customer interactions with automatic conflict resolution:
```python
await add_fact("CustomerA", "status", "premium")
# Automatically invalidates previous "status" facts
# Query history: "What was CustomerA's status in January?"
```
### AI Agent Memory
Give your AI agents persistent, queryable memory:
```python
# Agent learns from conversation
await add_message(
    "User prefers morning meetings and uses Slack",
    session_id="agent_123"
)
# Agent recalls later: "What are the user's preferences?"
```
### Workflow Automation
Combine knowledge with actions:
```python
# When a fact changes, trigger automation
if customer_upgraded_to_premium:
    await notify_sales_team(customer_name=name)
    await update_crm(customer_id=id, tier="premium")
```
---
## ❓ Frequently Asked Questions
### Q: Does this require OpenAI?
**A:** No! OpenAI is optional for AI entity extraction. You can add facts manually without it.
### Q: Can I use this with Claude Desktop?
**A:** Yes! Add the server URL to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "knowledge-graph": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```
### Q: How do I query historical data?
**A:** Use the `query_at_time` tool:
```python
await query_at_time(
    timestamp="2024-01-15T00:00:00Z",
    entity_name="John"
)
```
### Q: Can I deploy this to production?
**A:** Absolutely! See [DEPLOYMENT.md](./DEPLOYMENT.md) for guides on:
- Replit Autoscale
- Railway
- Render
- Fly.io
- Docker
- VPS
### Q: How does fact invalidation work?
**A:** When you add a fact about location or employment, the system automatically finds previous facts of the same type and marks them as `invalid_at: current_time`. Your query results only show current facts unless you specifically request historical data.
### Q: Can I create multi-webhook tools?
**A:** Yes! Add a tool to the Custom Tools section in `main.py` using `asyncio.gather()` to fire multiple webhooks simultaneously. See the [Adding Custom Tools](#-adding-custom-automation-tools) section for examples.
### Q: Is my data secure?
**A:** Yes! Everything runs in your infrastructure. No data is sent anywhere except:
- OpenAI (only if you use entity extraction)
- Your configured webhooks (only when you call them)
### Q: How much does it cost to run?
**A:** Free for self-hosting! Only costs:
- FalkorDB hosting (free tier available)
- OpenAI API usage (optional, ~$0.001 per extraction)
---
## 📝 Changelog
### [1.0.0] - 2024-12-19
#### Added
- ✅ Full bi-temporal tracking (`created_at`, `valid_at`, `invalid_at`, `expired_at`)
- ✅ Smart conflict resolution for location and employment changes
- ✅ Session-aware episodic memory with 30-minute TTL
- ✅ OpenAI-powered entity extraction from natural language
- ✅ Dynamic tool generator for automation workflows
- ✅ Single webhook tool template
- ✅ Multi-webhook parallel execution template
- ✅ Docker and Docker Compose support
- ✅ Replit Autoscale optimization
- ✅ Background cleanup manager
- ✅ Comprehensive documentation and examples
#### Supported Features
| Feature | Status | Notes |
|---------|--------|-------|
| Bi-Temporal Tracking | ✅ | Full implementation |
| AI Entity Extraction | ✅ | OpenAI GPT-4 |
| Smart Invalidation | ✅ | Location, employment, relationships |
| Session Management | ✅ | Auto-cleanup after 30 min |
| Custom Tools | ✅ | Single & multi-webhook via @mcp.tool() |
| Parallel Webhooks | ✅ | asyncio.gather |
| Docker Support | ✅ | Complete stack included |
| Health Checks | ✅ | Built-in monitoring |
---
## 🆘 Support
### Need Help?
1. **Check Documentation**: Start with [QUICKSTART.md](./QUICKSTART.md)
2. **Join Community**: [High Ticket AI Builders](https://www.skool.com/knowledge-engineering-hub-9993) - Free access!
3. **Watch Tutorial**: [Video Guide](YOUR_YOUTUBE_VIDEO_LINK)
4. **Report Bugs**: [GitHub Issues](https://github.com/YOUR_USERNAME/bitemporal-mcp-server/issues)
---
## 🔧 Optional: Automation Engine OS
**Need a visual tool to orchestrate your workflows?**
If you want to manage webhook configurations, generate tools automatically, and orchestrate complex workflows without writing code, check out **Automation Engine OS** - it's free when you join our community!
**What Automation Engine OS provides:**
- Visual webhook configuration builder
- Automatic MCP tool code generation
- Workflow orchestration dashboard
- Multi-webhook template management
- One-click tool deployment to your MCP server
**Get free access:** [Join High Ticket AI Builders](https://www.skool.com/knowledge-engineering-hub-9993)
**Note:** Automation Engine OS is completely optional. This MCP server works standalone - you can manually add tools to the Custom Tools section in `main.py` as shown in the [Adding Custom Automation Tools](#-adding-custom-automation-tools) section.
---
## 🤝 Contributing
Contributions are welcome! Areas for improvement:
- 🔍 Additional temporal query operators
- 🧠 Enhanced entity extraction prompts
- 🔧 More webhook authentication methods
- 📈 Performance optimizations
- 🌐 Additional deployment platforms
- 📚 More examples and tutorials
**To contribute:**
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
---
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
**TL;DR:** You can use this commercially, modify it, distribute it. Just keep the license notice.
---
## 🙏 Acknowledgments
- Built with [FastMCP](https://github.com/jlowin/fastmcp)
- Powered by [FalkorDB](https://www.falkordb.com/)
- AI features via [OpenAI](https://openai.com/)
- Inspired by the High Ticket AI Builders community
---
## ⭐ Star History
[Star History Chart](https://star-history.com/#YOUR_USERNAME/bitemporal-mcp-server&Date)
---
## 🔗 Connect
- 💬 **Community**: [High Ticket AI Builders](https://www.skool.com/knowledge-engineering-hub-9993)
- 📅 **Want this implemented for your business?** [Book a Meeting](https://calendly.com/aiagentready/meeting)
---
<div align="center">
**Built with ❤️ for the High Ticket AI Builders ecosystem**

If this project helps you, please consider giving it a ⭐!

[⬆ Back to Top](#bi-temporal-knowledge-graph-mcp-server)
</div>