The Obsidian Diary MCP Server enables AI-powered smart journaling within Obsidian by automating entry creation, content management, and intelligent backlinking.
**Core Features:**

- Generate diary templates with AI-powered reflection prompts based on analysis of recent entries and writing patterns
- Save diary entries with automatic intelligent backlink generation to related entries using AI theme detection
- Read existing entries by date in `YYYY-MM-DD` format
- List recent entries with a configurable count to track journaling history
- Update backlinks for individual entries, or refresh all backlinks across the diary based on current content
- Create meaningful connections between entries through automatic `[[YYYY-MM-DD]]`-format backlinks that link thematically related diary entries
- Integrates with GitHub Copilot CLI to enable natural-language commands for creating diary templates and journaling assistance
- Provides smart journaling capabilities for Obsidian vaults with AI-powered reflection prompts, automatic backlink generation between diary entries, and adaptive templates that learn from writing patterns
# Obsidian Diary MCP Server

AI-powered journaling with local processing, automatic backlinks, and smart prompts.
## Features

- AI-generated reflection prompts based on the past 3 calendar days
- Day citations with automatic `[[YYYY-MM-DD]]` backlinks
- Brain dump prioritization (analyzes your writing, not the prompts)
- Smart `#tag` extraction using theme similarity
- Todo extraction to organized checklists
- Memory trace analysis with theme evolution
- Sunday synthesis (weekly reflection prompts)
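The day-citation feature above can be sketched as follows. `resolve_day_citations` is a hypothetical helper, not the server's actual API, and it assumes `[Day N]` refers to N calendar days before the entry date:

```python
import re
from datetime import date, timedelta

def resolve_day_citations(text: str, entry_date: date) -> str:
    """Replace AI citations like [Day 1] with [[YYYY-MM-DD]] backlinks.

    Assumption: Day N means N calendar days before the entry date.
    """
    def to_backlink(m: re.Match) -> str:
        n = int(m.group(1))
        return f"[[{entry_date - timedelta(days=n)}]]"  # date formats as ISO YYYY-MM-DD

    return re.sub(r"\[Day (\d+)\]", to_backlink, text)

print(resolve_day_citations("Felt better than [Day 1].", date(2025, 10, 8)))
# → Felt better than [[2025-10-07]].
```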
## Requirements

- uv (Python package manager)
- Ollama (llama3.1 or a compatible model)
- An MCP client (e.g., GitHub Copilot CLI)
- An Obsidian vault (for markdown files)
## Setup

1. Clone and install.
2. Configure: edit `.env` and set `DIARY_PATH` and `PLANNER_PATH` (required).
3. Add the server to your MCP client config (e.g., GitHub Copilot CLI):
   - Name: `diary`
   - Command: `/full/path/to/obsidian-diary-mcp/start-server.sh`
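Many MCP clients accept a JSON server definition. The shape below is an assumption for illustration only; the exact schema and file location depend on your client, so check its documentation:

```json
{
  "mcpServers": {
    "diary": {
      "command": "/full/path/to/obsidian-diary-mcp/start-server.sh"
    }
  }
}
```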
**Configuration (`.env`):**

- Required: `DIARY_PATH`, `PLANNER_PATH`
- Optional: `OLLAMA_MODEL` (default: `llama3.1:latest`), `OLLAMA_TIMEOUT` (default: 60 s), `OLLAMA_TEMPERATURE` (default: 0.7), `OLLAMA_NUM_PREDICT` (default: 1000 tokens)
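A complete `.env` using the defaults listed above; the two paths are placeholders for your own vault locations:

```ini
# Required
DIARY_PATH=/path/to/vault/diary
PLANNER_PATH=/path/to/vault/planner

# Optional (defaults shown)
OLLAMA_MODEL=llama3.1:latest
OLLAMA_TIMEOUT=60
OLLAMA_TEMPERATURE=0.7
OLLAMA_NUM_PREDICT=1000
```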
## Usage

- **Create:** "create a memory log for today" → AI prompts based on the past 3 days
- **Write:** open the entry in Obsidian and write freely in the Brain Dump section
- **Extract:** "extract todos from today's entry" → action items go to the planner
- **Link:** "link today's memory log" → auto-generates `[[YYYY-MM-DD]]` backlinks and `#tags`
- **Explore:** use Obsidian's backlinks panel and graph view

**More commands:** "show themes from last week", "create memory trace for 30 days", "refresh memory links for 30 days"
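The server uses AI to identify action items; as a rough illustration of the output shape only, here is a simplified, non-AI sketch that collects unchecked markdown checklist items (`extract_todos` is hypothetical, not the server's API):

```python
import re

def extract_todos(entry_md: str) -> list[str]:
    """Collect unchecked '- [ ]' checklist items from a markdown entry.

    Simplified stand-in: the real server asks the AI model to find
    action items in free-form brain-dump text, not just checklists.
    """
    return re.findall(r"^- \[ \] (.+)$", entry_md, flags=re.MULTILINE)

entry = "Brain Dump\nShip the report.\n- [ ] email Sam\n- [x] done already\n- [ ] book dentist"
print(extract_todos(entry))  # → ['email Sam', 'book dentist']
```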
## Debugging

Logs are written to the `logs/` directory: `server-YYYY-MM-DD.log` (protocol) and `debug-YYYY-MM-DD.log` (operations).
## Troubleshooting

- **Server issues:** check that `.env` exists with `DIARY_PATH` and `PLANNER_PATH` set. Run `./start-server.sh` directly to test.
- **Ollama issues:** verify Ollama is running with `curl http://localhost:11434/api/tags`. Pull the model: `ollama pull llama3.1:latest`
- **No backlinks:** you need 2+ entries with similar themes (>8% overlap). Ensure the Brain Dump section has substantial content (>50 chars). Check: `grep "Brain Dump" logs/debug-*.log`
- **Timeouts:** increase `OLLAMA_TIMEOUT` (90+) and `OLLAMA_NUM_PREDICT` (2000+) for reasoning models.
## How It Works

- **Local AI:** Ollama processes entries locally; content never leaves your machine
- **Calendar-based:** analyzes the past 3 calendar days (not just the last 3 entries)
- **Brain Dump focus:** prioritizes your writing over answered prompts when detecting themes
- **Day citations:** the AI cites `[Day 1]`/`[Day 2]`, which are converted to `[[2025-10-07]]`-style backlinks
- **Smart linking:** Jaccard similarity connects entries with >8% theme overlap
- **Sundays:** 5 weekly synthesis prompts (vs. 3 daily)
- **Todo extraction:** AI identifies action items from brain dumps
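The Jaccard-similarity linking rule above can be illustrated with a minimal sketch. Only the 8% threshold comes from this document; the theme sets below are made up for the example:

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity of two theme sets: |A ∩ B| / |A ∪ B|."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

THRESHOLD = 0.08  # entries are linked when theme overlap exceeds 8%

themes_today = {"sleep", "focus", "running", "work"}
themes_past = {"sleep", "focus", "diet"}
score = jaccard(themes_today, themes_past)
print(score, score > THRESHOLD)  # → 0.4 True (these two entries would be linked)
```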
## Entry Format

Each entry (`YYYY-MM-DD.md`) uses plain text headers:
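The header template itself is not reproduced here; as a hypothetical sketch of what an entry might look like (only the Brain Dump section name is confirmed elsewhere in this document, the rest is illustrative):

```
2025-10-07

Brain Dump
Free-form writing goes here...

Todos
- [ ] extracted action items end up in the planner
```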
## License

MIT • Python 3.13+ • FastMCP 2.12.4+ • Ollama