email-insights
An MCP server that exposes email signal analytics to Claude Desktop, with a background worker for scheduled async extraction jobs and structured logging.
Project Structure
email-insights/
├── data/
│ └── emails.csv # Raw email data (id, from, subject, body, date)
├── database/
│ └── signals.db # SQLite database (created after running ingestion)
├── ingestion/
│ ├── parse_csv.py # Step 1: Load emails from CSV
│ ├── extract_signals.py # Step 2: Call local LLM to extract signals
│ └── store_signals.py # Step 3: Write signals to SQLite (run this)
├── logs/
│ └── worker.log # Rotating log file (auto-created, 5 MB max, 3 backups)
├── mcp_server/
│ ├── server.py # MCP server: registers tools and starts listening
│ └── tools.py # SQLite query functions + job scheduling tools
├── utils/
│ └── logger.py # Shared structured logger (stderr + rotating file)
├── worker/
│ └── job_runner.py # Background worker: polls SQLite and runs extraction jobs
├── requirements.txt
└── README.md

Setup
1. Install dependencies
pip install -r requirements.txt

2. Start LM Studio
Open LM Studio and load any instruction-following model (Llama 3, Mistral, etc.)
Start the local server: Local Server → Start Server
Default URL: http://127.0.0.1:10101
Copy the model identifier string and paste it into ingestion/extract_signals.py as LOCAL_MODEL
3. Run the ingestion pipeline
python ingestion/store_signals.py

This reads data/emails.csv, sends each email to your local LLM for signal extraction,
and stores the results in database/signals.db.
4. Start the background worker
The worker is a separate process that polls for scheduled extraction jobs. Run it in a dedicated terminal:
python worker/job_runner.py

The worker logs all activity to logs/worker.log and to stderr. It polls SQLite every 10 seconds and picks up any pending or due-scheduled jobs automatically.
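As a sketch of that polling step (claim_due_jobs is a hypothetical helper, and the claiming query is an assumption based on the jobs schema shown later in this README):

```python
import sqlite3
import time

POLL_INTERVAL = 10  # seconds, matching the worker's documented cadence

def claim_due_jobs(conn):
    """Return job ids that are pending, or scheduled with run_at in the past."""
    rows = conn.execute(
        """
        SELECT job_id FROM jobs
        WHERE status = 'pending'
           OR (status = 'scheduled' AND run_at <= datetime('now'))
        """
    ).fetchall()
    claimed = []
    for (job_id,) in rows:
        # Mark as running before processing so the next poll won't re-claim it.
        conn.execute("UPDATE jobs SET status = 'running' WHERE job_id = ?", (job_id,))
        claimed.append(job_id)
    conn.commit()
    return claimed

# The worker's main loop would then be roughly:
# while True:
#     for job_id in claim_due_jobs(conn):
#         run_extraction(job_id)   # hypothetical: sends each email to the LLM
#     time.sleep(POLL_INTERVAL)
```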
5. Connect Claude Desktop
Add this server to your Claude Desktop config:
Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
{
"mcpServers": {
"email-insights": {
"command": "python",
"args": ["/absolute/path/to/email-insights/mcp_server/server.py"]
}
}
}

Restart Claude Desktop. You should see email-insights in the tools list.
MCP Tools
Query tools
| Tool | Description |
| --- | --- |
| | Query signals with optional date/topic/tone filters |
| | Count of emails per topic category |
| | Breakdown by sender type with urgency stats |
| | Search signals by keyword |
Job scheduling tools
| Tool | Description |
| --- | --- |
| schedule_extraction_tool | Create an extraction job — runs now, at a scheduled time, or at midnight |
| check_job_status_tool | Get real-time progress for a job (updates after every email) |
| | Requeue only the emails that failed in a previous job |
All scheduling tools return immediately. Extraction runs asynchronously in the worker process.
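The create-a-record-and-return-immediately pattern might look like the following; schedule_extraction here is a hypothetical stand-in for the actual tool, with column names taken from the jobs schema below:

```python
import json
import sqlite3  # callers open database/signals.db with this module

def schedule_extraction(conn, run_at=None):
    """Create a job record and return at once; the worker does the rest.

    run_at=None means status 'pending' (picked up on the worker's next poll);
    an ISO datetime string means 'scheduled' for that time.
    """
    status = "pending" if run_at is None else "scheduled"
    cur = conn.execute(
        "INSERT INTO jobs (status, run_at) VALUES (?, ?)", (status, run_at)
    )
    conn.commit()
    # No waiting on extraction here -- just hand back the job id for polling.
    return json.dumps({"job_id": cur.lastrowid, "status": status})
```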
schedule_extraction_tool run modes
| | Behavior | |
| --- | --- | --- |
| | Worker picks it up on the next poll (default) | not used |
| | Runs at a specific time | |
| | Runs tonight at 00:00:00 | not used |
Architecture
Claude Desktop ──stdio──▶ mcp_server/server.py
│
mcp_server/tools.py
│
SQLite signals.db
│
worker/job_runner.py ◀── runs separately
│
LM Studio (local LLM)

The MCP server and worker are two completely separate processes that share only the SQLite database. The MCP server never waits for extraction to finish — it creates a job record and returns immediately. The worker owns all writes to the jobs and failed_extractions tables (status updates, progress, failures); the MCP server only reads job status.
SQLite Schema
CREATE TABLE signals (
id INTEGER PRIMARY KEY AUTOINCREMENT,
email_id TEXT UNIQUE,
topic TEXT, -- job application | recruiter outreach | rejection | interview | networking | other
tone TEXT, -- positive | neutral | negative
sender_type TEXT, -- recruiter | company HR | networking contact | university | other
urgency TEXT, -- high | medium | low
requires_action INTEGER, -- 0 or 1
date TEXT -- ISO format: YYYY-MM-DD
);
CREATE TABLE jobs (
job_id INTEGER PRIMARY KEY AUTOINCREMENT,
schema_id INTEGER,
status TEXT NOT NULL DEFAULT 'pending', -- pending | scheduled | running | completed | failed
run_at TEXT, -- ISO datetime; NULL means run immediately
total_emails INTEGER DEFAULT 0,
processed_emails INTEGER DEFAULT 0,
created_at TEXT DEFAULT (datetime('now')),
completed_at TEXT,
error_message TEXT,
retry_of_job_id INTEGER -- set for retry jobs; links back to source job
);
CREATE TABLE failed_extractions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
job_id INTEGER NOT NULL,
email_id TEXT NOT NULL,
error_message TEXT,
created_at TEXT DEFAULT (datetime('now'))
);

Both jobs and failed_extractions are created automatically on first use — no manual migration needed.
Structured Logging
All worker activity is written to logs/worker.log (created automatically) and to stderr.
Log format:
[2026-03-05 14:22:01] [INFO] Worker started, polling every 10 seconds
[2026-03-05 14:22:11] [INFO] Job 1 picked up: schema_id=None, 10 emails to process
[2026-03-05 14:22:13] [INFO] [1/10] email_id=e001 extracted: topic=recruiter outreach, tone=positive
[2026-03-05 14:22:14] [WARNING] [2/10] email_id=e002 retrying after error: JSONDecodeError
[2026-03-05 14:22:16] [ERROR] [2/10] email_id=e002 failed after retry, saved to failed_extractions
[2026-03-05 14:22:45] [INFO] Job 1 completed in 34.2s: 9 success, 1 failed

The log file rotates at 5 MB and keeps the last 3 files (worker.log, worker.log.1, worker.log.2).
What to Learn from the Code
mcp_server/server.py
- FastMCP("email-insights") — creates the server instance with a display name
- @mcp.tool() — registers the decorated function as a callable MCP tool
- Docstrings matter — Claude reads them to decide when and how to call each tool
- Type hints — FastMCP uses them to build the JSON input schema Claude receives
- mcp.run() — starts the stdio loop; Claude Desktop communicates via stdin/stdout
mcp_server/tools.py
- Completely separate from MCP — plain Python functions returning JSON strings
- Parameterized SQL queries prevent injection: WHERE topic LIKE ? with params
- sqlite3.Row factory lets you access columns by name: row["topic"]
- _ensure_jobs_tables() uses CREATE TABLE IF NOT EXISTS — safe to call on every tool invocation
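Putting those ideas together, one of the query functions might be sketched as follows; query_by_topic is a hypothetical name, and the columns come from the signals schema above:

```python
import json
import sqlite3

def query_by_topic(conn, topic_pattern):
    # The ? placeholder keeps user input out of the SQL text (no injection),
    # and sqlite3.Row gives name-based access instead of positional indexing.
    conn.row_factory = sqlite3.Row
    rows = conn.execute(
        "SELECT email_id, topic, urgency FROM signals WHERE topic LIKE ?",
        (topic_pattern,),
    ).fetchall()
    # Tools return JSON strings, which the MCP layer passes back to Claude.
    return json.dumps(
        [{"email_id": r["email_id"], "topic": r["topic"], "urgency": r["urgency"]}
         for r in rows]
    )
```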
worker/job_runner.py
- Polls SQLite every 10 seconds — no message broker needed, just a shared DB
- PRAGMA journal_mode=WAL allows the MCP server to read while the worker writes
- Retry logic: one re-attempt on timeout or bad JSON, then the email is recorded in failed_extractions
- processed_emails is updated after every email, so check_job_status_tool always reflects live progress
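The retry-then-record flow could be sketched like this; process_email and the extract callable are stand-ins, since the README only describes the behavior:

```python
import json
import sqlite3

def process_email(conn, job_id, email_id, extract):
    """Try extraction once, retry once on failure, then record the failure.

    `extract` stands in for the call into the local LLM; it is assumed to
    raise on timeout or raise json.JSONDecodeError on an unparseable reply.
    """
    for attempt in (1, 2):
        try:
            return extract(email_id)  # success: caller stores the signals row
        except (TimeoutError, json.JSONDecodeError) as exc:
            if attempt == 2:
                # Second failure: park the email so a later retry job
                # can requeue exactly the emails that failed.
                conn.execute(
                    "INSERT INTO failed_extractions (job_id, email_id, error_message)"
                    " VALUES (?, ?, ?)",
                    (job_id, email_id, type(exc).__name__),
                )
                conn.commit()
                return None
```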
utils/logger.py
- get_logger(name) is idempotent — safe to call from any module, no duplicate handlers
- RotatingFileHandler prevents unbounded disk growth
- Uses sys.stderr for the stream handler — sys.stdout is reserved for MCP's JSON-RPC protocol
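A get_logger along these lines (the format string is an assumption modeled on the log excerpt above):

```python
import logging
import os
import sys
from logging.handlers import RotatingFileHandler

def get_logger(name, log_file="logs/worker.log"):
    logger = logging.getLogger(name)
    if logger.handlers:
        # Already configured: return as-is, so repeated calls from
        # different modules never attach duplicate handlers.
        return logger
    logger.setLevel(logging.INFO)
    os.makedirs(os.path.dirname(log_file) or ".", exist_ok=True)
    fmt = logging.Formatter("[%(asctime)s] [%(levelname)s] %(message)s",
                            datefmt="%Y-%m-%d %H:%M:%S")
    # stderr, never stdout: stdout carries MCP's JSON-RPC messages.
    stream = logging.StreamHandler(sys.stderr)
    stream.setFormatter(fmt)
    logger.addHandler(stream)
    # Rotate the file at 5 MB, keeping a few backups to bound disk usage.
    rotating = RotatingFileHandler(log_file, maxBytes=5 * 1024 * 1024,
                                   backupCount=3)
    rotating.setFormatter(fmt)
    logger.addHandler(rotating)
    return logger
```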
ingestion/extract_signals.py
- OpenAI(base_url="http://127.0.0.1:10101/v1") — points the client at LM Studio
- Low temperature=0.1 — more deterministic output, better for structured JSON
- Strips markdown code fences the LLM might wrap around its JSON response
- Falls back to safe defaults if parsing fails — pipeline never crashes on one bad email
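The fence-stripping and safe-default fallback might be sketched as follows; parse_llm_response and DEFAULT_SIGNALS are hypothetical names, with default values chosen from the schema's categories:

```python
import json

FENCE = "`" * 3  # a markdown code fence: three backticks

# Safe defaults used when the model's reply can't be parsed as JSON.
DEFAULT_SIGNALS = {"topic": "other", "tone": "neutral", "sender_type": "other",
                   "urgency": "low", "requires_action": 0}

def parse_llm_response(text):
    """Strip an optional markdown code fence, then parse JSON with a fallback."""
    cleaned = text.strip()
    if cleaned.startswith(FENCE):
        # Drop the opening fence line (which may read "json" after the ticks)
        # and the closing fence, leaving just the JSON payload.
        cleaned = cleaned.split("\n", 1)[1] if "\n" in cleaned else ""
        if cleaned.rstrip().endswith(FENCE):
            cleaned = cleaned.rstrip()[:-3]
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError:
        # One bad email never crashes the pipeline.
        return dict(DEFAULT_SIGNALS)
```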