Igloo MCP - Snowflake MCP Server for Agentic Native Workflows
Igloo MCP is a standalone MCP server for Snowflake operations, designed for agentic native workflows with AI assistants. Built from the ground up with SnowCLI integration for maximum simplicity and performance.
✨ Features
🛡️ SQL Guardrails: Blocks write + DDL operations (INSERT, UPDATE, CREATE, ALTER, DELETE, DROP, TRUNCATE) with safe alternatives
⏱️ Timeouts + Cancellation: Per‑request timeouts with best‑effort server‑side cancel; captures query ID when available
📝 Always-On Query History: Automatically capture JSONL audit events (success, timeout, error) with SHA-indexed SQL artifacts, even outside a git repo
📦 Result Cache: Default-on CSV/JSON cache per SQL + session context for instant replays without rerunning Snowflake
📊 Auto Insights: Every successful query returns lightweight `key_metrics` + `insights` derived from the returned rows, with no extra SQL required
🧠 Smart Errors: Compact by default; turn on verbose mode for actionable optimization hints
🧩 MCP‑Only Tooling: Clean set of MCP tools for query, preview, catalog, dependency graph, health, and connection tests
✅ MCP Protocol Compliant: Standard exception‑based error handling and robust health checks
📖 See Release Notes for details.
Available MCP Tools
Igloo MCP Tools
`execute_query` - Execute SQL queries with safety checks
`preview_table` - Preview table contents with LIMIT support
`build_catalog` - Build comprehensive metadata catalog from Snowflake INFORMATION_SCHEMA
`get_catalog_summary` - Get catalog overview with object counts and statistics
`search_catalog` - Search locally built catalog artifacts for tables, views, and columns
`build_dependency_graph` - Build dependency graph for data lineage analysis
`test_connection` - Test Snowflake connection and profile validation
`health_check` - Get system health status and configuration details
See MCP Documentation for details.
Tool Overview
| Tool | Purpose |
| --- | --- |
| `execute_query` | Run SQL with validation, timeouts, cancellation |
| `preview_table` | Quick table preview without writing SQL |
| `build_catalog` | Export comprehensive Snowflake metadata |
| `get_catalog_summary` | Read catalog statistics and health |
| `search_catalog` | Search locally built catalog artifacts |
| `build_dependency_graph` | Build dependency relationships (JSON/DOT) |
| `test_connection` | Validate Snowflake connectivity |
| `health_check` | Comprehensive system, profile, and resource health |
|  | Poll asynchronous jobs and retrieve cached results |
Query Log History (doc.jsonl + SQL artifacts)
Every execution writes a compact JSONL record to logs/doc.jsonl (created on demand). If the workspace path is unavailable, Igloo falls back to ~/.igloo_mcp/logs/doc.jsonl so history is always captured. Each record references the full SQL stored once by SHA-256 under logs/artifacts/queries/by_sha/.
Configure Paths
| Purpose | Default | Override |
| --- | --- | --- |
| History file | `logs/doc.jsonl` | `IGLOO_MCP_QUERY_HISTORY` |
| Artifact root | `logs/artifacts/` | `IGLOO_MCP_ARTIFACT_ROOT` |
| Cache root | `<artifact_root>/cache/` | `IGLOO_MCP_CACHE_ROOT` |
Set these env vars to change locations. Use IGLOO_MCP_QUERY_HISTORY=disabled (or off/false/0) to disable history entirely. When disabled, neither JSONL records nor SQL artifacts are written to disk.
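For example, to keep history and artifacts in a stable per-user location (the same paths used in the global MCP configuration later in this README):

```bash
export IGLOO_MCP_QUERY_HISTORY=~/.igloo_mcp/logs/doc.jsonl
export IGLOO_MCP_ARTIFACT_ROOT=~/.igloo_mcp/logs/artifacts
```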
Logged Fields (per line)
- `ts` — Unix timestamp (seconds)
- `execution_id` — Stable UUID per request (ties history, cache, and audit info together)
- `status` — `success` | `timeout` | `error`
- `profile` — Snowflake profile used
- `statement_preview` — First 200 characters of the SQL
- `timeout_seconds` — Effective timeout applied
- `sql_sha256` — SHA-256 digest of the full SQL text
- `artifacts` — `{ "sql_path": "logs/artifacts/queries/by_sha/<sha>.sql" }`
- `rowcount`, `duration_ms`, `query_id` — When available (success only)
- `overrides` — Session overrides `{ warehouse, database, schema, role }`
- `reason` — Optional short reason (also stored in Snowflake `QUERY_TAG`)
- `post_query_insight` — Optional structured insight summarising what the query discovered
- `key_metrics`, `insights` — Automatically generated summaries of the returned rows (non-null ratios, numeric ranges, categorical top values, etc.)
- `cache_key`, `cache_manifest` — Present on cache hits/saves for traceability
- `session_context` — Effective warehouse/database/schema/role used for execution
- `error` — Error message (timeout/error only)
Examples
Success:
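An illustrative success record (all values are placeholders; a real record is a single JSONL line, wrapped here for readability):

```json
{"ts": 1735689600, "execution_id": "6f8c1e4a-....", "status": "success", "profile": "quickstart",
 "statement_preview": "SELECT region, COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS GROUP BY region",
 "timeout_seconds": 120, "sql_sha256": "3a7f....",
 "artifacts": {"sql_path": "logs/artifacts/queries/by_sha/3a7f.....sql"},
 "rowcount": 12, "duration_ms": 843, "query_id": "01b2c3d4-....",
 "session_context": {"warehouse": "COMPUTE_WH", "database": "ANALYTICS", "schema": "PUBLIC", "role": "ANALYST"}}
```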
Timeout (server-side cancel attempted):
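An illustrative timeout record (placeholders again; `query_id` may be absent, per the notes below):

```json
{"ts": 1735689600, "execution_id": "9d2b7c10-....", "status": "timeout", "profile": "quickstart",
 "statement_preview": "SELECT * FROM ANALYTICS.PUBLIC.EVENTS ...",
 "timeout_seconds": 30, "sql_sha256": "9e2b....",
 "artifacts": {"sql_path": "logs/artifacts/queries/by_sha/9e2b.....sql"},
 "error": "Query exceeded the 30s timeout; server-side cancel attempted"}
```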
Notes:
- Query ID may be unavailable if a timeout triggers early cancellation.
- History writes are best-effort; logging never raises to the caller.
- Full SQL is stored once by hash; use the MCP resource `igloo://queries/by-sha/{sql_sha256}.sql` or the exporter (below) to read it.
- Cached executions log the `cache_key` and manifest path so you can open the saved CSV/JSON without rerunning Snowflake.
- Use `reason` for human context only; avoid sensitive data.
Bundle SQL for Audits
Export a self-contained bundle (full SQL + minimal provenance) straight from `doc.jsonl`; you can also select by reason substring and keep only the latest run per SQL hash (see the sketch below).
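If you want a quick, hand-rolled equivalent, a plain jq/shell pass over `doc.jsonl` looks like this (this is not the bundled exporter; the reason substring and output directory are placeholders):

```bash
# Collect history lines whose reason mentions "quarterly audit" and copy the
# referenced SQL artifacts alongside them.
mkdir -p audit_bundle
jq -c 'select(.reason != null and (.reason | contains("quarterly audit")))' logs/doc.jsonl \
  > audit_bundle/doc.filtered.jsonl
jq -r '.artifacts.sql_path' audit_bundle/doc.filtered.jsonl | sort -u \
  | xargs -I{} cp {} audit_bundle/
```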
Each bundle entry includes:
- `sql_sha256`, `mcp_uri`, and the full `sql_text`
- Any `query_id`, `reason`, `rowcount`, `duration_ms`, and overrides
- Generator metadata with the selection criteria used
Result Caching (rows.jsonl + CSV)
- Successful executions (up to `IGLOO_MCP_CACHE_MAX_ROWS`, default 5,000 rows) are cached under `<artifact_root>/cache/<cache_key>/` with both `rows.jsonl` and a human-friendly `rows.csv` plus a manifest.
- Subsequent calls with the same SQL, profile, and session overrides return the cached payload instantly; Snowflake is bypassed and history records a `cache_hit`.
- Configure behaviour via: `IGLOO_MCP_CACHE_MODE=enabled|refresh|read_only|disabled`, `IGLOO_MCP_CACHE_ROOT=/custom/cache`, `IGLOO_MCP_CACHE_MAX_ROWS=2000`
- History entries include `cache_key`/`cache_manifest`, and tool responses expose `result.cache` + `audit_info.cache` so you always know when cached data was served.
- Cache manifests now persist the generated `key_metrics` + `insights`, so cache hits return the same summaries without recomputation.
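For example, to pin cache behaviour for the current shell session:

```bash
export IGLOO_MCP_CACHE_MODE=enabled        # one of: enabled | refresh | read_only | disabled
export IGLOO_MCP_CACHE_ROOT=/custom/cache
export IGLOO_MCP_CACHE_MAX_ROWS=2000
```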
Fixture-Based Regression Testing
- A deterministic cache/history scenario lives under `tests/fixtures/cache_scenarios/baseline/` (history JSONL, manifest, rows CSV/JSONL, SQL text).
- Regenerate locally via `python -m tests.helpers.cache_fixture_builder` (see the helper for details) and validate with `python -m pytest tests/test_cache_golden_fixtures.py`.
- CI consumers and log-processing scripts can rely on these fixtures to ensure compatibility with fields such as `execution_id`, `cache_key`, `post_query_insight`, and artifact paths.
Inspect Local Logs Quickly
- View latest history line – `tail -n 1 logs/doc.jsonl`
- Open full SQL text – `cat $(jq -r '.artifacts.sql_path' <<< "$(tail -n 1 logs/doc.jsonl)")`
- Check cache manifest – `jq '.' logs/artifacts/cache/<cache_key>/manifest.json`
- Disable logging/caching (when debugging) – set `IGLOO_MCP_QUERY_HISTORY=disabled` and/or `IGLOO_MCP_CACHE_MODE=disabled` for that session.
Search Built Catalogs
- Run `build_catalog` once (for example `build_catalog --output_dir ./artifacts/catalog --database ANALYTICS`).
- Query the local snapshot via `search_catalog`: `search_catalog --catalog_dir ./artifacts/catalog --object_types table --name_contains customers`
- Filter by columns (`--column_contains revenue`) or schemas (`--schema REPORTING`) to rapidly find the objects you need without hitting Snowflake.
Installation
For End Users (Recommended)
Install from PyPI for stable releases:
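Assuming the PyPI distribution is published under the same name as the project:

```bash
pip install igloo-mcp
```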
Editor Setup (Cursor, Codex, Claude Code)
Quick wiring for common MCP clients. See the full guide in docs/installation.md.
• Cursor (~/.cursor/mcp.json)
• Claude Code (settings snippet)
• Codex / Other MCP Clients (generic block; consult client docs for config path)
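All three clients use the same server block; a minimal example in Cursor's `~/.cursor/mcp.json` format (the profile name is a placeholder — see the global configuration section below for the full env options):

```json
{
  "mcpServers": {
    "igloo-mcp": {
      "command": "igloo-mcp",
      "args": ["--profile", "my-profile"],
      "env": { "SNOWFLAKE_PROFILE": "my-profile" }
    }
  }
}
```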
After editing, restart your client and run a quick smoke test: igloo-mcp --profile my-profile --help.
⚡ 5-Minute Quickstart
Get igloo-mcp running with Cursor in under 5 minutes!
Who this is for: Users new to Snowflake and MCP who want to get started quickly.
How It Works
Your LLM calls MCP tools (execute_query, preview_table, build_catalog, etc.) exposed by igloo-mcp.
igloo-mcp uses your Snowflake CLI profile for authentication and session context.
Built-in guardrails block destructive SQL; timeouts and best‑effort cancellation keep runs responsive.
Optional JSONL query history records success/timeout/error with minimal fields for auditing.
Configure your editor (Cursor or Claude Code) to launch igloo-mcp with your Snowflake profile.
Prerequisites Check (30 seconds)
What you'll need:
- Snowflake account with username/password (or ask your admin)
- Cursor installed
- Your Snowflake account identifier (looks like: `mycompany-prod.us-east-1`)
Step 1: Install igloo-mcp (1 minute)
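Install with uv or pip (the distribution name is assumed to match the project name, as above):

```bash
uv tool install igloo-mcp
# or
pip install igloo-mcp
```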
Note: igloo-mcp bundles the Snowflake CLI, so `snow --version` should succeed after installation. If it does not, check that your environment PATH includes the uv-managed scripts directory or that you’re using the same virtual environment.
Step 2: Create Snowflake Profile (2 minutes)
Recommended: use your organization's SSO (Okta) via external browser.
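A sketch of creating the profile with the Snowflake CLI; flag names may vary across `snow` versions, and running `snow connection add` with no flags walks you through interactive prompts instead:

```bash
snow connection add \
  --connection-name quickstart \
  --account mycompany-prod.us-east-1 \
  --user YOUR_USERNAME \
  --authenticator externalbrowser \
  --warehouse COMPUTE_WH
```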
Notes:
- If your org requires an explicit Okta URL, use: `--authenticator https://<your_okta_domain>.okta.com`
- If your org doesn’t use SSO, see the password fallback below
Finding your account identifier:
- Your Snowflake URL: `https://abc12345.us-east-1.snowflakecomputing.com`
- Your account identifier: `abc12345.us-east-1` (remove `.snowflakecomputing.com`)
Finding your warehouse:
- Trial accounts: Usually `COMPUTE_WH` (default warehouse)
- Enterprise: Check Snowflake UI → Admin → Warehouses, or ask your admin
- Common names: `COMPUTE_WH`, `WH_DEV`, `ANALYTICS_WH`
Don't have these? Ask your Snowflake admin for:
Account identifier
Username & password
Warehouse name
Fallback (no SSO): password authentication
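If SSO isn’t available, create the profile interactively and choose password authentication when prompted (the CLI asks for account, username, and password):

```bash
snow connection add --connection-name quickstart
```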
Step 3: Configure Cursor MCP (1 minute)
Edit ~/.cursor/mcp.json:
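A minimal block — the profile name "quickstart" comes from Step 2, and the shape matches the global configuration shown later:

```json
{
  "mcpServers": {
    "igloo-mcp": {
      "command": "igloo-mcp",
      "args": ["--profile", "quickstart"],
      "env": { "SNOWFLAKE_PROFILE": "quickstart" }
    }
  }
}
```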
Note: No `service_config.yml` needed! igloo-mcp uses Snowflake CLI profiles directly.
Restart Cursor after configuring.
Claude Code (alternative)
Add the same server block to your Claude Code MCP settings, then ask Claude to test the connection or list databases.
Codex / Other MCP Clients (alternative)
Most MCP clients support the same server block; place it in your client’s MCP config file. Consult your client’s documentation for the specific config path and restart the client after changes.
Global MCP Configuration (All Projects)
Cursor, Codex, and most MCP clients support a global configuration that applies to every workspace on your machine. Use this when you want igloo-mcp available everywhere without copying blocks into each repo.
Edit the global config:
Cursor:
~/.cursor/mcp.jsonCodex CLI:
~/.factory/mcp.jsonClaude Code:
~/Library/Application Support/Claude/mcp.json
Register igloo-mcp once:
{ "mcpServers": { "igloo-mcp": { "command": "igloo-mcp", "args": ["--profile", "quickstart"], "env": { "SNOWFLAKE_PROFILE": "quickstart", "IGLOO_MCP_QUERY_HISTORY": "~/.igloo_mcp/logs/doc.jsonl", "IGLOO_MCP_ARTIFACT_ROOT": "~/.igloo_mcp/logs/artifacts" } } } }The
envblock keeps history/artifacts in a stable global directory so every project reuses the same cache + audit trail.Switch profiles per workspace: override
SNOWFLAKE_PROFILEor pass--profilewhen launching igloo-mcp (Cursor adds per-project overrides via.cursor/mcp.json). You can also define multiple servers:"igloo-mcp-prd": { "command": "igloo-mcp", "args": ["--profile", "prod"], "env": {"SNOWFLAKE_PROFILE": "prod"} }Manage secrets centrally: export
SNOWFLAKE_PRIVATE_KEY_PATH,IGLOO_MCP_CACHE_MODE, etc., in your shell RC so every project inherits the same behaviour. The MCP config should only reference non-sensitive identifiers.
Troubleshooting tips:
- If a project needs a different artifact root, set `IGLOO_MCP_ARTIFACT_ROOT` in that repo’s `.env` or launch script—it overrides the global default.
- Use `igloo-mcp --profile <name> --health-check` to validate the profile once; all editors re-use the cached session metadata.
- When sharing machines, prefer per-user global configs under your home directory so other accounts keep separate Snowflake profiles and audit logs.
Step 4: Test Your Setup (30 seconds)
Verify Snowflake Connection
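Run the same check the troubleshooting section suggests:

```bash
snow sql -q "SELECT 1" --connection quickstart
```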
Verify MCP Server
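Confirm the server entry point launches; this mirrors the smoke test from the editor setup section:

```bash
igloo-mcp --profile quickstart --help
```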
Step 5: Test It! (30 seconds)
In Cursor, try these prompts:
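For example (exact wording is up to you):
- "Test my Snowflake connection"
- "List the databases I can access"
- "Show me the tables in <database>.<schema>"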
Expected: ✅ Connection successful message
Expected: List of your databases
Expected: List of tables (if you have access)
Success! 🎉
You've successfully:
✅ Installed igloo-mcp
✅ Configured Snowflake connection
✅ Connected Cursor to igloo-mcp
✅ Ran your first Snowflake queries via AI
Time taken: ~5 minutes
What's Next?
Explore MCP Tools
Try these prompts in Cursor:
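For example:
- "Build a metadata catalog for the ANALYTICS database and summarize it"
- "Preview 10 rows from <database>.<schema>.<table>"
- "Build a dependency graph for <database> and describe the lineage"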
Alternate: Key‑Pair (advanced)
Use RSA key‑pair auth when required by security policy or for headless automation:
Generate keys:
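The standard Snowflake key-pair commands, shown here with an unencrypted private key (use an encrypted key if your policy requires it):

```bash
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```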
Upload public key to Snowflake:
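In a Snowflake worksheet (the username and key value are placeholders; paste the contents of `rsa_key.pub` without the PEM header/footer lines):

```sql
ALTER USER YOUR_USERNAME SET RSA_PUBLIC_KEY='MIIBIjANBgkqh...';
```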
Update your profile:
Troubleshooting
"Profile not found"
Fix:
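List the profiles the Snowflake CLI knows about and make sure the name matches what you pass to igloo-mcp:

```bash
snow connection list
```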
"Connection failed"
Fix:
- Verify account format: `org-account.region` (not `https://...`)
- Check username/password are correct
- Ensure warehouse exists and you have access
- Try: `snow sql -q "SELECT 1" --connection quickstart`
"MCP tools not showing up"
Fix:
- Verify igloo-mcp is installed: `which igloo-mcp`
- Check MCP config JSON syntax is valid
- Restart Cursor completely
- Check Cursor logs for errors
"Permission denied"
Fix:
- Ensure you have `USAGE` on the warehouse
- Check database/schema access: `SHOW GRANTS TO USER <your_username>`
- Contact your Snowflake admin for permissions
"SQL statement type 'Union' is not permitted"
Fix:
- Upgrade to the latest igloo-mcp; UNION/INTERSECT/EXCEPT now inherit SELECT permissions
- If you override SQL permissions, ensure `select` remains enabled in your configuration
Still stuck?
💬 GitHub Discussions - Community help
🐛 GitHub Issues - Report bugs
📖 Full Documentation - Comprehensive guides
🔐 Authentication Options - SSO/Okta, password, key‑pair
Complete Setup Guide
For Cursor Users
See Getting Started Guide for detailed setup instructions.
MCP Server (MCP-Only Interface)
| Task | Command | Notes |
| --- | --- | --- |
| Start MCP server | `igloo-mcp` | For AI assistant integration |
| Start with profile | `igloo-mcp --profile PROFILE_NAME` | Specify profile explicitly |
| Configure | `snow connection add` | Interactive setup |
🐻‍❄️ MCP-Only Architecture: Igloo MCP is MCP-only. All functionality is available through MCP tools.
Profile Selection Options:
- Command flag: `igloo-mcp --profile PROFILE_NAME` (explicit)
- Environment variable: `export SNOWFLAKE_PROFILE=PROFILE_NAME` (session)
- Default profile: Set with `snow connection set-default PROFILE_NAME` (implicit)
Python API
Documentation
Getting Started Guide - Recommended for all users
MCP Server User Guide - Advanced MCP configuration
API Reference - All available MCP tools
Examples
Query Execution via MCP
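A sketch of a tool call as an MCP client might send it; the argument names here (`statement`, `timeout_seconds`, `reason`) are illustrative, so check the API Reference for the exact schema:

```json
{
  "tool": "execute_query",
  "arguments": {
    "statement": "SELECT region, COUNT(*) AS orders FROM ANALYTICS.PUBLIC.ORDERS GROUP BY region",
    "timeout_seconds": 60,
    "reason": "weekly order volume check"
  }
}
```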
Data Catalog Building
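Using the parameters shown in the catalog search example above (`output_dir`, `database`); database name is a placeholder:

```json
{
  "tool": "build_catalog",
  "arguments": {
    "output_dir": "./artifacts/catalog",
    "database": "ANALYTICS"
  }
}
```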
Table Preview
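Argument names here (`table_name`, `limit`) are illustrative; see the API Reference for the actual schema:

```json
{
  "tool": "preview_table",
  "arguments": {
    "table_name": "ANALYTICS.PUBLIC.ORDERS",
    "limit": 10
  }
}
```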
Testing
- Offline (default): `python -m pytest` runs the offline suite backed by stored Snowflake CLI fixtures and fake connectors. This is the command we run in CI.
- Live Snowflake checks (optional): `python -m pytest --snowflake -m requires_snowflake` after setting up credentials. Without `--snowflake`, tests marked `requires_snowflake` are skipped automatically.
Fixtures that capture sanitized Snowflake CLI output live under tests/fixtures/snowflake_cli/. Update them as the schema evolves, then rerun the offline suite to ensure coverage stays green.