SqlAugur
An MCP server that gives AI assistants safe, read-only access to SQL Server databases. Every query is parsed into a full AST using Microsoft's official T-SQL parser — not regex — so comment injection, string literal tricks, and encoding bypasses are blocked at the syntax level.
┌──────────────┐ ┌───────────────────────────────────────────┐ ┌──────────────┐
│ │ stdio │ SqlAugur │ │ │
│ AI Client │◄────────►│ │───────►│ SQL Server │
│ │ │ ┌────────────┐ ┌──────────────────────┐ │ │ │
└──────────────┘ │ │ Query │ │ Schema / Diagram / │ │ └──────────────┘
│ │ Validator │ │ DBA Services │ │
│ └────────────┘ └──────────────────────┘ │
│ ┌────────────────────────────────────┐ │
│ │ Rate Limiter │ │
│ └────────────────────────────────────┘ │
└───────────────────────────────────────────┘

Quick Start
Use this order for all install methods:
1. Install SqlAugur
2. Save appsettings.json in the correct location
3. Add SqlAugur to your MCP client config
4. Verify by asking your assistant to call list_servers
Start with Installation for exact commands and file paths.
Why This Approach
AST-level query validation — Most MCP database servers use keyword blocking or no validation at all. This project parses every query into a full syntax tree using Microsoft's official TSql180Parser. Comment injection, string literal tricks, and encoding bypasses are blocked at the syntax level, not with fragile regex patterns.

Rate limiting — Token bucket throughput limiting and concurrency control prevent runaway AI query loops from overwhelming production SQL Servers. No other MCP database server offers this.

DBA diagnostic tooling — Integrated support for First Responder Kit, DarlingData, and sp_WhoIsActive with parameter blocking that prevents write operations. This is an entirely new MCP capability category.

Response size optimisation — DBA tools exclude verbose columns (XML query plans, deadlock graphs, metric breakdowns) and truncate long strings by default, reducing response sizes by 90–99%. Use the verbose and includeQueryPlans parameters to get full untruncated output when needed.

Progressive discovery — Up to 29 tools organized into toolsets that load on demand. Only 6 core tools are exposed initially, keeping the AI's context window small and reducing token usage. Additional toolsets are discovered and enabled as needed.
Features
Security
Read-only by design — only SELECT and CTE queries are permitted
AST-based query validation using ScriptDom (not regex)
Parameter blocking on all diagnostic stored procedures to prevent writes
Concurrency and throughput rate limiting
Database Tooling
Multi-server support — named connections to multiple SQL Server instances
Schema overview — concise Markdown schema maps with PKs, FKs, constraints, and defaults
Table documentation — Markdown descriptions of columns, indexes, foreign keys, and constraints
ER diagram generation — PlantUML and Mermaid diagrams with smart cardinality detection
Schema exploration — list programmable objects, view definitions, extended properties, dependency graphs
Query plan analysis — estimated or actual XML execution plans
DBA diagnostics — optional integration with First Responder Kit, DarlingData, and sp_WhoIsActive with automatic response size optimisation
Progressive discovery — dynamic toolset mode reduces initial context window usage by exposing tools on demand
Installation
All methods produce the same MCP server. Follow this order: install, save config, wire client, verify.
NuGet Global Tool (recommended)
1. Install (prerequisite: .NET 10.0 runtime)
dotnet tool install -g SqlAugur

2. Save config file
# Linux/macOS
mkdir -p ~/.config/sqlaugur
# Edit ~/.config/sqlaugur/appsettings.json with your server connections
# Windows (PowerShell)
mkdir "$env:APPDATA\sqlaugur" -Force
# Edit %APPDATA%\sqlaugur\appsettings.json with your server connections

Example appsettings.json to save at that location:
{
"SqlAugur": {
"Servers": {
"production": {
"ConnectionString": "Server=myserver;Database=master;Integrated Security=True;TrustServerCertificate=False;Encrypt=True;"
}
}
}
}

3. Add to MCP client
{
"mcpServers": {
"sqlaugur": {
"command": "sqlaugur"
}
}
}

To update: dotnet tool update -g SqlAugur
Docker / Podman
1. Run SqlAugur container
# Volume-mount a config file
docker run -i --rm \
-v /path/to/appsettings.json:/app/appsettings.json:ro,Z \
ghcr.io/mbentham/sqlaugur:latest
# Or use environment variables (no config file needed)
docker run -i --rm \
-e SqlAugur__Servers__production__ConnectionString="Server=host.docker.internal;Database=master;..." \
ghcr.io/mbentham/sqlaugur:latest

Note: To reach a SQL Server on the host machine, use host.docker.internal (Docker Desktop) or --network=host (Linux). Replace docker with podman — all commands are identical. The :Z flag on volume mounts is required for SELinux-enabled systems (Fedora, RHEL); Docker Desktop users on macOS/Windows can omit it.
If you mount a config file, save it as /path/to/appsettings.json and mount it to /app/appsettings.json.
2. Add to MCP client
{
"mcpServers": {
"sqlaugur": {
"command": "docker",
"args": ["run", "-i", "--rm",
"-v", "/path/to/appsettings.json:/app/appsettings.json:ro,Z",
"ghcr.io/mbentham/sqlaugur:latest"]
}
}
}

Docker Compose alternative (docker-compose.yml):

services:
  sqlaugur:
    image: ghcr.io/mbentham/sqlaugur:latest
    stdin_open: true
    volumes:
      - ./appsettings.json:/app/appsettings.json:ro,Z

MCP client configuration:
{
"mcpServers": {
"sqlaugur": {
"command": "docker",
"args": ["compose", "run", "-i", "--rm", "sqlaugur"]
}
}
}

Build from Source
1. Build (prerequisite: .NET 10.0 SDK)
git clone git@github.com:mbentham/SqlAugur.git
cd SqlAugur
dotnet publish SqlAugur -c Release -o SqlAugur/publish

2. Save config file
# Linux/macOS
cp SqlAugur/appsettings.example.json SqlAugur/publish/appsettings.json
# Edit SqlAugur/publish/appsettings.json with your server connections
# Windows (PowerShell)
Copy-Item SqlAugur\appsettings.example.json SqlAugur\publish\appsettings.json
# Edit SqlAugur\publish\appsettings.json with your server connections

3. Add to MCP client
{
"mcpServers": {
"sqlaugur": {
"command": "dotnet",
"args": ["/absolute/path/to/SqlAugur/publish/SqlAugur.dll"]
}
}
}

Verify the MCP connection (LLM-first)
After restarting your MCP client, ask the assistant:
Call list_servers
Call list_databases for server "production"
Expected result:
list_servers returns your configured server name (for example production)
list_databases returns a JSON array of databases, not a connection or authentication error
If verification fails:
Confirm MCP config runs the expected command (sqlaugur, docker run ..., or dotnet /path/to/SqlAugur.dll)
Confirm appsettings.json is saved where your install method expects it:
  Local tool: ~/.config/sqlaugur/appsettings.json (Linux/macOS) or %APPDATA%\sqlaugur\appsettings.json (Windows)
  Container: mounted to /app/appsettings.json
  Source build: next to the published DLL (SqlAugur/publish/appsettings.json)
Confirm the tool call uses a configured server key (for example production)
Confirm SQL connectivity and authentication in the connection string
Configuration
The server loads configuration from multiple sources. Higher-priority sources override lower ones:
1. Command-line arguments
2. Environment variables — using __ as section delimiter (e.g., SqlAugur__Servers__production__ConnectionString=...)
3. Current working directory — appsettings.json in the directory you run the command from
4. User config directory — ~/.config/sqlaugur/appsettings.json on Linux, %APPDATA%\sqlaugur\appsettings.json on Windows
5. Azure Key Vault — when AzureKeyVaultUri is set (see below)
6. App directory — appsettings.json next to the DLL
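The precedence above can be sketched as a layered merge in which a higher-priority source simply overrides keys from lower ones. This is a toy illustration, not the actual .NET configuration binder, and the option values shown are examples only:

```python
def resolve_config(sources):
    """Merge config dicts; earlier entries in `sources` have higher priority."""
    merged = {}
    for source in reversed(sources):  # apply lowest priority first
        merged.update(source)        # higher-priority layers overwrite keys
    return merged

# Three illustrative layers, highest priority first.
env_vars = {"CommandTimeoutSeconds": 60}
user_dir = {"MaxRows": 500}
app_dir  = {"MaxRows": 1000, "CommandTimeoutSeconds": 30}

config = resolve_config([env_vars, user_dir, app_dir])
print(config)  # {'MaxRows': 500, 'CommandTimeoutSeconds': 60}
```

An environment variable thus beats the user config directory, which beats the appsettings.json shipped next to the DLL.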
Example configuration (Windows Authentication — recommended):
{
"SqlAugur": {
"Servers": {
"production": {
"ConnectionString": "Server=myserver;Database=master;Integrated Security=True;TrustServerCertificate=False;Encrypt=True;"
}
},
"MaxRows": 1000,
"CommandTimeoutSeconds": 30,
"MaxConcurrentQueries": 5,
"MaxQueriesPerMinute": 60,
"EnableFirstResponderKit": false,
"EnableDarlingData": false,
"EnableWhoIsActive": false,
"EnableDynamicToolsets": false
}
}

| Option | Default | Description |
| --- | --- | --- |
| Servers | — | Named SQL Server connections (name → connection string) |
| MaxRows | 1000 | Maximum rows returned per query |
| CommandTimeoutSeconds | 30 | SQL command timeout for all queries and procedures |
| MaxConcurrentQueries | 5 | Maximum number of SQL queries that can execute concurrently |
| MaxQueriesPerMinute | 60 | Maximum queries allowed per minute (token bucket rate limit) |
| EnableFirstResponderKit | false | Enable First Responder Kit diagnostic tools (sp_Blitz, sp_BlitzFirst, sp_BlitzCache, sp_BlitzIndex, sp_BlitzWho, sp_BlitzLock) |
| EnableDarlingData | false | Enable DarlingData diagnostic tools (sp_PressureDetector, sp_QuickieStore, sp_HealthParser, sp_LogHunter, sp_HumanEventsBlockViewer, sp_IndexCleanup, sp_QueryReproBuilder) |
| EnableWhoIsActive | false | Enable sp_WhoIsActive session monitoring |
| EnableDynamicToolsets | false | Enable progressive tool discovery — DBA tools load on demand via 3 meta-tools instead of at startup. Reduces initial context window usage. |
| AzureKeyVaultUri | — | Azure Key Vault URI |
Security Note: appsettings.json is gitignored to prevent accidental credential commits. See SECURITY.md for recommended authentication methods including Windows Authentication, Azure Managed Identity, and secure credential storage options.
Tools
The server provides 30 tools organized into toolsets. Six core tools are always available. Additional toolsets are loaded at startup (static mode) or on demand (dynamic mode).
Core Tools
| Tool | Description |
| --- | --- |
| list_servers | Lists available SQL Server instances configured in appsettings.json. |
| list_databases | Lists all databases on a named server with names, IDs, states, and creation dates. |
| read_data | Executes a read-only SQL SELECT query. Only SELECT and CTE queries pass AST validation. |
| get_query_plan | Returns the estimated or actual XML execution plan for a SELECT query. |
| get_schema_overview | Concise Markdown schema overview: tables, columns, PKs, FKs, unique/check constraints, defaults. |
| describe_table | Comprehensive table metadata in Markdown: columns, data types, nullability, defaults, identity, computed expressions, indexes, FKs, constraints. |
Schema Exploration

| Tool | Description |
| --- | --- |
| list_programmable_objects | Lists views, stored procedures, functions, and triggers. Filterable by type and schema. |
| get_object_definition | Returns the source definition (CREATE statement) of a programmable object. |
| get_extended_properties | Reads extended properties (descriptions, metadata) on tables, columns, and other objects. |
| get_object_dependencies | Shows what an object references and what references it — upstream and downstream dependency graphs. |
Diagrams

| Tool | Description |
| --- | --- |
| get_plantuml_diagram | Generates a PlantUML ER diagram with tables, columns, PKs, and FK relationships. |
| get_mermaid_diagram | Generates a Mermaid ER diagram with tables, columns, PKs, and FK relationships. |
DBA Diagnostic Tools
Each toolkit is enabled independently via config flags and requires the corresponding stored procedures installed on the target SQL Server.
All DBA tools apply response size optimisation by default — XML query plan columns are excluded and long string values are truncated to keep responses within AI context window limits. Every tool supports these optional parameters:
| Parameter | Description |
| --- | --- |
| verbose | Return all columns with no truncation. |
| includeQueryPlans | Include XML execution plan columns in the output. |
| maxRows | Maximum rows to return per result set. Available on tools with variable-length output: BlitzIndex, BlitzLock, HealthParser, LogHunter (default 200), IndexCleanup, QueryReproBuilder. |

Some tools have additional parameters: includeXmlReports (BlitzLock, HealthParser, HumanEventsBlockViewer), compact (sp_WhoIsActive), verboseMetrics (QuickieStore).
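The optimisation described above can be sketched as a per-row filter that drops plan columns and truncates long strings unless the caller opts out. This is a hypothetical illustration; the column names and the truncation limit are invented for the example:

```python
MAX_STRING = 100                               # illustrative truncation limit
PLAN_COLUMNS = {"QueryPlan", "DeadlockGraph"}  # illustrative column names

def shrink_row(row, verbose=False, include_query_plans=False):
    """Return a copy of `row` with plan columns dropped and long strings truncated."""
    out = {}
    for col, val in row.items():
        if col in PLAN_COLUMNS and not include_query_plans:
            continue  # exclude verbose XML plan columns by default
        if isinstance(val, str) and not verbose and len(val) > MAX_STRING:
            val = val[:MAX_STRING] + "…"  # truncate long strings
        out[col] = val
    return out

row = {"QueryText": "SELECT ...", "QueryPlan": "<ShowPlanXML>...</ShowPlanXML>"}
print(shrink_row(row))                             # plan column removed
print(shrink_row(row, include_query_plans=True))   # plan column kept
```

Passing verbose and includeQueryPlans restores the full, untruncated result at the cost of a much larger response.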
First Responder Kit

Install from: github.com/BrentOzarULTD/SQL-Server-First-Responder-Kit

| Tool | Description |
| --- | --- |
| sp_blitz | Overall SQL Server health check — prioritized findings for performance, configuration, and security. |
| sp_blitzfirst | Real-time performance diagnostics — samples DMVs over an interval for waits, file latency, and perfmon counters. |
| sp_blitzcache | Plan cache analysis — top queries by CPU, reads, duration, executions, or memory grants. |
| sp_blitzindex | Index analysis — missing, unused, and duplicate indexes with usage patterns. |
| sp_blitzwho | Active query monitor — what's running, blocking info, tempdb usage, query plans. |
| sp_blitzlock | Deadlock analysis from the system_health Extended Events session. |
| — | Cross-server query plan comparison — captures a plan snapshot on one server and compares it to the cached plan on a second server without using linked servers. Requires the demon_hunters branch until merged to main. |
DarlingData

Install from: github.com/erikdarling/DarlingData

| Tool | Description |
| --- | --- |
| sp_pressuredetector | Diagnoses CPU and memory pressure — resource bottlenecks, high-CPU queries, memory grants, disk latency. |
| sp_quickiestore | Query Store analysis — top resource-consuming queries, plan regressions, wait statistics. |
| sp_healthparser | Parses the system_health Extended Events session. |
| sp_loghunter | Searches SQL Server error logs for errors, warnings, and custom messages. |
| sp_humaneventsblockviewer | Analyzes blocking events from a blocked process report Extended Events session. |
| sp_indexcleanup | Finds unused and duplicate indexes that are candidates for removal. |
| sp_queryreprobuilder | Generates reproduction scripts for Query Store queries with parameter values. |
sp_WhoIsActive

Install from: whoisactive.com

| Tool | Description |
| --- | --- |
| sp_whoisactive | Monitors active sessions and queries — wait info, blocking details, tempdb usage, resource consumption. |
Progressive Discovery
When EnableDynamicToolsets is true, only core tools load at startup. Three meta-tools let the AI discover and enable additional toolsets on demand, reducing initial context window usage:
| Tool | Description |
| --- | --- |
| list_toolsets | Lists available toolsets with status (available, enabled, not configured) and tool counts. |
| get_toolset_tools | Returns detailed tool and parameter info for a specific toolset before enabling it. |
| enable_toolset | Enables a toolset, making its tools available. Only works if the admin has enabled the toolset via the corresponding Enable* configuration flag. |
Example flow:

1. AI calls list_toolsets — sees first_responder_kit is "available" (configured but not yet enabled)
2. AI calls get_toolset_tools("first_responder_kit") — reviews the 6 tools and their parameters
3. AI calls enable_toolset("first_responder_kit") — the 6 tools are now registered and usable
4. AI calls sp_blitz — runs the health check as normal
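The gate between "configured by the admin" and "enabled by the AI" can be sketched as a small registry. This is an illustrative model only; the class and its method names are invented and do not mirror the server's internals:

```python
class ToolsetRegistry:
    """Toy model of progressive toolset discovery."""

    def __init__(self, configured):
        # `configured`: toolsets the admin turned on via Enable* config flags
        self._configured = set(configured)
        self._enabled = set()  # toolsets the AI has activated this session

    def list_toolsets(self, known):
        """Report each known toolset's status."""
        def status(name):
            if name in self._enabled:
                return "enabled"
            if name in self._configured:
                return "available"
            return "not configured"
        return {name: status(name) for name in known}

    def enable_toolset(self, name):
        """Activate a toolset; refuse anything the admin has not configured."""
        if name not in self._configured:
            raise PermissionError(f"{name} is not enabled in configuration")
        self._enabled.add(name)

registry = ToolsetRegistry(configured=["first_responder_kit"])
print(registry.list_toolsets(["first_responder_kit", "darling_data"]))
registry.enable_toolset("first_responder_kit")
print(registry.list_toolsets(["first_responder_kit", "darling_data"]))
```

The key design point is that enable_toolset can never exceed the admin's configuration: the AI chooses when to load tools, not which tools are permitted.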
In static mode (EnableDynamicToolsets: false), all enabled toolsets load at startup and the discovery tools are not registered. Schema Exploration and Diagrams toolsets are always loaded regardless of mode.
Known limitation: Progressive discovery relies on the MCP notifications/tools/list_changed notification to inform clients that new tools have been registered. Claude Code does not currently handle this notification (anthropics/claude-code#4118), so dynamically enabled toolsets will not appear. Use static mode (EnableDynamicToolsets: false) when using Claude Code.
Security
Query Validation
Every query is parsed into an Abstract Syntax Tree (AST) using Microsoft's official TSql180Parser and must pass these rules:
Single statement only — multiple statements are rejected
SELECT only — INSERT, UPDATE, DELETE, DROP, EXEC, CREATE, ALTER, and all other statement types are blocked
No SELECT INTO — prevents table creation via SELECT
No external data access — OPENROWSET (all variants including BULK, Cosmos DB, and internal), OPENQUERY, OPENDATASOURCE, OPENXML blocked
No linked servers — four-part name references are rejected
No MAXRECURSION hint — prevents overriding the default recursion limit
Cross-database queries are allowed — three-part names work by design; the security boundary is the server, not the database. To restrict to a single database, limit the login's permissions.
Because validation operates on the parsed AST, it correctly handles edge cases that defeat string-based approaches: keywords inside comments, string literals, nested block comments, and encoding tricks.
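A toy demonstration of the failure mode: a naive keyword blocklist (the string-based approach this project avoids) cannot tell a DELETE statement apart from the word DELETE inside a comment or string literal. The actual server parses with TSql180Parser; this Python sketch only shows why the regex alternative is unreliable:

```python
import re

# A naive keyword blocklist, as used by string-based validators.
BLOCKED = re.compile(r"\b(INSERT|UPDATE|DELETE|DROP|EXEC)\b", re.IGNORECASE)

def naive_check(sql):
    """Accept the query only if no blocked keyword appears anywhere in the text."""
    return BLOCKED.search(sql) is None

# This harmless SELECT mentions DELETE in a comment and DROP in a string
# literal, so the regex checker wrongly rejects it.
harmless = "SELECT name /* not a DELETE */ FROM sys.tables WHERE name <> 'DROP'"
print(naive_check(harmless))  # False (wrongly blocked)

# An AST-based validator parses first: tokens inside comments and string
# literals never become statements, and the statement type itself
# (SelectStatement vs. InsertStatement) is checked directly on the tree.
```

The converse failure is worse: encoding tricks and nesting can smuggle a real write statement past a regex, while a parser classifies it by its actual syntax tree node.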
Parameter Blocking
Diagnostic stored procedures execute via whitelisted procedure names with blocked parameters that prevent writes:
First Responder Kit — all @Output* parameters blocked (prevents writing results to server tables)
DarlingData — logging and output parameters blocked (prevents table creation and data retention)
sp_WhoIsActive — @destination_table, @return_schema, @schema, @help blocked
Rate Limiting
All tool executions are subject to concurrency limiting (MaxConcurrentQueries, default 5) and throughput limiting (MaxQueriesPerMinute, default 60). Excess requests are rejected with a retry message.
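The throughput half of this can be sketched as a token bucket: the bucket holds up to MaxQueriesPerMinute tokens, refills at a steady rate, and each query spends one token. A minimal sketch, not the server's implementation:

```python
import time

class TokenBucket:
    """Minimal token-bucket sketch of the MaxQueriesPerMinute limit."""

    def __init__(self, capacity, refill_per_second):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill = refill_per_second
        self.last = time.monotonic()

    def try_acquire(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller rejects the request with a retry message

# 60 queries per minute = 1 token per second, bursting up to 60.
bucket = TokenBucket(capacity=60, refill_per_second=1.0)
allowed = sum(bucket.try_acquire() for _ in range(100))
print(allowed)  # 60: the burst passes, the remaining 40 are rejected
```

Concurrency limiting is the complementary control: even within the per-minute budget, at most MaxConcurrentQueries statements run at once.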
Connection Security
Use Windows Authentication or Azure Managed Identity where possible to avoid storing credentials in config files. When SQL Authentication is required, use environment variable overrides to inject credentials at runtime. See SECURITY.md for detailed guidance including credential stores and connection string encryption.
Known Risks
This project depends on the official Microsoft MCP C# SDK (ModelContextProtocol NuGet package, version 1.2.0). As the MCP framework handles all protocol I/O, any vulnerability in it directly affects this application's security boundary. Monitor the package for updates and upgrade when new versions are released.

The data returned from a SQL Server query could include malicious prompt injection targeting AIs. This is a risk of all AI use and cannot be mitigated by this project. Ensure you're following best practices for AI security and only connecting to trusted data sources.
Contributing
Contributions are welcome. See CONTRIBUTING.md for architecture details, development setup, testing instructions, and guidelines for adding new tools.
License