
DBHub

by bytebase

Note


Brought to you by Bytebase, an open-source database DevSecOps platform.


DBHub is a universal database gateway implementing the Model Context Protocol (MCP) server interface. This gateway allows MCP-compatible clients to connect to and explore different databases.

+------------------+    +--------------+    +------------------+
|                  |    |              |    |                  |
|  Claude Desktop  +--->+              +--->+   PostgreSQL     |
|                  |    |              |    |                  |
|  Claude Code     +--->+              +--->+   SQL Server     |
|                  |    |              |    |                  |
|  Cursor          +--->+    DBHub     +--->+   SQLite         |
|                  |    |              |    |                  |
|  Other Clients   +--->+              +--->+   MySQL          |
|                  |    |              |    |                  |
|                  |    |              +--->+   MariaDB        |
|                  |    |              |    |                  |
+------------------+    +--------------+    +------------------+
     MCP Clients           MCP Server            Databases

Demo HTTP Endpoint

https://demo.dbhub.ai/message connects to a sample employee database. You can point Cursor or the MCP Inspector at it to see it in action.

mcp-inspector
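If your MCP client accepts a plain URL entry, a minimal config sketch for the demo endpoint could look like the following (the server name dbhub-demo and the url-style entry are illustrative; check your client's documentation for the exact schema):

{
  "mcpServers": {
    "dbhub-demo": {
      "url": "https://demo.dbhub.ai/message"
    }
  }
}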

Supported Matrix

Database Resources

| Resource Name | URI Format | PostgreSQL | MySQL | MariaDB | SQL Server | SQLite |
|---|---|---|---|---|---|---|
| schemas | db://schemas | ✅ | ✅ | ✅ | ✅ | ✅ |
| tables_in_schema | db://schemas/{schemaName}/tables | ✅ | ✅ | ✅ | ✅ | ✅ |
| table_structure_in_schema | db://schemas/{schemaName}/tables/{tableName} | ✅ | ✅ | ✅ | ✅ | ✅ |
| indexes_in_table | db://schemas/{schemaName}/tables/{tableName}/indexes | ✅ | ✅ | ✅ | ✅ | ✅ |
| procedures_in_schema | db://schemas/{schemaName}/procedures | ✅ | ✅ | ✅ | ✅ | ❌ |
| procedure_details_in_schema | db://schemas/{schemaName}/procedures/{procedureName} | ✅ | ✅ | ✅ | ✅ | ❌ |
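To illustrate how these resources are consumed, this is the standard MCP JSON-RPC request a client sends to read the schemas resource (the method and params shape come from the MCP specification; the returned contents depend on the connected database):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "resources/read",
  "params": { "uri": "db://schemas" }
}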

Database Tools

| Tool | Command Name | Description | PostgreSQL | MySQL | MariaDB | SQL Server | SQLite |
|---|---|---|---|---|---|---|---|
| Execute SQL | execute_sql | Execute single or multiple SQL statements (separated by semicolons) | ✅ | ✅ | ✅ | ✅ | ✅ |
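As a sketch, an MCP client invokes the tool with a standard tools/call request roughly like the one below. The argument key sql is an assumption for illustration; check the input schema the server reports for the tool:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "execute_sql",
    // "sql" is an assumed argument name; inspect the tool's inputSchema for the real one
    "arguments": { "sql": "SELECT 1; SELECT 2;" }
  }
}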

Prompt Capabilities

| Prompt | Command Name | PostgreSQL | MySQL | MariaDB | SQL Server | SQLite |
|---|---|---|---|---|---|---|
| Generate SQL | generate_sql | ✅ | ✅ | ✅ | ✅ | ✅ |
| Explain DB Elements | explain_db | ✅ | ✅ | ✅ | ✅ | ✅ |
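Clients discover these prompts with the standard MCP prompts/list request (and then fetch a specific one, such as generate_sql, via prompts/get):

{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "prompts/list"
}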

Installation

Docker

# PostgreSQL example
docker run --rm --init \
  --name dbhub \
  --publish 8080:8080 \
  bytebase/dbhub \
  --transport http \
  --port 8080 \
  --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"

# Demo mode with sqlite sample employee database
docker run --rm --init \
  --name dbhub \
  --publish 8080:8080 \
  bytebase/dbhub \
  --transport http \
  --port 8080 \
  --demo

NPM

# PostgreSQL example
npx @bytebase/dbhub --transport http --port 8080 --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"

# Demo mode with sqlite sample employee database
npx @bytebase/dbhub --transport http --port 8080 --demo

Note: The demo mode includes a bundled SQLite sample "employee" database with tables for employees, departments, salaries, and more.

Claude Desktop

claude-desktop

  • Claude Desktop only supports the stdio transport; see https://github.com/orgs/modelcontextprotocol/discussions/16
// claude_desktop_config.json
{
  "mcpServers": {
    "dbhub-postgres-docker": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "bytebase/dbhub",
        "--transport",
        "stdio",
        "--dsn",
        // Use host.docker.internal as the host if connecting to the local db
        "postgres://user:password@host.docker.internal:5432/dbname?sslmode=disable"
      ]
    },
    "dbhub-postgres-npx": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub",
        "--transport",
        "stdio",
        "--dsn",
        "postgres://user:password@localhost:5432/dbname?sslmode=disable"
      ]
    },
    "dbhub-demo": {
      "command": "npx",
      "args": ["-y", "@bytebase/dbhub", "--transport", "stdio", "--demo"]
    }
  }
}

Claude Code

See https://docs.anthropic.com/en/docs/claude-code/mcp for how to register MCP servers with Claude Code.
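As a sketch, assuming the claude mcp add command described in those docs, registering DBHub could look like:

claude mcp add dbhub -- npx -y @bytebase/dbhub --transport stdio --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"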

Cursor

cursor://anysphere.cursor-deeplink/mcp/install?name=dbhub&config=eyJjb21tYW5kIjoibnB4IEBieXRlYmFzZS9kYmh1YiIsImVudiI6eyJUUkFOU1BPUlQiOiJzdGRpbyIsIkRTTiI6InBvc3RncmVzOi8vdXNlcjpwYXNzd29yZEBsb2NhbGhvc3Q6NTQzMi9kYm5hbWU%2Fc3NsbW9kZT1kaXNhYmxlIiwiUkVBRE9OTFkiOiJ0cnVlIn19

cursor

Usage

Read-only Mode

You can run DBHub in read-only mode, which restricts SQL query execution to read-only operations:

# Enable read-only mode
npx @bytebase/dbhub --readonly --dsn "postgres://user:password@localhost:5432/dbname"

In read-only mode, only read-only SQL statements (for example, SELECT queries) are allowed.

This provides an additional layer of security when connecting to production databases.

SSL Connections

You can specify the SSL mode using the sslmode parameter in your DSN string:

| Database | sslmode=disable | sslmode=require | Default SSL Behavior |
|---|---|---|---|
| PostgreSQL | ✅ | ✅ | Certificate verification |
| MySQL | ✅ | ✅ | Certificate verification |
| MariaDB | ✅ | ✅ | Certificate verification |
| SQL Server | ✅ | ✅ | Certificate verification |
| SQLite | N/A | N/A | N/A (file-based) |

SSL Mode Options:

  • sslmode=disable: All SSL/TLS encryption is turned off. Data is transmitted in plaintext.
  • sslmode=require: The connection is encrypted, but the server's certificate is not verified. This protects against packet sniffing but not against man-in-the-middle attacks. You may use this when the server presents a trusted self-signed certificate.

Without specifying sslmode, most databases default to certificate verification, which provides the highest level of security.

Example usage:

# Disable SSL
postgres://user:password@localhost:5432/dbname?sslmode=disable

# Require SSL without certificate verification
postgres://user:password@localhost:5432/dbname?sslmode=require

# Standard SSL with certificate verification (default)
postgres://user:password@localhost:5432/dbname

SSH Tunnel Support

DBHub supports connecting to databases through SSH tunnels, enabling secure access to databases in private networks or behind firewalls.

DBHub can read SSH connection settings from your ~/.ssh/config file. Simply use the host alias from your SSH config:

# If you have this in ~/.ssh/config:
# Host mybastion
#   HostName bastion.example.com
#   User ubuntu
#   IdentityFile ~/.ssh/id_rsa

npx @bytebase/dbhub \
  --dsn "postgres://dbuser:dbpass@database.internal:5432/mydb" \
  --ssh-host mybastion

DBHub will automatically use the settings from your SSH config, including hostname, user, port, and identity file. If no identity file is specified in the config, DBHub will try common default locations (~/.ssh/id_rsa, ~/.ssh/id_ed25519, etc.).

SSH with Password Authentication

npx @bytebase/dbhub \
  --dsn "postgres://dbuser:dbpass@database.internal:5432/mydb" \
  --ssh-host bastion.example.com \
  --ssh-user ubuntu \
  --ssh-password mypassword

SSH with Private Key Authentication

npx @bytebase/dbhub \
  --dsn "postgres://dbuser:dbpass@database.internal:5432/mydb" \
  --ssh-host bastion.example.com \
  --ssh-user ubuntu \
  --ssh-key ~/.ssh/id_rsa

SSH with Private Key and Passphrase

npx @bytebase/dbhub \
  --dsn "postgres://dbuser:dbpass@database.internal:5432/mydb" \
  --ssh-host bastion.example.com \
  --ssh-port 2222 \
  --ssh-user ubuntu \
  --ssh-key ~/.ssh/id_rsa \
  --ssh-passphrase mykeypassphrase

Using Environment Variables

export SSH_HOST=bastion.example.com
export SSH_USER=ubuntu
export SSH_KEY=~/.ssh/id_rsa
npx @bytebase/dbhub --dsn "postgres://dbuser:dbpass@database.internal:5432/mydb"

Note: When using SSH tunnels, the database host in your DSN should be the hostname/IP as seen from the SSH server (bastion host), not from your local machine.

Configure your database connection

You can use DBHub in demo mode with a sample employee database for testing:

npx @bytebase/dbhub --demo

Warning

If your user/password contains special characters, you need to URL-encode them first (e.g. pass#word should be escaped as pass%23word).
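One quick way to produce the escaped value (a minimal sketch using Node, which npx already requires):

node -e 'console.log(encodeURIComponent("pass#word"))'   # prints pass%23word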

For real databases, a Database Source Name (DSN) is required. You can provide this in several ways:

  • Command line argument (highest priority):
    npx @bytebase/dbhub --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"
  • Environment variable (second priority):
    export DSN="postgres://user:password@localhost:5432/dbname?sslmode=disable"
    npx @bytebase/dbhub
  • Environment file (third priority):
    • For development: Create .env.local with your DSN
    • For production: Create .env with your DSN
    DSN=postgres://user:password@localhost:5432/dbname?sslmode=disable

Warning

When running in Docker, use host.docker.internal instead of localhost to connect to databases running on your host machine. For example: mysql://user:password@host.docker.internal:3306/dbname

DBHub supports the following database connection string formats:

| Database | DSN Format | Example |
|---|---|---|
| MySQL | mysql://[user]:[password]@[host]:[port]/[database] | mysql://user:password@localhost:3306/dbname?sslmode=disable |
| MariaDB | mariadb://[user]:[password]@[host]:[port]/[database] | mariadb://user:password@localhost:3306/dbname?sslmode=disable |
| PostgreSQL | postgres://[user]:[password]@[host]:[port]/[database] | postgres://user:password@localhost:5432/dbname?sslmode=disable |
| SQL Server | sqlserver://[user]:[password]@[host]:[port]/[database] | sqlserver://user:password@localhost:1433/dbname?sslmode=disable |
| SQLite | sqlite:///[path/to/file] or sqlite:///:memory: | sqlite:///path/to/database.db, sqlite:C:/Users/YourName/data/database.db (Windows), or sqlite:///:memory: |
SQL Server

Extra query parameters:

authentication
  • authentication=azure-active-directory-access-token. Only applicable when running from Azure. See DefaultAzureCredential.
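For illustration only, a hypothetical DSN using this parameter might look like the following; the exact user/password fields depend on how your Azure identity is provisioned:

sqlserver://yourserver.database.windows.net:1433/dbname?authentication=azure-active-directory-access-token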

Transport

  • stdio (default) - for direct integration with tools like Claude Desktop:
    npx @bytebase/dbhub --transport stdio --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"
  • http - for browser and network clients:
    npx @bytebase/dbhub --transport http --port 5678 --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"

Command line options

| Option | Environment Variable | Description | Default |
|---|---|---|---|
| --dsn | DSN | Database connection string | Required if not in demo mode |
| --transport | TRANSPORT | Transport mode: stdio or http | stdio |
| --port | PORT | HTTP server port (only applicable when using --transport=http) | 8080 |
| --readonly | READONLY | Restrict SQL execution to read-only operations | false |
| --demo | N/A | Run in demo mode with sample employee database | false |
| --ssh-host | SSH_HOST | SSH server hostname for tunnel connection | N/A |
| --ssh-port | SSH_PORT | SSH server port | 22 |
| --ssh-user | SSH_USER | SSH username | N/A |
| --ssh-password | SSH_PASSWORD | SSH password (for password authentication) | N/A |
| --ssh-key | SSH_KEY | Path to SSH private key file | N/A |
| --ssh-passphrase | SSH_PASSPHRASE | Passphrase for SSH private key | N/A |
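Because each flag has an environment-variable counterpart, the HTTP example from the Installation section can also be expressed entirely through the environment, for instance:

export DSN="postgres://user:password@localhost:5432/dbname?sslmode=disable"
export TRANSPORT=http
export PORT=8080
export READONLY=true
npx @bytebase/dbhub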

The demo mode uses an in-memory SQLite database loaded with the sample employee database that includes tables for employees, departments, titles, salaries, department employees, and department managers. The sample database includes SQL scripts for table creation, data loading, and testing.
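For example, once the demo server is running you could send a simple query through the execute_sql tool; the table name below is taken from the sample schema described above, and the exact columns may differ:

SELECT * FROM employees LIMIT 5;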

Development

  1. Install dependencies:
    pnpm install
  2. Run in development mode:
    pnpm dev
  3. Build for production:
    pnpm build
    pnpm start --transport stdio --dsn "postgres://user:password@localhost:5432/dbname?sslmode=disable"

Testing

The project uses Vitest for comprehensive unit and integration testing:

  • Run all tests: pnpm test
  • Run tests in watch mode: pnpm test:watch
  • Run integration tests: pnpm test:integration
Integration Tests

DBHub includes comprehensive integration tests for all supported database connectors using Testcontainers. These tests run against real database instances in Docker containers, ensuring full compatibility and feature coverage.

Prerequisites
  • Docker: Ensure Docker is installed and running on your machine
  • Docker Resources: Allocate sufficient memory (recommended: 4GB+) for multiple database containers
  • Network Access: Ability to pull Docker images from registries
Running Integration Tests

Note: This command runs all integration tests in parallel, which may take 5-15 minutes depending on your system resources and network speed.

# Run all database integration tests
pnpm test:integration

# Run only PostgreSQL integration tests
pnpm test src/connectors/__tests__/postgres.integration.test.ts

# Run only MySQL integration tests
pnpm test src/connectors/__tests__/mysql.integration.test.ts

# Run only MariaDB integration tests
pnpm test src/connectors/__tests__/mariadb.integration.test.ts

# Run only SQL Server integration tests
pnpm test src/connectors/__tests__/sqlserver.integration.test.ts

# Run only SQLite integration tests
pnpm test src/connectors/__tests__/sqlite.integration.test.ts

# Run JSON RPC integration tests
pnpm test src/__tests__/json-rpc-integration.test.ts

All integration tests follow these patterns:

  1. Container Lifecycle: Start database container → Connect → Setup test data → Run tests → Cleanup
  2. Shared Test Utilities: Common test patterns implemented in IntegrationTestBase class
  3. Database-Specific Features: Each database includes tests for unique features and capabilities
  4. Error Handling: Comprehensive testing of connection errors, invalid SQL, and edge cases
Troubleshooting Integration Tests

Container Startup Issues:

# Check Docker is running
docker ps

# Check Docker disk usage
docker system df

# Pull images manually if needed
docker pull postgres:15-alpine
docker pull mysql:8.0
docker pull mariadb:10.11
docker pull mcr.microsoft.com/mssql/server:2019-latest

SQL Server Timeout Issues:

  • SQL Server containers require significant startup time (3-5 minutes)
  • Ensure Docker has sufficient memory allocated (4GB+ recommended)
  • Consider running SQL Server tests separately if experiencing timeouts

Network/Resource Issues:

# Run tests with verbose output
pnpm test:integration --reporter=verbose

# Run single database test to isolate issues
pnpm test:integration -- --testNamePattern="PostgreSQL"

# Check Docker container logs if tests fail
docker logs <container_id>
Pre-commit Hooks (for Developers)

The project includes pre-commit hooks to run tests automatically before each commit:

  1. After cloning the repository, set up the pre-commit hooks:
    ./scripts/setup-husky.sh
  2. This ensures the test suite runs automatically whenever you create a commit, preventing commits that would break tests.

Debug with MCP Inspector

stdio
# PostgreSQL example
TRANSPORT=stdio DSN="postgres://user:password@localhost:5432/dbname?sslmode=disable" npx @modelcontextprotocol/inspector node /path/to/dbhub/dist/index.js
HTTP
# Start DBHub with HTTP transport
pnpm dev --transport=http --port=8080

# Start the MCP Inspector in another terminal
npx @modelcontextprotocol/inspector

Then connect the MCP Inspector to the DBHub server's /message endpoint.

Contributors

Star History


