ibmi-mcp-server
MCP server for IBM i
✨ Key Features
| Feature Area | Description |
| --- | --- |
| 🔌 MCP Server | A functional server with example tools and resources. Supports `stdio` and a Streamable HTTP transport built with Hono. |
| 🔭 Observability | Built-in OpenTelemetry for distributed tracing and metrics. Auto-instrumentation for core modules and custom tracing for all tool executions. |
| 🚀 Production Utilities | Logging, Error Handling, ID Generation, Rate Limiting, Request Context tracking, Input Sanitization. |
| 🔒 Type Safety/Security | Strong type checking via TypeScript & Zod validation. Built-in security utilities (sanitization, auth middleware for HTTP). |
| ⚙️ Error Handling | Consistent error categorization, detailed logging, and centralized handling. |
| 📚 Documentation | Comprehensive documentation, structured JSDoc comments, API references. |
| 🕵️ Interaction Logging | Captures raw requests and responses for all external LLM provider interactions to a dedicated log file for full traceability. |
| 🤖 Agent Ready | Includes a developer cheatsheet tailored for LLM coding agents. |
| 🛠️ Utility Scripts | Scripts for cleaning builds, setting executable permissions, generating directory trees, and fetching OpenAPI specs. |
| 🧩 Services | Reusable modules for LLM (OpenRouter) and data storage (DuckDB) integration, with examples. |
| 🧪 Integration Testing | Integrated with Vitest for fast and reliable integration testing. Includes example tests for core logic and a coverage reporter. |
| ⏱️ Performance Metrics | Built-in utility to automatically measure and log the execution time and payload size of every tool call. |
Quick Start
1. Installation
Clone the repository and install dependencies:
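For example (the repository URL is assumed from the project name; use the URL you actually cloned from):

```shell
git clone https://github.com/IBM/ibmi-mcp-server.git
cd ibmi-mcp-server
npm install
```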
2. Build the Project
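Compile the TypeScript sources with the standard npm script:

```shell
npm run build
```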
3. Create Server .env File
Fill out the Db2 for i connection details in the `.env` file.
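A minimal `.env` might look like the following sketch (values are placeholders; variable names follow the Configuration section):

```
DB2i_HOST=<your-ibmi-host>
DB2i_USER=<user-profile>
DB2i_PASS=<password>
DB2i_PORT=<mapepire-port>
TOOLS_YAML_PATH=prebuiltconfigs
```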
See more on configuration options in the Configuration section.
4. Running the Server
Via Stdio (default):

```shell
npm run start:stdio
```

Via Streamable HTTP:

```shell
npm run start:http
```

By default, the server registers the SQL tools stored in the `prebuiltconfigs` directory. This path is set in the `.env` file (`TOOLS_YAML_PATH`). You can override the SQL tools path using the `--tools <path>` CLI option:

```shell
npm run start:http -- --tools <path>
```

The transport can be forced with `--transport <type>`:

```shell
npm run start:http -- --transport http # or stdio
```
5. Run Example Agent
Make sure that the server is running in `http` mode. Then, in another terminal, navigate to the `tests/agents` directory and follow the setup instructions in the README to run the example agent and the example scripts.
6. Running Tests
This template uses Vitest for testing, with a strong emphasis on integration testing to ensure all components work together correctly.
Run all tests once:

```shell
npm test
```

Run tests in watch mode:

```shell
npm run test:watch
```

Run tests and generate a coverage report:

```shell
npm run test:coverage
```
⚙️ Configuration
Configure the server using these environment variables (or a `.env` file):

| Variable | Description | Default |
| --- | --- | --- |
| `MCP_TRANSPORT_TYPE` | Server transport: `stdio` or `http`. | `stdio` |
| `MCP_SESSION_MODE` | Session mode for HTTP: `auto`, `stateful`, or `stateless`. | |
| | Port for the HTTP server. | `3010` |
| | Host address for the HTTP server. | |
| `MCP_ALLOWED_ORIGINS` | Comma-separated allowed origins for CORS. | (none) |
| `MCP_AUTH_MODE` | Authentication mode for HTTP (e.g., `ibmi`). | |
| | Secret key (min 32 chars) for signing/verifying auth tokens; required for token-based auth modes. | (none; MUST be set in production) |
| | The issuer URL of your authorization server; required for OAuth-based auth. | (none) |
| | The audience identifier for this MCP server; required for OAuth-based auth. | (none) |
| | API key for the OpenRouter.ai service. | (none) |
| | Set to `true` to enable OpenTelemetry instrumentation. | |
| | The OTLP endpoint for exporting traces. | (none; logs to file) |
| | The OTLP endpoint for exporting metrics. | (none) |
| `TOOLS_YAML_PATH` | Path to YAML tool definitions (file or directory). Supports directories or globs. | (none) |
| | When merging multiple YAML files, merge arrays instead of replacing them. | |
| | Allow duplicate tool names across merged YAML files. | |
| | Allow duplicate source names across merged YAML files. | |
| | Validate the merged YAML configuration before use. | |
| `YAML_AUTO_RELOAD` | Enable automatic reloading of YAML tools when configuration files change. | |
| | Comma-separated list of toolset names to load/filter tools (overrides full load). | (none) |
| `DB2i_HOST` | IBM i Db2 for i host (Mapepire daemon or gateway host). | (none) |
| `DB2i_USER` | IBM i user profile for Db2 for i connections. | (none) |
| `DB2i_PASS` | Password for the IBM i user profile. | (none) |
| `DB2i_PORT` | Port for the Mapepire daemon/gateway used for Db2 for i. | |
| | If `true`, skip TLS certificate verification for Mapepire (self-signed certs, etc.). | |
| `IBMI_HTTP_AUTH_ENABLED` | Enable the IBM i HTTP authentication endpoints; required for IBM i token auth. | |
| `IBMI_AUTH_ALLOW_HTTP` | Allow plain-HTTP requests for authentication (development only; use HTTPS in production). | |
| | Default token lifetime in seconds for IBM i authentication tokens. | `3600` (1 hour) |
| | How often to clean up expired tokens, in seconds. | `300` (5 minutes) |
| `IBMI_AUTH_MAX_CONCURRENT_SESSIONS` | Maximum number of concurrent authenticated sessions allowed. | |
To set the server environment variables, create a `.env` file in the root of this project, then edit it with your IBM i connection details.
IBM i HTTP Authentication (Beta)
The server supports IBM i HTTP authentication that allows clients to obtain access tokens for authenticated SQL tool execution. This enables per-user connection pooling and secure access to IBM i resources.
Authentication Flow
1. Client Authentication: Clients authenticate with IBM i credentials via HTTP Basic Auth
2. Token Generation: The server creates a secure Bearer token and establishes a dedicated connection pool
3. Tool Execution: Subsequent tool calls use the Bearer token for authenticated execution
4. Pool Management: Each token maintains its own connection pool for isolation and security
Configuration
To enable IBM i HTTP authentication, you need to set up encryption keys and configure the server environment. To protect IBM i credentials during transmission, the authentication flow uses RSA and AES encryption, so you must generate an RSA keypair for the server. Then create or update your `.env` file with the IBM i authentication settings described in the Configuration section.
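One way to generate the keypair is with OpenSSL (the `secrets/` paths are illustrative; point them wherever your server configuration expects the keys):

```shell
# Generate a 2048-bit RSA keypair for the authentication flow.
# Key locations are assumptions; adjust to your server's configuration.
mkdir -p secrets
openssl genrsa -out secrets/private.pem 2048
openssl rsa -in secrets/private.pem -pubout -out secrets/public.pem
```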
Getting Access Tokens
Option 1: Using the Token Script (Recommended)
Use the included `get-access-token.js` script to obtain authentication tokens. The script automatically:

- Loads IBM i credentials from `.env`, with CLI fallback
- Fetches the server's public key
- Encrypts credentials client-side
- Requests an access token
- Sets the `IBMI_MCP_ACCESS_TOKEN` environment variable
- Provides copy-paste export commands
Sequence Overview
Client Integration
Once you have a token, use it in your MCP client to authenticate requests:
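For example, with a raw HTTP client the token travels in the `Authorization` header (the `/mcp` path and port are assumptions; adjust to your deployment, and note that `tools/list` is a standard MCP method):

```shell
curl -X POST http://localhost:3010/mcp \
  -H "Authorization: Bearer $IBMI_MCP_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```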
Security Considerations
Development Environment:

- `IBMI_AUTH_ALLOW_HTTP=true` allows HTTP for testing
- Use localhost/trusted networks only
- Use shorter token lifetimes for testing

Production Environment:

- `IBMI_AUTH_ALLOW_HTTP=false` enforces HTTPS
- Use proper TLS certificates
- Use longer token lifetimes for stability
- Apply network security and access controls
- Monitor `IBMI_AUTH_MAX_CONCURRENT_SESSIONS` for resource usage
Authentication Endpoints
When enabled (`IBMI_HTTP_AUTH_ENABLED=true`), the server provides these endpoints:

| Endpoint | Method | Description |
| --- | --- | --- |
| | POST | Authenticate with IBM i credentials and receive a Bearer token |
SQL Tool Configuration
The primary way to configure tools used by this MCP server is through `tools.yaml` files (see `prebuiltconfigs/` for examples). There are three main sections in each YAML file: `sources`, `tools`, and `toolsets`. Below is a breakdown of each section.
Sources
The `sources` section of your `tools.yaml` defines the data sources the MCP server has access to.

The environment variables `DB2i_HOST`, `DB2i_USER`, `DB2i_PASS`, and `DB2i_PORT` can be set in the server `.env` file; see Configuration.
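A sources entry might look roughly like the following sketch (field names are illustrative, not the authoritative schema; check `prebuiltconfigs/` for real examples):

```yaml
sources:
  ibmi-system:
    host: ${DB2i_HOST}
    user: ${DB2i_USER}
    password: ${DB2i_PASS}
    port: ${DB2i_PORT}
```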
Tools
The `tools` section of your `tools.yaml` defines the actions your agent can take: what kind of tool it is, which source(s) it affects, what parameters it uses, and so on.
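For flavor, a SQL tool definition might look like this sketch (field names are assumptions; `QSYS2.SYSTEM_STATUS` is a standard Db2 for i service used here as an example):

```yaml
tools:
  system_status:
    source: ibmi-system
    description: Return current system utilization metrics
    statement: SELECT * FROM TABLE(QSYS2.SYSTEM_STATUS()) X
```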
Toolsets
The `toolsets` section of your `tools.yaml` allows you to define groups of tools that you want to load together. This can be useful for defining different sets for different agents or different applications.
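A toolset simply groups tool names, along these lines (structure is illustrative; see `prebuiltconfigs/` for the actual shape):

```yaml
toolsets:
  performance:
    tools:
      - system_status
```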
More documentation on SQL tools coming soon!
Running the Server (Development)
The server supports multiple transport modes and session configurations for different development scenarios. Use the appropriate startup command based on your needs.
Transport Modes
HTTP Transport (Recommended for Development)
Stdio Transport (for CLI tools and MCP Inspector)
Session Modes (HTTP Only)
The `MCP_SESSION_MODE` environment variable controls how the HTTP server handles client sessions:

- `auto`: Automatically detects client capabilities and uses the best session mode
- `stateful`: Maintains persistent sessions with connection state
- `stateless`: Each request is independent; no session state is maintained
CLI Options
Both transport modes support these command-line options:
Note: CLI arguments override corresponding settings in the `.env` file when provided.

| Option | Short | Description | Example |
| --- | --- | --- | --- |
| `--tools <path>` | | Override the YAML tools configuration path (overrides `TOOLS_YAML_PATH`) | `npm run start:http -- --tools <path>` |
| | | Load only specific toolsets (comma-separated) | |
| `--transport <type>` | | Force transport type (`http` or `stdio`) (overrides `MCP_TRANSPORT_TYPE`) | `npm run start:http -- --transport http` |
| | | Show help information | |
| | | List available toolsets from YAML configuration | |
Common Development Scenarios
1. Standard Development Server
2. Custom Tools Path
3. Specific Toolsets Only
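The commands for these scenarios might look like the following (the `--toolsets` flag spelling and the toolset name are assumptions; `--tools` and the npm scripts are documented above):

```shell
# 1. Standard development server (HTTP transport)
npm run start:http

# 2. Custom tools path
npm run start:http -- --tools <path>

# 3. Specific toolsets only (hypothetical flag and toolset name)
npm run start:http -- --toolsets performance
```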
Development Tips
- Hot Reloading: Enable `YAML_AUTO_RELOAD=true` in `.env` for automatic tool configuration updates
- Verbose Logging: Set `MCP_LOG_LEVEL=debug` for detailed operation logs
- CORS: Configure `MCP_ALLOWED_ORIGINS` for web-based clients
- Authentication: Use `MCP_AUTH_MODE=ibmi` with IBM i HTTP auth for token-based access
Troubleshooting
Port Already in Use
Tools Not Loading
MCP Inspector
The MCP Inspector is a tool for exploring and debugging the MCP server's capabilities. It provides a user-friendly interface for interacting with the server, viewing available tools, and testing queries.
Here are the steps to run the MCP Inspector:
1. Make sure to build the server:

```shell
cd ibmi-mcp-server/
npm run build
```

2. Create an `mcp.json` file:

```shell
cp template_mcp.json mcp.json
```

3. Fill out the connection details in `mcp.json` with your IBM i system information. You should use the same credentials as in your `.env` file:

```json
{
  "mcpServers": {
    "default-server": {
      "command": "node",
      "args": ["dist/index.js"],
      "env": {
        "TOOLS_YAML_PATH": "prebuiltconfigs",
        "NODE_OPTIONS": "--no-deprecation",
        "DB2i_HOST": "<DB2i_HOST>",
        "DB2i_USER": "<DB2i_USER>",
        "DB2i_PASS": "<DB2i_PASS>",
        "DB2i_PORT": "<DB2i_PORT>",
        "MCP_TRANSPORT_TYPE": "stdio"
      }
    }
  }
}
```

4. Start the MCP Inspector:

```shell
npm run mcp-inspector
```

5. Click on the URL displayed in the terminal to open the MCP Inspector in your web browser:

```
Starting MCP inspector...
⚙️ Proxy server listening on 127.0.0.1:6277
🔑 Session token: EXAMPLE_TOKEN
Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
🔗 Open inspector with token pre-filled:
http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=EXAMPLE_TOKEN
🔍 MCP Inspector is up and running at http://127.0.0.1:6274 🚀
```
Use the MCP Inspector to explore and test your MCP server's capabilities:

- View available tools and their parameters
- Test queries against the server
- Debug issues with tool execution
Docker & Podman Deployment
The project includes a comprehensive `docker-compose.yml` that sets up the complete MCP gateway with the IBM i MCP Server.
ContextForge MCP Gateway is a feature-rich gateway, proxy and MCP Registry that federates MCP and REST services - unifying discovery, auth, rate-limiting, observability, virtual servers, multi-transport protocols, and an optional Admin UI into one clean endpoint for your AI clients.
Read more about it here.
Prerequisites
Choose one of the following container platforms:
Docker
Docker Desktop (macOS/Windows): Download here
Docker Engine (Linux): Installation guide
Podman (Alternative to Docker)
Podman Desktop (macOS/Windows): Download here
Podman CLI (Linux): Installation guide
podman-compose: `pip install podman-compose`
Build MCP Gateway Image
The `docker-compose.yml` uses a local build of the MCP Gateway image. To build it, clone the MCP Gateway repository and build the image. This creates a local image named `localhost/mcpgateway/mcpgateway` that the `docker-compose.yml` can use. More details on building the MCP Gateway image can be found in the MCP Gateway docs.
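The steps above might look like the following sketch (the repository URL and tag are assumptions; defer to the MCP Gateway docs for the authoritative build instructions):

```shell
git clone https://github.com/IBM/mcp-context-forge.git
cd mcp-context-forge
docker build -t localhost/mcpgateway/mcpgateway .
```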
Configure MCP environment
Create a `.env` file in the `ibmi-mcp-server` directory with your IBM i connection details, and make sure the IBM i connection and authentication variables described in the Configuration section are set in your `.env` file.
Note: You need to generate an RSA keypair for the server if you haven't already done so. See the IBM i HTTP Authentication section for instructions.
Once you have your .env
file configured, you can start the complete stack using Docker or Podman.
Quick Start with Docker
Start the complete stack:

```shell
# Start all services in background
docker-compose up -d

# Or start specific services
docker-compose up -d gateway ibmi-mcp-server postgres redis
```

Verify services are running:

```shell
docker-compose ps
```
Quick Start with Podman
Start the complete stack:

```shell
# Start all services in background
podman compose up -d

# Or start specific services
podman compose up -d gateway ibmi-mcp-server postgres redis
```

Verify services are running:

```shell
podman compose ps
```
Container Architecture
The docker-compose setup includes these services:
| Service | Port | Description | Access URL |
| --- | --- | --- | --- |
| gateway | 4444 | MCP Context Forge main API | http://localhost:4444 |
| ibmi-mcp-server | 3010 | IBM i SQL tools MCP server | http://localhost:3010 |
| postgres | - | PostgreSQL database (internal) | - |
| redis | 6379 | Cache service | redis://localhost:6379 |
| pgadmin | 5050 | Database admin UI | http://localhost:5050 |
| redis_insight | 5540 | Cache admin UI | http://localhost:5540 |
🔧 Service Management
Start Services
Stop Services
View Logs
Rebuild Services
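With Docker Compose, the management tasks above map to the standard commands (substitute `podman compose` if you use Podman; the service name is one from the table above):

```shell
docker-compose up -d                           # start services
docker-compose stop                            # stop services
docker-compose down                            # stop and remove containers
docker-compose logs -f ibmi-mcp-server         # follow logs for one service
docker-compose up -d --build ibmi-mcp-server   # rebuild and restart a service
```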
MCP Gateway UI:
After the containers are up and running, you can access the MCP Context Forge UI at http://localhost:4444.
Enter the demo credentials:

- User: `admin`
- Password: `changeme`
To configure the IBM i MCP server in the admin UI, navigate to the "Gateways/MCP Servers" tab and enter the MCP server endpoint: `http://ibmi-mcp-server:3010`
Once the MCP server is connected, you can manage the tools provided by the server.
Virtual Server Catalog Demo (Coming soon!)
Architecture Overview
This template is built on a set of architectural principles to ensure modularity, testability, and operational clarity.
- Core Server: The central point where tools and resources are registered. It uses a `ManagedMcpServer` wrapper to provide enhanced introspection capabilities. It acts the same way as the native `McpServer`, but with additional features like introspection and enhanced error handling.
- Transports: The transport layer connects the core server to the outside world. It supports both `stdio` for direct process communication and a streamable Hono-based `http` server.
- "Logic Throws, Handler Catches": This is the immutable cornerstone of the error-handling strategy.
  - Core Logic: This layer is responsible for pure, self-contained business logic. It throws a structured `McpError` on any failure.
  - Handlers: This layer interfaces with the server, invokes the core logic, and catches any errors. It is the exclusive location where errors are processed and formatted into a final response.
- Structured, Traceable Operations: Every operation is traced from initiation to completion via a `RequestContext` that is passed through the entire call stack, ensuring comprehensive and structured logging.
🏗️ Project Structure
- `src/mcp-server/`: Contains the core MCP server, tools, resources, and transport handlers.
- `src/config/`: Handles loading and validation of environment variables.
- `src/services/`: Reusable modules for integrating with external services (DuckDB, OpenRouter).
- `src/types-global/`: Defines shared TypeScript interfaces and type definitions.
- `src/utils/`: Core utilities (logging, error handling, security, etc.).
- `src/index.ts`: The main entry point that initializes and starts the server.
Explore the full structure yourself: see the current file tree in `docs/tree.md`, or generate it dynamically.
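The generation command is presumably one of the utility scripts mentioned earlier (the script name is an assumption; check `package.json` for the exact target):

```shell
npm run tree
```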
🧩 Extending the System
The template enforces a strict, modular pattern for adding new tools and resources, as mandated by the Architectural Standard. The `echoTool` (`src/mcp-server/tools/echoTool/`) serves as the canonical example.
The "Logic Throws, Handler Catches" Pattern
This is the cornerstone of the architecture:
- `logic.ts`: This file contains the pure business logic.
  - It defines the Zod schemas for input and output, which serve as the single source of truth for the tool's data contract.
  - The core logic function is pure: it takes validated parameters and a request context, and either returns a result or throws a structured `McpError`.
  - It never contains `try...catch` blocks for formatting a final response.
- `registration.ts`: This file is the "handler" that connects the logic to the MCP server.
  - It imports the schemas and logic function from `logic.ts`.
  - It calls `server.registerTool()`, providing the tool's metadata and the runtime handler.
  - The runtime handler always wraps the call to the logic function in a `try...catch` block. This is the only place where errors are caught, processed by the `ErrorHandler`, and formatted into a standardized error response.
This pattern ensures that core logic remains decoupled, pure, and easily testable, while the registration layer handles all transport-level concerns, side effects, and response formatting.
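A minimal, dependency-free sketch of the pattern (the real template's `McpError`, Zod schemas, and registration signatures differ; every name here besides the pattern itself is illustrative):

```typescript
// --- logic.ts: pure business logic ---
export class McpError extends Error {
  constructor(public code: string, message: string) {
    super(message);
  }
}

export interface EchoInput {
  message: string;
}

// In the template this validation is a Zod schema; a hand-rolled check
// keeps the sketch self-contained.
export function parseEchoInput(raw: unknown): EchoInput {
  const msg = (raw as { message?: unknown } | null)?.message;
  if (typeof msg !== "string" || msg.length === 0) {
    throw new McpError("VALIDATION_ERROR", "message must be a non-empty string");
  }
  return { message: msg };
}

// Pure: returns a result or throws a structured McpError. No try/catch here.
export function echoLogic(params: EchoInput): { echoed: string } {
  return { echoed: params.message };
}

// --- registration.ts: the handler, the ONLY place errors are caught ---
export function handleEcho(raw: unknown): {
  ok: boolean;
  result?: { echoed: string };
  error?: string;
} {
  try {
    return { ok: true, result: echoLogic(parseEchoInput(raw)) };
  } catch (err) {
    // Centralized: format any thrown error into a standardized response.
    const e = err instanceof McpError ? err : new McpError("INTERNAL_ERROR", String(err));
    return { ok: false, error: `${e.code}: ${e.message}` };
  }
}
```

Because the logic never formats responses and the handler never contains business rules, the logic function can be unit-tested directly while the handler stays a thin, uniform shell.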
🌍 Explore More MCP Resources
Looking for more examples, guides, and pre-built MCP servers? Check out the companion repository:
➡️ cyanheads/model-context-protocol-resources
📜 License
This project is licensed under the Apache License 2.0. See the LICENSE file for details.