IBM i MCP Server

Official server by IBM
.env.example
# ============================================================
# IBM i MCP Server Configuration
# ============================================================

# Transport protocol: "stdio" (local) or "http" (remote)
# Default: http
# For a streamable HTTP MCP connection, use MCP_TRANSPORT_TYPE='http' along with
# IBMI_MCP_SERVER_URL and IBMI_MCP_ACCESS_TOKEN below
MCP_TRANSPORT_TYPE=http

# Required: Access token for IBM i MCP server authentication
IBMI_MCP_ACCESS_TOKEN=your_access_token_here

# Optional: MCP server URL (default: http://127.0.0.1:3010/mcp)
IBMI_MCP_SERVER_URL=http://127.0.0.1:3010/mcp

# ============================================================
# LLM Model Configuration
# ============================================================

# Optional: LLM model to use (default: watsonx/meta-llama/llama-3-3-70b-instruct)
# Supported models:
#   - Gemini: gemini-2.0-flash, gemini-2.0-pro-exp, gemini-1.5-pro
#   - Watsonx: watsonx/meta-llama/llama-3-3-70b-instruct, watsonx/ibm/granite-13b-chat-v2
#   - Ollama: ollama_chat/gpt-oss:20b, ollama_chat/granite4:tiny-h
#   - OpenAI: gpt-4, gpt-3.5-turbo
IBMI_AGENT_MODEL=gemini-2.5-flash

# ============================================================
# Logging Configuration
# ============================================================

# Optional: Logging level (default: INFO)
# Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
IBMI_AGENT_LOG_LEVEL=INFO

# ============================================================
# Google Cloud / Vertex AI Configuration (for Gemini models)
# ============================================================

# Optional: Google Cloud project ID
GOOGLE_CLOUD_PROJECT=your-project-id

# Optional: Path to Google Cloud service account credentials
GOOGLE_APPLICATION_CREDENTIALS=secrets/credentials.json

# Optional: Vertex AI location (default: us-central1)
VERTEX_AI_LOCATION=us-central1

# Optional: Use Vertex AI instead of Google AI (default: FALSE)
GOOGLE_GENAI_USE_VERTEXAI=FALSE

# Optional: Google API key (for Google AI)
GOOGLE_API_KEY=your_google_api_key_here

# ============================================================
# Watsonx Configuration (for Watsonx models)
# ============================================================

# Required for Watsonx models: Watsonx API key
WATSONX_API_KEY=your_watsonx_api_key_here

# Required for Watsonx models: Watsonx project ID
WATSONX_PROJECT_ID=your_watsonx_project_id_here

# Optional: Watsonx API base URL (default: https://us-south.ml.cloud.ibm.com)
WATSONX_API_BASE=https://us-south.ml.cloud.ibm.com

# ============================================================
# Ollama Configuration (for local Ollama models)
# ============================================================

# Optional: Ollama API base URL (default: http://localhost:11434)
OLLAMA_API_BASE=http://localhost:11434

# ============================================================
# Development & Testing
# ============================================================

# Optional: Enable debug mode for tool filtering
DEBUG_TOOL_FILTERING=false

# Optional: Enable LiteLLM debug mode
LITELLM_DEBUG=false

# -----------------------------------------------------------------------------
# 🗄️ IBM i Database Connection for stdio connection
# -----------------------------------------------------------------------------
# Required for SQL tools to connect to IBM i Db2 for i via Mapepire
# MCP_TRANSPORT_TYPE should be 'stdio' along with the variables below

# IBM i system hostname or IP address (REQUIRED)
DB2i_HOST=

# IBM i user profile for database connections (REQUIRED)
DB2i_USER=

# Password for IBM i user profile (REQUIRED)
DB2i_PASS=

# Mapepire daemon/gateway port
# Default: 8076
DB2i_PORT=8076

# Transport protocol: "stdio" (local) or "http" (remote)
# Default: http
MCP_TRANSPORT_TYPE=stdio

# Path to YAML tool configurations (file, directory, or glob pattern)
# Examples:
#   - File: tools/performance.yaml
#   - Directory: tools/ (loads all .yaml/.yml files)
#   - Glob: tools/**/*.yaml
# Default: none (no tools loaded)
TOOLS_YAML_PATH=tools

# Node.js options (for Ollama or other Node.js-based tools)
NODE_OPTIONS=--no-deprecation
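
With MCP_TRANSPORT_TYPE=http, a client only needs IBMI_MCP_SERVER_URL and IBMI_MCP_ACCESS_TOKEN from the file above. The sketch below shows one way to connect from TypeScript with the @modelcontextprotocol/sdk client; the Bearer Authorization header, the file name, and the client name are illustrative assumptions, not something this configuration file prescribes.

// connect-http.ts — minimal sketch of an MCP client for the streamable HTTP mode.
// Assumes the @modelcontextprotocol/sdk TypeScript package; passing
// IBMI_MCP_ACCESS_TOKEN as a Bearer token is an assumption for illustration.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const serverUrl = process.env.IBMI_MCP_SERVER_URL ?? "http://127.0.0.1:3010/mcp";
const token = process.env.IBMI_MCP_ACCESS_TOKEN ?? "";

// Streamable HTTP transport that sends the access token on every request.
const transport = new StreamableHTTPClientTransport(new URL(serverUrl), {
  requestInit: { headers: { Authorization: `Bearer ${token}` } },
});

const client = new Client({ name: "ibmi-mcp-example", version: "0.1.0" });
await client.connect(transport);

// List the SQL tools the server loaded from TOOLS_YAML_PATH.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();

For the stdio mode, set MCP_TRANSPORT_TYPE=stdio and the DB2i_* variables instead, and let the MCP host launch the server process locally.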

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp-server'
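
The same lookup can be scripted; a minimal Node/TypeScript sketch (Node 18+ for the global fetch) is shown below. The response is logged as-is, since its shape is not documented here.

// Fetch this server's entry from the Glama MCP directory API.
const res = await fetch("https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp-server");
if (!res.ok) throw new Error(`MCP directory API returned ${res.status}`);
console.log(await res.json());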

If you have feedback or need assistance with the MCP directory API, please join our Discord server.