
IBM i MCP Server

Official, by IBM

.env.example (3.96 kB)
# IBM i LangChain Agents - Environment Configuration
# Copy this file to .env and fill in your actual values

# =============================================================================
# LLM Provider API Keys
# =============================================================================

# OpenAI API Key (required if using OpenAI models like gpt-4o, gpt-4o-mini)
# Get your API key from: https://platform.openai.com/api-keys
OPENAI_API_KEY=

# Anthropic API Key (required if using Claude models)
# Get your API key from: https://console.anthropic.com/
ANTHROPIC_API_KEY=

# =============================================================================
# IBM i MCP Server Configuration
# =============================================================================

# MCP Server URL - The endpoint where your IBM i MCP server is running
# Default: http://127.0.0.1:3010/mcp
MCP_URL=http://127.0.0.1:3010/mcp

# MCP Transport Type - Communication protocol for MCP server
# Options: streamable_http, stdio
# Default: streamable_http
MCP_TRANSPORT=streamable_http

# =============================================================================
# Model Configuration
# =============================================================================

# Default LLM Model - Specify which model to use by default
#
# Ollama Models (local, no API key needed):
# - ollama:llama3.2 (recommended for local development)
# - ollama:llama3.1:8b
# - ollama:mistral
# - ollama:qwen2.5:32b
# - ollama:gpt-oss:20b
#
# OpenAI Models (requires OPENAI_API_KEY):
# - openai:gpt-4o (most capable)
# - openai:gpt-4o-mini (faster, cheaper)
# - openai:gpt-3.5-turbo
#
# Anthropic Models (requires ANTHROPIC_API_KEY):
# - anthropic:claude-3-7-sonnet-20250219
# - anthropic:claude-3-opus-20240229
# - anthropic:claude-3-sonnet-20240229
#
# Default: ollama:llama3.2 (free, local)
DEFAULT_MODEL=ollama:llama3.2

# =============================================================================
# LangSmith Configuration (Optional - for debugging and tracing)
# =============================================================================

# Enable LangSmith tracing for debugging agent interactions
# Set to "true" to enable, "false" to disable
# Default: false
LANGCHAIN_TRACING_V2=false

# LangSmith API Key (required if LANGCHAIN_TRACING_V2=true)
# Get your API key from: https://smith.langchain.com/
LANGCHAIN_API_KEY=

# LangSmith Project Name - Organize your traces by project
# Default: ibmi-agents
LANGCHAIN_PROJECT=ibmi-agents

# =============================================================================
# Logging Configuration
# =============================================================================

# Enable verbose logging to see detailed agent interactions, tool calls, and responses
# Set to "true" for detailed logs, "false" for minimal output
# Default: true
VERBOSE_LOGGING=true

# =============================================================================
# Security Configuration
# =============================================================================

# Enable human-in-the-loop approval for security operations
# When enabled, non-readonly security tools will require manual approval before execution
# Default: true
ENABLE_HUMAN_IN_LOOP=true

# =============================================================================
# Notes
# =============================================================================
#
# 1. For local development, use Ollama models (no API key required)
#    Install Ollama from: https://ollama.ai
#    Pull a model: ollama pull llama3.2
#
# 2. Ensure the IBM i MCP Server is running before starting agents
#    Check with: curl http://127.0.0.1:3010/mcp
#
# 3. Never commit your .env file with actual API keys to version control
#    The .env file is already in .gitignore
#
# 4. For production use, consider using environment-specific configuration
#    and secure secret management solutions
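As a minimal sketch of how an agent process might consume these variables, the snippet below reads the settings documented in .env.example, falls back to the documented defaults when a variable is unset, and splits the `provider:model` string used by DEFAULT_MODEL. The `load_agent_config` helper is hypothetical (not part of the IBM i MCP Server project); only the variable names and defaults come from the file above.

```python
import os

def load_agent_config(env=os.environ):
    """Read the agent settings documented in .env.example.

    Hypothetical helper: variable names and defaults mirror the file
    above, but this function is illustrative, not project code.
    """
    defaults = {
        "MCP_URL": "http://127.0.0.1:3010/mcp",
        "MCP_TRANSPORT": "streamable_http",
        "DEFAULT_MODEL": "ollama:llama3.2",
        "LANGCHAIN_TRACING_V2": "false",
        "LANGCHAIN_PROJECT": "ibmi-agents",
        "VERBOSE_LOGGING": "true",
        "ENABLE_HUMAN_IN_LOOP": "true",
    }
    cfg = {key: env.get(key, default) for key, default in defaults.items()}
    # Model strings use a "provider:model" prefix (e.g. "openai:gpt-4o").
    # Ollama tags may contain a second colon (e.g. "ollama:llama3.1:8b"),
    # so split only on the first one.
    provider, _, model = cfg["DEFAULT_MODEL"].partition(":")
    cfg["MODEL_PROVIDER"], cfg["MODEL_NAME"] = provider, model
    return cfg

cfg = load_agent_config({})
print(cfg["MODEL_PROVIDER"], cfg["MODEL_NAME"])  # ollama llama3.2
```

Boolean flags such as ENABLE_HUMAN_IN_LOOP arrive as the strings "true"/"false", so real code would still need to compare against "true" before branching on them.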

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.