IBM
by IBM
.env.example (993 B)
# ============================================================
# MCP Server (Required) - IBM i database access
# ============================================================
# The MCP server runs automatically in Docker Compose
# Use ibmi-mcp-server:3010 for container-to-container communication
MCP_URL=http://ibmi-mcp-server:3010/mcp
MCP_TRANSPORT=streamable-http

# ============================================================
# AI Model Provider (Choose at least one)
# ============================================================

# Option 1: watsonx (IBM Cloud)
# Get keys from: https://cloud.ibm.com
WATSONX_API_KEY=your_ibm_cloud_api_key
WATSONX_PROJECT_ID=your_project_id
WATSONX_URL=https://us-south.ml.cloud.ibm.com
WATSONX_MODEL_ID=meta-llama/llama-3-3-70b-instruct

# Option 2: OpenAI
# Get key from: https://platform.openai.com/api-keys
OPENAI_API_KEY=sk-your_openai_key

# Option 3: Anthropic
# Get key from: https://console.anthropic.com
ANTHROPIC_API_KEY=sk-your_anthropic_key
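As a sketch of how an application might consume these settings, the snippet below reads the variables and selects whichever model provider has an API key configured, falling back to the Docker Compose defaults for the MCP endpoint. The helper functions are hypothetical illustrations, not part of the ibmi-mcp repository.

```python
import os

def select_provider(env=os.environ):
    """Return the first AI provider whose API key is set.

    Hypothetical helper: the provider names and precedence order
    (watsonx, then OpenAI, then Anthropic) mirror the comments in
    .env.example, not any documented behavior of the repository.
    """
    if env.get("WATSONX_API_KEY"):
        return "watsonx"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    raise RuntimeError("No AI provider configured; set at least one API key")

def mcp_endpoint(env=os.environ):
    """Return (url, transport) for the MCP server.

    Defaults match the container-to-container values shown above.
    """
    return (
        env.get("MCP_URL", "http://ibmi-mcp-server:3010/mcp"),
        env.get("MCP_TRANSPORT", "streamable-http"),
    )
```

With no environment set, `mcp_endpoint({})` yields the compose defaults; with only `OPENAI_API_KEY` set, `select_provider` picks OpenAI.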


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp'
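The same request can be made from Python with only the standard library. The response schema is not documented on this page, so the sketch below simply returns the decoded JSON; no authentication is assumed, matching the bare curl example above.

```python
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp"

def build_request(url=API_URL):
    # Plain unauthenticated GET, mirroring the curl invocation above.
    return urllib.request.Request(url, method="GET")

def fetch_server_info(url=API_URL):
    # Fetch and decode the JSON payload describing the server.
    with urllib.request.urlopen(build_request(url)) as resp:
        return json.load(resp)
```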

If you have feedback or need assistance with the MCP directory API, please join our Discord server.