
Knowledge Graph Builder

.env.example
# KGB-MCP Environment Configuration

# =============================================================================
# MODEL PROVIDER CONFIGURATION
# =============================================================================
# Default AI provider: "ollama", "lmstudio", or "hosted"
MODEL_PROVIDER=ollama

# =============================================================================
# OLLAMA CONFIGURATION
# =============================================================================
# Ollama server URL (default: http://localhost:11434)
OLLAMA_BASE_URL=http://192.168.0.173:11434
# Default Ollama model to use
LOCAL_MODEL=deepshr1t:latest

# =============================================================================
# LM STUDIO CONFIGURATION
# =============================================================================
# LM Studio server URL (default: http://localhost:1234)
LMSTUDIO_BASE_URL=http://localhost:1234

# =============================================================================
# HOSTED API CONFIGURATION
# =============================================================================
# Hosted API URL (e.g., OpenAI, Anthropic, etc.)
HOSTED_API_URL=https://api.openai.com
# API key for hosted service
HOSTED_API_KEY=your_api_key_here
# Default model for hosted API
HOSTED_MODEL=gpt-4o-mini

# =============================================================================
# CONTENT PROCESSING CONFIGURATION
# =============================================================================
# Size of text chunks for AI processing (in characters)
CHUNK_SIZE=2000
# Overlap between chunks (in characters)
CHUNK_OVERLAP=200
# Maximum number of chunks to process (0 = unlimited)
MAX_CHUNKS=0

# =============================================================================
# HUGGING FACE CONFIGURATION (for original Mistral integration)
# =============================================================================
# HuggingFace API token (if using Mistral AI via HF)
# HF_TOKEN=your_huggingface_token_here

# =============================================================================
# USAGE EXAMPLES
# =============================================================================
# Example 1: Local Ollama setup
# MODEL_PROVIDER=ollama
# OLLAMA_BASE_URL=http://localhost:11434
# LOCAL_MODEL=llama3.2:latest

# Example 2: Remote Ollama setup
# MODEL_PROVIDER=ollama
# OLLAMA_BASE_URL=http://192.168.1.100:11434
# LOCAL_MODEL=deepseek-r1:latest

# Example 3: LM Studio setup
# MODEL_PROVIDER=lmstudio
# LMSTUDIO_BASE_URL=http://localhost:1234

# Example 4: OpenAI API setup
# MODEL_PROVIDER=hosted
# HOSTED_API_URL=https://api.openai.com
# HOSTED_API_KEY=sk-your-openai-key-here
# HOSTED_MODEL=gpt-4o-mini

# Example 5: Anthropic API setup
# MODEL_PROVIDER=hosted
# HOSTED_API_URL=https://api.anthropic.com
# HOSTED_API_KEY=sk-your-anthropic-key-here
# HOSTED_MODEL=claude-3-sonnet-20240229
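The CHUNK_SIZE, CHUNK_OVERLAP, and MAX_CHUNKS settings imply a sliding-window chunker: each chunk is CHUNK_SIZE characters, and consecutive chunks share CHUNK_OVERLAP characters. The following is a minimal sketch of that scheme; the function name, env-var reading, and exact windowing are illustrative assumptions, not KGB-MCP's actual implementation.

```python
import os

def chunk_text(text: str, size: int, overlap: int, max_chunks: int = 0) -> list[str]:
    """Split text into size-character chunks, each starting (size - overlap)
    characters after the previous one. max_chunks of 0 means unlimited.
    (Illustrative sketch, not the project's actual chunker.)"""
    step = max(1, size - overlap)  # guard against overlap >= size
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if max_chunks and len(chunks) >= max_chunks:
            break
        if start + size >= len(text):
            break  # last window already reached the end of the text
    return chunks

# Read the settings the same way the .env file defines them (defaults assumed).
size = int(os.getenv("CHUNK_SIZE", "2000"))
overlap = int(os.getenv("CHUNK_OVERLAP", "200"))
limit = int(os.getenv("MAX_CHUNKS", "0"))
```

With the defaults above, a 5000-character document yields chunks starting at offsets 0, 1800, and 3600, each overlapping its neighbor by 200 characters so entities spanning a chunk boundary are still seen in full by at least one chunk.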

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rebots-online/hKG-ontologizer-KGB-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.