# Physics MCP Server Environment Configuration
# Copy this file to .env and customize as needed

# Natural Language Interface (NLI) Configuration
# Optional: set these to enable faster NLI parsing with a local LM server
LM_BASE_URL=http://localhost:1234/v1
LM_API_KEY=
DEFAULT_MODEL=llama-3.2-3b-instruct

# Development Configuration
NODE_ENV=development
DEBUG_VERBOSE=0

# Python Worker Configuration
PYTHON_PATH=python
VENV_PATH=packages/python-worker/venv

# Server Configuration
MCP_SERVER_PORT=3000
LOG_LEVEL=info

# GPU Acceleration (auto-detected if not set)
# ACCEL_MODE=auto
# ACCEL_DEVICE=auto

# Cache Configuration
ENABLE_CACHE=true
CACHE_DIR=.cache

# Artifact Storage
ARTIFACTS_DIR=artifacts