Lspace MCP Server

Official
by Lspace-io
.env.example (903 B)
# Server Configuration
PORT=3001

# Node Environment
# Used to differentiate behavior, e.g., 'development', 'production', 'test'.
# The Lspace server currently checks if NODE_ENV is 'test' to modify some behaviors.
NODE_ENV=development

# OpenAI API Configuration
# IMPORTANT: Replace with your actual OpenAI API key.
OPENAI_API_KEY="sk-YOUR_OPENAI_API_KEY_HERE"

# OpenAI Model Configuration
# Specifies the model to be used for LLM operations (e.g., gpt-4o, gpt-3.5-turbo).
OPENAI_MODEL=gpt-4o

# Optional: Path to the local configuration file for Lspace
# Default is ./config.local.json if not specified (especially for lspace-mcp-server.js)
# CONFIG_PATH=./config.local.json

# Optional: Base directory for storing local repositories and cloned GitHub repositories
# Defaults to ./repos for local and ./cloned-github-repos for GitHub clones if not specified
# REPO_BASE_PATH=./my_lspace_repositories
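To illustrate how a Node server might consume these variables, here is a minimal sketch of a config loader with the defaults described in the comments above. The `loadConfig` function and its shape are hypothetical, not the actual Lspace implementation:

```javascript
// Sketch: read the environment variables from .env.example, applying the
// documented defaults. Assumes a dotenv-style loader has already populated
// process.env; the function itself takes an env object so it is testable.
function loadConfig(env) {
  return {
    // Server port; defaults to 3001 as in .env.example
    port: parseInt(env.PORT ?? "3001", 10),
    // 'development' | 'production' | 'test'; 'test' alters some behaviors
    nodeEnv: env.NODE_ENV ?? "development",
    // Required: OpenAI API key (no sensible default)
    openaiApiKey: env.OPENAI_API_KEY ?? null,
    // Model used for LLM operations
    openaiModel: env.OPENAI_MODEL ?? "gpt-4o",
    // Optional path to the local Lspace config file
    configPath: env.CONFIG_PATH ?? "./config.local.json",
    // Optional base directory for local repositories
    repoBasePath: env.REPO_BASE_PATH ?? "./repos",
  };
}

const config = loadConfig(process.env);
console.log(`Lspace config: port=${config.port}, model=${config.openaiModel}`);
```

In practice you would copy `.env.example` to `.env`, fill in a real `OPENAI_API_KEY`, and let a loader such as dotenv populate `process.env` before the server starts.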
