config.ts (754 B)
import { config } from 'dotenv';
import { z } from 'zod';

// Load variables from .env into process.env before validation.
config();

// Schema for the environment; every variable has a sensible local default.
const configSchema = z.object({
  PORT: z.string().transform(Number).default('3000'),
  OLLAMA_BASE_URL: z.string().default('http://localhost:11434'),
  LMSTUDIO_BASE_URL: z.string().default('http://localhost:1234'),
  ALLOWED_URLS: z.string().default('http://localhost:*'),
  ALLOWED_EXECUTABLES: z.string().default('node,python,npm')
});

const parsed = configSchema.parse(process.env);

// Comma-separated allowlists are exposed as trimmed string arrays.
export const CONFIG = {
  PORT: parsed.PORT,
  OLLAMA_BASE_URL: parsed.OLLAMA_BASE_URL,
  LMSTUDIO_BASE_URL: parsed.LMSTUDIO_BASE_URL,
  ALLOWED_URLS: parsed.ALLOWED_URLS.split(',').map(url => url.trim()),
  ALLOWED_EXECUTABLES: parsed.ALLOWED_EXECUTABLES.split(',').map(exec => exec.trim())
};
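The parsed allowlists are plain string arrays, so callers supply their own matching logic. A minimal sketch of how they might be consumed, assuming the default values above; the helper names `isAllowedExecutable` and `isAllowedUrl` are illustrative, not part of `config.ts`, and the trailing `*` in `http://localhost:*` is treated as a simple prefix wildcard:

```typescript
// Illustrative helpers (not in config.ts): check values against the
// allowlists that CONFIG exposes as string arrays. Defaults hard-coded
// here for a self-contained example.
const ALLOWED_EXECUTABLES = ['node', 'python', 'npm'];
const ALLOWED_URLS = ['http://localhost:*'];

function isAllowedExecutable(cmd: string): boolean {
  return ALLOWED_EXECUTABLES.includes(cmd);
}

// A trailing '*' acts as a wildcard prefix; otherwise require an exact match.
function isAllowedUrl(url: string): boolean {
  return ALLOWED_URLS.some(pattern =>
    pattern.endsWith('*')
      ? url.startsWith(pattern.slice(0, -1))
      : url === pattern
  );
}

console.log(isAllowedExecutable('node'));               // true
console.log(isAllowedUrl('http://localhost:3000/run')); // true
console.log(isAllowedUrl('https://example.com'));       // false
```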


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bermingham85/mcp-puppet-pipeline'
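The same request can be made from Node with the built-in `fetch`. The endpoint path is copied verbatim from the curl command above; the JSON handling is an assumption, since the response schema is not documented here:

```typescript
// Build the directory API URL for a server slug
// (base path taken from the curl example above).
const BASE = 'https://glama.ai/api/mcp/v1/servers';

function serverUrl(owner: string, name: string): string {
  return `${BASE}/${owner}/${name}`;
}

const url = serverUrl('bermingham85', 'mcp-puppet-pipeline');

// Assumed JSON payload; the actual schema is not shown on this page.
fetch(url)
  .then(res => res.json())
  .then(data => console.log(data))
  .catch(err => console.error('request failed:', err));
```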

If you have feedback or need assistance with the MCP directory API, please join our Discord server.