
Voice Mode

by mbailey
next.config.mjs
/** @type {import('next').NextConfig} */
const nextConfig = {
  // Enable standalone output for production builds
  // This creates a self-contained server that doesn't need the full Next.js framework
  output: process.env.BUILD_STANDALONE === 'true' ? 'standalone' : undefined,

  // Optimize for production deployment
  trailingSlash: false,
  compress: true,

  // Configure static asset serving
  assetPrefix: process.env.ASSET_PREFIX || '',

  // Ensure compatibility with Python static file serving
  experimental: {
    outputFileTracingExcludes: {
      '*': [
        'node_modules/@swc/core-linux-x64-gnu',
        'node_modules/@swc/core-linux-x64-musl',
        'node_modules/@esbuild/linux-x64',
      ],
    },
  },
};

export default nextConfig;
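When BUILD_STANDALONE=true is set at build time, next build emits a self-contained server under .next/standalone (started with node server.js from that folder). The snippet below is a minimal sketch, using a hypothetical check-config.mjs placed next to next.config.mjs, that shows how the config resolves its environment-driven fields.

// check-config.mjs — hypothetical helper, not part of the repository.
// The variables must be set before the config module is imported,
// because next.config.mjs reads them at import time.
process.env.BUILD_STANDALONE = 'true';
process.env.ASSET_PREFIX = 'https://cdn.example.com';

const { default: nextConfig } = await import('./next.config.mjs');

console.log(nextConfig.output);       // 'standalone'
console.log(nextConfig.assetPrefix);  // 'https://cdn.example.com'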

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mbailey/voicemode'
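The same endpoint can also be queried from code. The sketch below is a minimal JavaScript example (Node 18+ for the built-in fetch, hypothetical file name) and assumes only that the response is JSON.

// fetch-server.mjs — hypothetical file name; queries the directory entry shown above.
const res = await fetch('https://glama.ai/api/mcp/v1/servers/mbailey/voicemode');
if (!res.ok) {
  throw new Error(`Request failed: ${res.status} ${res.statusText}`);
}
const server = await res.json();
console.log(server);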

If you have feedback or need assistance with the MCP directory API, please join our Discord server.