
Couchbase MCP Server for LLMs

by Aniket310101
cli.ts
#!/usr/bin/env node
const path = require("path");
const fs = require("fs");

// Check for config file
const configFile = process.argv[2];
if (configFile) {
  try {
    const config = JSON.parse(fs.readFileSync(configFile, "utf8"));
    // Handle nested configuration structure
    for (const [serverType, serverConfigs] of Object.entries(config.mcpServers)) {
      for (const [key, value] of Object.entries((serverConfigs as any).env)) {
        process.env[key] = value as string;
      }
    }
  } catch (error: any) {
    console.error(`Error reading config file: ${error.message}`);
    process.exit(1);
  }
}

// Run the server
require("../index.js");
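
For reference, a minimal sketch of the nested configuration shape cli.ts parses: every entry under mcpServers must provide an env object whose key/value pairs are copied into process.env. The server key ("couchbase") and the environment variable names below are illustrative assumptions, not part of the source.

// Hypothetical config shape consumed by cli.ts after JSON.parse.
interface McpServerConfig {
  env: Record<string, string>;
}

interface ConfigFile {
  mcpServers: Record<string, McpServerConfig>;
}

// Example content for a config file passed as the first CLI argument,
// e.g. `node cli.js config.json`; all values here are placeholders.
const exampleConfig: ConfigFile = {
  mcpServers: {
    couchbase: {
      env: {
        COUCHBASE_CONNECTION_STRING: "couchbase://localhost",
        COUCHBASE_USERNAME: "Administrator",
        COUCHBASE_PASSWORD: "password",
      },
    },
  },
};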

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Aniket310101/MCP-Server-Couchbase'
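
The same endpoint can be called from code. Below is a minimal sketch in TypeScript, assuming Node 18+ (global fetch) and treating the response as opaque JSON, since its exact shape is not documented here.

// Fetch this server's entry from the Glama MCP directory API.
async function getServerInfo(): Promise<unknown> {
  const res = await fetch(
    "https://glama.ai/api/mcp/v1/servers/Aniket310101/MCP-Server-Couchbase"
  );
  if (!res.ok) {
    throw new Error(`Directory API request failed: ${res.status}`);
  }
  return res.json(); // payload shape is not specified in this listing
}

getServerInfo()
  .then((info) => console.log(info))
  .catch(console.error);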

If you have feedback or need assistance with the MCP directory API, please join our Discord server.