
Bazi MCP

httpServer.ts
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import express from 'express';
import { server } from './mcp.js';

const app = express();
app.use(express.json());

app.post('/mcp', async (req, res) => {
  // Stateless mode: create a fresh transport per request, with no session IDs
  // and plain JSON responses instead of SSE streaming.
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined,
    enableJsonResponse: true,
  });

  // Tear the transport down when the client disconnects.
  res.on('close', () => {
    transport.close();
  });

  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

const port = parseInt(process.env.PORT || '3000');

app
  .listen(port, () => {
    console.log(`MCP is running on http://localhost:${port}/mcp`);
  })
  .on('error', (error) => {
    console.error('Server error', error);
    process.exit(1);
  });
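For context, here is a minimal client-side sketch that talks to the /mcp endpoint above using the MCP TypeScript SDK's StreamableHTTPClientTransport and lists the tools the server advertises. The file name (client.ts), client name, and port are illustrative assumptions, not part of this repository.

// client.ts — a minimal sketch, assuming httpServer.ts above is running on port 3000.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

async function main() {
  // Point the client at the /mcp route exposed by httpServer.ts.
  const transport = new StreamableHTTPClientTransport(new URL('http://localhost:3000/mcp'));
  const client = new Client({ name: 'bazi-mcp-client', version: '1.0.0' });

  await client.connect(transport);

  // Discover the tools the Bazi MCP server exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch((error) => {
  console.error('Client error', error);
  process.exit(1);
});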


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cantian-ai/bazi-mcp'
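The same endpoint can be called from code. Below is a minimal TypeScript sketch using the standard fetch API; the response shape is not documented here, so the example only prints the parsed JSON rather than assuming specific fields.

// Fetch the directory entry for the Bazi MCP server and print the parsed JSON.
const response = await fetch('https://glama.ai/api/mcp/v1/servers/cantian-ai/bazi-mcp');
if (!response.ok) {
  throw new Error(`Request failed: ${response.status}`);
}
const serverInfo = await response.json();
console.log(serverInfo);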

If you have feedback or need assistance with the MCP directory API, please join our Discord server.