Glama

Clinical MCP Server

by shekhar-ai99
package.json (875 B)

{
  "name": "clinical-mcp-server",
  "version": "1.0.0",
  "description": "MCP Server for Clinical Intelligence using a local LLM (llama.cpp).",
  "main": "dist/server.js",
  "type": "module",
  "scripts": {
    "build": "mkdir -p dist && tsc && cp -f src/openapi.yaml dist/ || true",
    "start": "node dist/server.js",
    "dev": "nodemon src/server.ts"
  },
  "dependencies": {
    "@modelcontextprotocol/sdk": "latest",
    "express": "^4.19.2",
    "js-yaml": "^4.1.0",
    "swagger-ui-express": "^5.0.1",
    "zod": "^3.23.8",
    "node-llama-cpp": "^2.8.0",
    "fhir-kit-client": "^1.9.2",
    "sqlite3": "^5.1.6"
  },
  "devDependencies": {
    "@types/express": "^4.17.21",
    "@types/js-yaml": "^4.0.9",
    "@types/swagger-ui-express": "^4.1.6",
    "@types/node": "^20.14.11",
    "nodemon": "^3.1.4",
    "ts-node": "^10.9.2",
    "typescript": "^5.5.3"
  }
}

MCP directory API

We provide all the information about MCP servers, including this one, via our MCP directory API:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/shekhar-ai99/clinical-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.