LLM Responses MCP Server

by kstrikis
package.json

{
  "name": "mcp-llm-responses",
  "version": "1.0.0",
  "description": "Model Context Protocol server for sharing LLM responses",
  "main": "dist/index.js",
  "type": "module",
  "scripts": {
    "build": "tsc",
    "start": "node dist/index.js",
    "dev": "tsx src/index.ts",
    "lint": "eslint src --ext .ts",
    "test": "jest",
    "inspect": "npx @modelcontextprotocol/inspector node dist/index.js",
    "inspect:dev": "npx @modelcontextprotocol/inspector tsx src/index.ts --stdio",
    "inspect:build": "npm run build && npm run inspect"
  },
  "keywords": ["mcp", "llm", "ai", "model-context-protocol"],
  "author": "",
  "license": "MIT",
  "dependencies": {
    "@modelcontextprotocol/sdk": "^1.7.0"
  },
  "devDependencies": {
    "@modelcontextprotocol/inspector": "^0.6.0",
    "@types/express": "^5.0.1",
    "@types/node": "^22.13.11",
    "express": "^4.21.2",
    "tsx": "^4.19.3",
    "typescript": "^5.8.2",
    "zod": "^3.24.2"
  }
}

MCP directory API

We provide metadata for every listed MCP server via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/kstrikis/ephor-mcp-collaboration'
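The same lookup from Node or TypeScript, assuming the endpoint returns JSON (the response shape is not documented on this page):

```typescript
// Fetch this server's directory entry from the Glama MCP API.
// Requires an ESM context for top-level await (Node 18+ has global fetch).
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/kstrikis/ephor-mcp-collaboration"
);
if (!res.ok) throw new Error(`Glama API request failed: ${res.status}`);
console.log(await res.json());
```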

If you have feedback or need assistance with the MCP directory API, please join our Discord server.