Cross-LLM MCP Server

package.json
{ "name": "cross-llm-mcp", "version": "1.0.2", "type": "module", "bin": { "cross-llm-mcp": "./build/index.js" }, "scripts": { "build": "tsc && chmod 755 build/index.js", "postinstall": "node scripts/postinstall.js" }, "files": [ "build", "scripts" ], "main": "index.js", "keywords": [ "mcp", "model-context-protocol", "llm", "chatgpt", "claude", "gemini", "grok", "deepseek", "ai", "openai", "anthropic", "xai" ], "author": "James Sangalli", "license": "MIT", "description": "A Model Context Protocol (MCP) server that provides access to multiple Large Language Model (LLM) APIs including ChatGPT, Claude, DeepSeek, Gemini, and Grok", "repository": { "type": "git", "url": "https://github.com/JamesANZ/cross-llm-mcp.git" }, "dependencies": { "@modelcontextprotocol/sdk": "^1.15.0", "superagent": "^10.2.2", "zod": "^3.25.75" }, "devDependencies": { "@types/node": "^24.0.10", "@types/superagent": "^8.1.9", "typescript": "^5.8.3" } }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/JamesANZ/cross-llm-mcp'
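The same endpoint can be queried programmatically. A minimal sketch for Node 18+ using the built-in fetch follows; the response schema is not documented in this section, so the body is logged as-is.

// Fetch this server's directory entry from the Glama MCP API (Node 18+).
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/JamesANZ/cross-llm-mcp",
);
if (!res.ok) throw new Error(`Directory API returned ${res.status}`);
const entry = await res.json();
console.log(entry);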

If you have feedback or need assistance with the MCP directory API, please join our Discord server.