package.json
{ "name": "@ramgeart/llm-mcp-bridge", "version": "1.0.1", "description": "MCP Server for any OpenAI-compatible LLM API - Model quality analysis tools (LM Studio, Ollama, vLLM, OpenAI, etc.)", "type": "module", "main": "dist/index.js", "bin": { "llm-mcp-bridge": "dist/index.js" }, "repository": { "type": "git", "url": "git+https://github.com/ramgeart/llm-mcp-bridge.git" }, "publishConfig": { "registry": "https://npm.pkg.github.com" }, "scripts": { "build": "tsc", "dev": "tsx watch src/index.ts", "start": "node dist/index.js", "prepare": "npm run build" }, "keywords": [ "mcp", "llm", "openai", "lmstudio", "ollama", "vllm", "model-analysis", "openai-compatible", "local-llm" ], "author": "ramgeart", "license": "MIT", "bugs": { "url": "https://github.com/ramgeart/llm-mcp-bridge/issues" }, "homepage": "https://github.com/ramgeart/llm-mcp-bridge#readme", "files": [ "dist", "README.md" ], "dependencies": { "@modelcontextprotocol/sdk": "^1.0.0", "openai": "^4.20.0", "zod": "^3.22.0" }, "devDependencies": { "@types/node": "^20.10.0", "tsx": "^4.6.0", "typescript": "^5.3.0" }, "engines": { "node": ">=18.0.0" } }

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ramgeart/llm-mcp-bridge'
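The same lookup from Node 18 or later (matching the engines field above), using the built-in fetch. This sketch assumes the endpoint returns JSON and requires no authentication; the shape of the response is not documented here.

// Fetch this server's entry from the Glama MCP directory API.
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/ramgeart/llm-mcp-bridge",
);
if (!res.ok) {
  throw new Error(`Directory lookup failed: ${res.status} ${res.statusText}`);
}
const entry = await res.json();
console.log(entry); // fields depend on the API's actual response schema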

If you have feedback or need assistance with the MCP directory API, please join our Discord server.