Ollama MCP Server

package.json
{ "name": "@rawveg/ollama-mcp", "version": "1.0.9", "type": "module", "description": "MCP Server for Ollama integration", "main": "dist/index.js", "bin": { "ollama-mcp": "./dist/cli.js" }, "files": [ "dist", "mcp.json" ], "scripts": { "build": "tsc && chmod +x dist/cli.js", "start": "node dist/cli.js", "prepare": "npm run build" }, "repository": { "type": "git", "url": "git+https://github.com/rawveg/ollama-mcp.git" }, "keywords": [ "ollama", "mcp", "ai", "claude" ], "author": "tigreen", "license": "AGPL-3.0-only", "dependencies": { "express": "^4.18.2", "node-fetch": "^3.3.2" }, "devDependencies": { "@types/express": "^4.17.21", "@types/node": "^20.11.19", "typescript": "^5.3.3" } }

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rawveg/ollama-mcp'
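
The same endpoint can be queried programmatically. A minimal TypeScript sketch, assuming Node.js 18+ (built-in fetch) and that the endpoint returns JSON:

// Fetch this server's directory entry from the Glama MCP directory API.
// Assumes Node.js 18+ and a JSON response body.
async function main(): Promise<void> {
  const url = 'https://glama.ai/api/mcp/v1/servers/rawveg/ollama-mcp';
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  const server = await response.json();
  console.log(server);
}

main().catch(console.error);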

If you have feedback or need assistance with the MCP directory API, please join our Discord server.