Ollama MCP Server

by etnlbck
package.json • 1.24 kB
{ "name": "ollama-mcp-server", "version": "1.0.0", "description": "MCP server for interacting with Ollama locally", "main": "dist/main.js", "type": "module", "scripts": { "build": "npx --package=typescript tsc", "dev": "tsx src/main.ts", "dev:stdio": "tsx src/main.ts", "dev:http": "MCP_TRANSPORT=http tsx src/main.ts", "start": "node dist/main.js", "start:stdio": "node dist/main.js", "start:http": "MCP_TRANSPORT=http node dist/main.js", "test": "echo \"No tests yet\" && exit 0", "lint": "echo \"No linter configured\" && exit 0", "clean": "rm -rf dist", "docker:build": "docker build -t ollama-mcp .", "docker:run": "docker run --rm -p 11434:11434 -v ${PWD}/ollama-data:/data/ollama ollama-mcp", "railway:up": "railway up", "railway:shell": "railway shell", "railway:logs": "railway logs" }, "dependencies": { "@modelcontextprotocol/sdk": "^1.19.1", "express": "^4.19.2", "node-fetch": "^3.3.2" }, "devDependencies": { "@types/express": "^4.17.21", "@types/node": "^20.19.19", "tsx": "^4.6.2", "typescript": "^5.3.0" }, "keywords": [ "mcp", "ollama", "llm", "ai" ], "author": "", "license": "MIT" }

MCP directory API

We provide all of the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/etnlbck/ollama-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.