Glama
101,503 tools. Last updated 2026-04-11 21:56
  • Install and configure Ollama on macOS via Homebrew to enable local AI inference for processing large datasets with the Massive Context MCP server.
    MIT
  • Verify system compatibility for Ollama with gemma3:12b. Checks macOS, Apple Silicon (M1/M2/M3/M4), 16GB+ RAM, and Homebrew installation before setup.
    MIT
  • Check the current version of a package from official registries to identify outdated dependencies. Supports npm, PyPI, Packagist, Crates.io, Maven, Go, RubyGems, NuGet, Hex, CRAN, CPAN, pub.dev, Homebrew, Conda, Clojars, Hackage, Julia, Swift PM, and Chocolatey.
    MIT
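The last entry's outdated-dependency check boils down to fetching the latest published version from a registry's JSON endpoint and comparing it with the installed one. A minimal sketch against the npm registry (the `latest_npm_version` and `is_outdated` helpers are illustrative, not part of any listed server; other registries such as PyPI expose similar endpoints):

```python
import json
import urllib.request

def latest_npm_version(package: str) -> str:
    """Fetch the latest published version of a package from the npm registry.

    Illustrative helper: the registry serves JSON at /<package>/latest,
    whose "version" field holds the newest release.
    """
    url = f"https://registry.npmjs.org/{package}/latest"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["version"]

def is_outdated(installed: str, latest: str) -> bool:
    """Compare dotted numeric versions; True when `installed` lags `latest`.

    Naive comparison: assumes plain numeric segments (no pre-release tags).
    """
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(installed) < parse(latest)
```

A string comparison would wrongly rank `"1.9.0"` above `"1.10.0"`, which is why the versions are parsed into integer tuples before comparing.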

Matching MCP Servers