Glama

Homebrew MCP tools

Production-ready MCP servers that extend AI capabilities through file access, database connections, APIs, and contextual services.

70,721 tools. Last updated 2026-02-08 08:58
  • Install and configure Ollama on macOS via Homebrew to enable local AI inference for processing large datasets with the Massive Context MCP server. (MIT)
  • Verify system compatibility for Ollama with gemma3:12b. Checks macOS, Apple Silicon (M1/M2/M3/M4), 16GB+ RAM, and Homebrew installation before setup. (MIT)
  • Check the current version of a package from official registries to identify outdated dependencies. Supports npm, PyPI, Packagist, Crates.io, Maven, Go, RubyGems, NuGet, Hex, CRAN, CPAN, pub.dev, Homebrew, Conda, Clojars, Hackage, Julia, Swift PM, and Chocolatey. (MIT)
  • Install and configure Ollama locally on macOS without Homebrew or sudo. Downloads directly to ~/Applications, works on locked-down systems, and sets up the default model for local inference. (MIT)
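The compatibility gate described in the second entry (macOS, Apple Silicon, 16GB+ RAM, Homebrew present) can be sketched as a small shell check. The `compat_ok` helper and its argument order are illustrative assumptions for this sketch, not the server's actual code:

```shell
#!/bin/sh
# Sketch of the pre-setup checks listed above (hypothetical helper;
# the MCP server's real implementation is not shown here).

# compat_ok OS ARCH RAM_BYTES -> exits 0 if the machine qualifies.
compat_ok() {
  [ "$1" = "Darwin" ] || return 1   # macOS only
  [ "$2" = "arm64" ]  || return 1   # Apple Silicon (M1/M2/M3/M4)
  [ "$3" -ge 17179869184 ]          # 16 GiB = 16 * 1024^3 bytes
}

# On macOS the real values come from standard tools, e.g.:
#   compat_ok "$(uname -s)" "$(uname -m)" "$(sysctl -n hw.memsize)" \
#     && command -v brew >/dev/null && echo "compatible"
```

Once a machine passes, the Homebrew route in the first entry boils down to the standard commands `brew install ollama` followed by `ollama pull gemma3:12b` to fetch the model for local inference.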
