by egoughnour

firewall_setup_ollama

Install and configure Ollama on macOS to enable vector embedding capabilities for the Code Firewall MCP server's security analysis system.

Instructions

Install Ollama via Homebrew (macOS).

Args:
- install: Install Ollama via Homebrew
- start_service: Start Ollama as a background service
- pull_model: Pull the embedding model (nomic-embed-text)
- model: Model to pull (default: nomic-embed-text)
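The three setup steps the tool performs can be sketched as plain shell commands. This is a hedged sketch, not the tool's implementation: it assumes macOS with Homebrew available, and the guard makes it a harmless no-op elsewhere.

```shell
# Manual equivalents of the tool's three steps (assumes macOS with Homebrew).
if command -v brew >/dev/null 2>&1; then
  brew install ollama            # install: install Ollama via Homebrew
  brew services start ollama     # start_service: run Ollama as a background service
  ollama pull nomic-embed-text   # pull_model: fetch the default embedding model
else
  echo "Homebrew not found; see https://brew.sh"
fi
```

Running the tool with all flags enabled should be equivalent to executing these commands in order.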

Input Schema

Name           Required  Description                           Default
install        No        Install Ollama via Homebrew           -
start_service  No        Start Ollama as a background service  -
pull_model     No        Pull the embedding model              -
model          No        Model to pull                         nomic-embed-text
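A hypothetical invocation of this tool, assuming the standard MCP JSON-RPC `tools/call` envelope; the argument values shown are illustrative, and since every parameter is optional, any subset may be passed:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "firewall_setup_ollama",
    "arguments": {
      "install": true,
      "start_service": true,
      "pull_model": true,
      "model": "nomic-embed-text"
    }
  }
}
```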


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/egoughnour/code-firewall-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.