Integrates Ollama's locally-run language models (like Llama, Mistral, CodeLlama) with Apify workflows, enabling AI-driven analysis and processing of scraped data while preserving privacy and reducing costs through on-device inference.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Apify MCP Server Template summarize the scraped product data from our e-commerce monitor".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Ollama-Apify-MCP
Bring powerful local AI into your Apify workflows.
This project connects Ollama’s locally-run language models with the Model Context Protocol (MCP) and Apify’s scraping & automation platform. It enables you to process scraped data, extract insights, and generate intelligent responses — all without external APIs.
🧠 Overview
The Ollama-Apify-MCP Actor bridges Apify workflows with local LLMs via MCP, allowing AI-driven analysis and reasoning while preserving privacy and reducing costs.
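To illustrate the kind of call the Actor makes under the hood, here is a minimal sketch of querying a local Ollama model over its REST API. It assumes Ollama's default local endpoint (`http://localhost:11434`); the helper names are illustrative, not part of this project's code:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumed; configurable in Ollama itself)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled
    print(ask_ollama("llama3", "Summarize: 42 products scraped, 3 price drops."))
```

Because inference happens on this local endpoint, no scraped data leaves your machine.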
🚀 Key Features
🔗 Local LLM integration — Run models like Llama, Mistral, CodeLlama, and more using Ollama
🧩 MCP-based communication — Standards-compliant protocol for tool interaction
⚙️ Automatic context & preprocessing — Improves model response quality
🛠️ Extensible tool architecture — Easily add custom MCP tools & resources
🔁 Robust error handling & retries — Reliable execution in workflows
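The retry behavior can be pictured as a simple exponential-backoff wrapper. This is a sketch in plain Python; the Actor's actual retry logic may differ in details:

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.5):
    """Call fn(), retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the workflow
            time.sleep(base_delay * 2 ** attempt)  # waits 0.5s, 1s, 2s, ...
```

Wrapping model calls this way keeps transient failures (e.g., a busy local server) from aborting an entire workflow run.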
📦 Quick Start
Use it in Cursor, Copilot, Claude Code, or Claude Desktop.
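For a local stdio setup, most MCP clients accept a JSON entry along these lines (the server name and launch command below are illustrative assumptions; adjust them to how this repo actually starts its server):

```json
{
  "mcpServers": {
    "ollama-apify": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```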
💻 Run Locally
Server runs at:
☁️ Deploy to Apify
Push the repo to GitHub
Add it as an Actor in Apify Console
Enable Standby Mode
Deploy
MCP endpoint:
Include your API token:
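Apify's API accepts the token as a Bearer header. A sketch of attaching it to a request against your deployed endpoint (the URL below is a placeholder, not the real endpoint):

```python
import urllib.request

def authorized_request(url: str, token: str) -> urllib.request.Request:
    """Build a request carrying the Apify API token as a Bearer header."""
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )

# Placeholder URL; substitute your Actor's MCP endpoint and real token
req = authorized_request("https://example.apify.actor/mcp", "apify_api_xxx")
```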
🎯 Use Cases
📊 Analyze & summarize scraped web data
🔐 Privacy-first local LLM processing
⚡ Low-latency on-device inference
🧱 Build AI tools inside Apify workflows
🧩 Requirements
Python 3.7+
Ollama installed locally
Apify CLI (for deployment)
❤️ Contributing
PRs and feature ideas are welcome — feel free to extend tools, improve docs, or share sample workflows.
📄 License
MIT License