Apify MCP Server Template
Integrates Ollama's locally run language models (such as Llama, Mistral, and CodeLlama) with Apify workflows, enabling AI-driven analysis and processing of scraped data while preserving privacy and reducing costs through on-device inference.
Click "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Apify MCP Server Template summarize the scraped product data from our e-commerce monitor".
That's it! The server will respond to your query, and you can continue using it as needed.
Ollama-Apify-MCP
Bring powerful local AI into your Apify workflows.
This project connects Ollama's locally run language models with the Model Context Protocol (MCP) and Apify's scraping & automation platform. It enables you to process scraped data, extract insights, and generate intelligent responses — all without external APIs.
🧠 Overview
The Ollama-Apify-MCP Actor bridges Apify workflows with local LLMs via MCP, allowing AI-driven analysis and reasoning while preserving privacy and reducing costs.
🚀 Key Features
🔗 Local LLM integration — Run models like Llama, Mistral, CodeLlama, and more using Ollama
🧩 MCP-based communication — Standards-compliant protocol for tool interaction
⚙️ Automatic context & preprocessing — Improves model response quality
🛠️ Extensible tool architecture — Easily add custom MCP tools & resources
🔁 Robust error handling & retries — Reliable execution in workflows
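The extensible tool architecture above can be sketched as a simple name-to-handler registry, which is the pattern MCP servers use to expose callable tools. This is an illustrative sketch only; the `tool` decorator and the `summarize` tool below are hypothetical examples, not this repo's actual API.

```python
from typing import Callable, Dict

# Registry mapping tool names to handler functions.
TOOLS: Dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Decorator that registers a function as an MCP-style tool."""
    def wrap(fn: Callable[..., str]) -> Callable[..., str]:
        TOOLS[name] = fn
        return fn
    return wrap

@tool("summarize")
def summarize(text: str, max_words: int = 25) -> str:
    """Naive example tool: truncate input to the first max_words words."""
    return " ".join(text.split()[:max_words])

def call_tool(name: str, **kwargs) -> str:
    """Dispatch a registered tool by name, as a request handler might."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

Adding a new capability then means writing one decorated function; the dispatch logic never changes.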
📦 Quick Start
Use in Cursor, Copilot, Claude Code, or Claude Desktop by adding this to your MCP configuration:
```json
{
  "mcpServers": {
    "ollama-apify-mcp": {
      "url": "https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp?token={YOUR_TOKEN}"
    }
  }
}
```
💻 Run Locally
```shell
pip install -r requirements.txt
APIFY_META_ORIGIN=STANDBY python -m src
```
Server runs at:
```
http://localhost:3000/mcp
```
☁️ Deploy to Apify
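Once the server is running locally, MCP clients talk to it with JSON-RPC 2.0 over HTTP, starting with an `initialize` call. A minimal sketch of building that request body (the helper function, client info values, and protocol version string are illustrative assumptions, not part of this repo):

```python
import json

def jsonrpc_request(method: str, params: dict, req_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 request body as MCP clients send it."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# An MCP session opens with an "initialize" request.
body = jsonrpc_request("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
})
# POST this body to http://localhost:3000/mcp with
# Content-Type: application/json to start a session.
```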
Push the repo to GitHub
Add it as an Actor in Apify Console
Enable Standby Mode
Deploy
MCP endpoint:
```
https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp
```
Include your API token in the request header:
```
Authorization: Bearer <APIFY_TOKEN>
```
🎯 Use Cases
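In code, the bearer token header above can be built like this (a small sketch; the `auth_headers` helper is illustrative, and in practice the token would come from the `APIFY_TOKEN` environment variable rather than a literal):

```python
import os

def auth_headers(token: str) -> dict:
    """HTTP headers for calling the deployed Actor's MCP endpoint."""
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

# Normally: headers = auth_headers(os.environ["APIFY_TOKEN"])
headers = auth_headers("example-token")
# Attach these headers to POST requests sent to
# https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp
```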
📊 Analyze & summarize scraped web data
🔐 Privacy-first local LLM processing
⚡ Low-latency on-device inference
🧱 Build AI tools inside Apify workflows
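As a sketch of the first use case, scraped records can be turned into a summarization request for Ollama's local HTTP API (`POST http://localhost:11434/api/generate` is Ollama's default endpoint; the helper function, prompt wording, and the `llama3` model name are assumptions — use any model you have pulled with `ollama pull`):

```python
import json
from typing import Dict, List

def build_summary_request(items: List[Dict], model: str = "llama3") -> dict:
    """Build an Ollama /api/generate payload asking a local model
    to summarize a list of scraped records."""
    listing = "\n".join(f"- {json.dumps(item)}" for item in items)
    return {
        "model": model,
        "prompt": "Summarize these scraped records:\n" + listing,
        "stream": False,  # return one complete response, not a token stream
    }

payload = build_summary_request([{"title": "Widget", "price": 9.99}])
# POST payload as JSON to http://localhost:11434/api/generate
# to get the model's summary of the scraped data.
```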
🧩 Requirements
Python 3.7+
Ollama installed locally
Apify CLI (for deployment)
❤️ Contributing
PRs and feature ideas are welcome — feel free to extend tools, improve docs, or share sample workflows.
📄 License
MIT License