Ollama-Apify-MCP

by Anwesh43

Bring powerful local AI into your Apify workflows.

This project connects Ollama’s locally-run language models with the Model Context Protocol (MCP) and Apify’s scraping & automation platform. It enables you to process scraped data, extract insights, and generate intelligent responses — all without external APIs.


🧠 Overview

The Ollama-Apify-MCP Actor bridges Apify workflows with local LLMs via MCP, allowing AI-driven analysis and reasoning while preserving privacy and reducing costs.


🚀 Key Features

  • 🔗 Local LLM integration — Run models like Llama, Mistral, CodeLlama, and more using Ollama

  • 🧩 MCP-based communication — Standards-compliant protocol for tool interaction

  • ⚙️ Automatic context & preprocessing — Improves model response quality

  • 🛠️ Extensible tool architecture — Easily add custom MCP tools & resources

  • 🔁 Robust error handling & retries — Reliable execution in workflows
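Under the hood, the local-LLM integration comes down to calling Ollama's HTTP API, which listens on `http://localhost:11434` by default. A minimal sketch using only the standard library — the model name `llama3` is illustrative; use whichever model you have pulled locally:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint


def build_ollama_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming Ollama generate call."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama instance and return the response text."""
    body = json.dumps(build_ollama_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

With `stream` set to `False`, Ollama returns a single JSON object whose `response` field holds the full completion, which keeps the client logic trivial.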


📦 Quick Start

Use it in Cursor, Copilot, Claude Code, or Claude Desktop by adding the server to your MCP configuration:

{
  "mcpServers": {
    "ollama-apify-mcp": {
      "url": "https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp?token={YOUR_TOKEN}"
    }
  }
}

💻 Run Locally

pip install -r requirements.txt
APIFY_META_ORIGIN=STANDBY python -m src

Server runs at:

http://localhost:3000/mcp
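Once the server is up, you can sanity-check the endpoint with an MCP `initialize` request — MCP speaks JSON-RPC 2.0 over the transport. The sketch below only builds the request body; the `protocolVersion` string and `clientInfo` values are assumptions, since clients negotiate these at connect time:

```python
import json


def build_initialize_request(request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 'initialize' request for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed version; negotiated in practice
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    })
```

POSTing this body to `http://localhost:3000/mcp` with a `Content-Type: application/json` header should return the server's capabilities if the Actor started correctly.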

☁️ Deploy to Apify

  1. Push the repo to GitHub

  2. Add it as an Actor in Apify Console

  3. Enable Standby Mode

  4. Deploy

MCP endpoint:

https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp

Include your API token:

Authorization: Bearer <APIFY_TOKEN>
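In code, that token goes into the `Authorization` header of every request to the deployed endpoint. A standard-library sketch — the URL comes from the section above, and the token is a placeholder you must supply:

```python
import urllib.request

MCP_ENDPOINT = "https://lenticular-negative--ollama-apify-mcp.apify.actor/mcp"


def authorized_request(body: bytes, token: str) -> urllib.request.Request:
    """Build a POST request to the MCP endpoint carrying a Bearer token."""
    return urllib.request.Request(
        MCP_ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
```

Passing `data` makes `urllib` issue a POST automatically; send the request with `urllib.request.urlopen(...)` when you are ready to call the live Actor.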

🎯 Use Cases

  • 📊 Analyze & summarize scraped web data

  • 🔐 Privacy-first local LLM processing

  • ⚡ Low-latency on-device inference

  • 🧱 Build AI tools inside Apify workflows
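For the first use case, a common pattern is to flatten scraped records into a single summarization prompt before handing it to the local model. A minimal sketch — the `title` and `text` field names are assumptions about your dataset schema, and the caps are arbitrary defaults:

```python
def build_summary_prompt(items: list[dict], max_items: int = 20) -> str:
    """Turn a list of scraped records into one summarization prompt."""
    lines = []
    for item in items[:max_items]:            # cap item count to keep context small
        title = item.get("title", "(untitled)")
        text = item.get("text", "")[:500]     # truncate long bodies
        lines.append(f"- {title}: {text}")
    return "Summarize the key points from these scraped pages:\n" + "\n".join(lines)
```

The resulting string can be passed straight to Ollama as the prompt, keeping the whole pipeline on-device.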


🧩 Requirements

  • Python 3.7+

  • Ollama installed locally

  • Apify CLI (for deployment)


❤️ Contributing

PRs and feature ideas are welcome — feel free to extend tools, improve docs, or share sample workflows.


📄 License

MIT License

