
Scrapfly MCP Server


What is Scrapfly MCP?

The Scrapfly MCP Server connects your AI assistants to live web data through the Model Context Protocol. Transform your AI from being limited by training data to having real-time access to any website.

✨ What Your AI Can Do

| Capability | Description |
| --- | --- |
| 🌐 Scrape Live Data | Pull current prices, listings, news, or any webpage content in real time |
| 🛡️ Bypass Anti-Bot Systems | Automatically handle CAPTCHAs, proxies, JavaScript rendering, and rate limits |
| Extract Structured Data | Parse complex websites into clean JSON using AI-powered extraction |
| 📸 Capture Screenshots | Take visual snapshots of pages or specific elements for analysis |

🏆 Why Scrapfly?

Built on battle-tested infrastructure used by thousands of developers.

📖 Learn more: Why Scrapfly MCP?


🚀 Quick Install

Click one of the buttons below to install the MCP server in your preferred IDE:

• Install in VS Code
• Install in VS Code Insiders
• Install in Visual Studio
• Install in Cursor


📦 Manual Installation

Standard Configuration

Works with most MCP-compatible tools:

{ "servers": { "scrapfly-cloud-mcp": { "type": "http", "url": "https://mcp.scrapfly.io/mcp" } } }

Cloud Configuration (NPX)

For tools that require a local process:

{ "mcpServers": { "scrapfly": { "command": "npx", "args": [ "mcp-remote", "https://mcp.scrapfly.io/mcp" ] } } }

🔧 IDE-Specific Setup

VS Code

One-Click Install

Install in VS Code

Manual Install

Follow the VS Code MCP guide or use the CLI:

```bash
code --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'
```

After installation, Scrapfly tools will be available in GitHub Copilot Chat.

📖 Full guide: VS Code Integration

VS Code Insiders

One-Click Install

Install in VS Code Insiders

Manual Install

```bash
code-insiders --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'
```

📖 Full guide: VS Code Integration

Visual Studio

One-Click Install

Install in Visual Studio

Manual Install

  1. Open Visual Studio

  2. Navigate to GitHub Copilot Chat window

  3. Click the tools icon (🛠️) in the chat toolbar

  4. Click + Add Server to open the configuration dialog

  5. Configure:

    • Server ID: scrapfly-cloud-mcp

    • Type: http/sse

    • URL: https://mcp.scrapfly.io/mcp

  6. Click Save

📖 Full guide: Visual Studio MCP documentation

Cursor

One-Click Install

Install in Cursor

Manual Install

  1. Go to Cursor Settings → MCP → Add new MCP Server

  2. Use the standard configuration above

  3. Click Edit to verify or add arguments

📖 Full guide: Cursor Integration

Claude Code

Use the Claude Code CLI:

```bash
claude mcp add scrapfly-cloud-mcp --url https://mcp.scrapfly.io/mcp
```

📖 Full guide: Claude Code Integration

Claude Desktop

Add to your Claude Desktop configuration file:

• macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
• Windows: `%APPDATA%\Claude\claude_desktop_config.json`

{ "mcpServers": { "scrapfly": { "command": "npx", "args": ["mcp-remote", "https://mcp.scrapfly.io/mcp"] } } }

📖 Full guide: Claude Desktop Integration

Cline

Add to your Cline MCP settings:

{ "scrapfly-cloud-mcp": { "type": "http", "url": "https://mcp.scrapfly.io/mcp" } }

📖 Full guide: Cline Integration

Windsurf

Follow the Windsurf MCP documentation using the standard configuration.

📖 Full guide: Windsurf Integration

Zed

Add to your Zed settings:

{ "context_servers": { "scrapfly-cloud-mcp": { "type": "http", "url": "https://mcp.scrapfly.io/mcp" } } }

📖 Full guide: Zed Integration

Codex

Create or edit ~/.codex/config.toml:

```toml
[mcp_servers.scrapfly-cloud-mcp]
url = "https://mcp.scrapfly.io/mcp"
```

📖 More info: Codex MCP documentation

Gemini CLI

Follow the Gemini CLI MCP guide using the standard configuration.

OpenCode

Add to ~/.config/opencode/opencode.json:

{ "$schema": "https://opencode.ai/config.json", "mcp": { "scrapfly-cloud-mcp": { "type": "http", "url": "https://mcp.scrapfly.io/mcp", "enabled": true } } }

📖 More info: OpenCode MCP documentation


🛠️ Available Tools

The Scrapfly MCP Server provides 5 powerful tools covering 99% of web scraping use cases:

| Tool | Description | Use Case |
| --- | --- | --- |
| `scraping_instruction_enhanced` | Get best practices & POW token | Always call first! |
| `web_get_page` | Quick page fetch with smart defaults | Simple scraping tasks |
| `web_scrape` | Full control with browser automation | Complex scraping, login flows |
| `screenshot` | Capture page screenshots | Visual analysis, monitoring |
| `info_account` | Check usage & quota | Account management |
📖 Full reference: Tools & API Specification

Example: Scrape a Page

User: "What are the top posts on Hacker News right now?" AI: Uses web_get_page to fetch https://news.ycombinator.com and returns current top stories

Example: Extract Structured Data

User: "Get all product prices from this Amazon page" AI: Uses web_scrape with extraction_model="product_listing" to return structured JSON

📖 More examples: Real-World Examples
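Beyond chat-driven use, the same tools can be called from any MCP client. The sketch below uses the MCP TypeScript SDK (`@modelcontextprotocol/sdk`) against the cloud endpoint; the `url` argument name for `web_get_page` is an assumption, so list the tools first and follow the input schemas the server actually advertises. Depending on your plan, the endpoint may also require authentication (see the next section).

```typescript
// Minimal sketch: connect to the Scrapfly MCP cloud endpoint and call a tool.
// The "url" argument shape for web_get_page is an assumption; check the schemas
// returned by listTools() for the authoritative parameter names.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const transport = new StreamableHTTPClientTransport(
    new URL("https://mcp.scrapfly.io/mcp"),
  );
  const client = new Client({ name: "scrapfly-demo", version: "1.0.0" });
  await client.connect(transport);

  // Discover the available tools and their input schemas.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Fetch a page with web_get_page (argument shape assumed).
  const result = await client.callTool({
    name: "web_get_page",
    arguments: { url: "https://news.ycombinator.com" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```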


🔐 Authentication

Scrapfly MCP supports multiple authentication methods:

| Method | Best For | Documentation |
| --- | --- | --- |
| OAuth2 | Production, multi-user apps | OAuth2 Setup |
| API Key | Personal use, development | API Key Setup |
| Header Auth | Custom integrations | Header Auth |

🔑 Get your API key: Scrapfly Dashboard
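For custom integrations using header auth, one approach is to attach the API key to every request through the transport options of your own MCP client. Below is a minimal sketch with the MCP TypeScript SDK; the `x-api-key` header name is a placeholder assumption, so substitute the header documented in the Header Auth guide.

```typescript
// Sketch: header-based auth from a custom MCP client.
// NOTE: "x-api-key" is a placeholder header name, not confirmed by the docs above;
// use the header specified in the Header Auth guide.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://mcp.scrapfly.io/mcp"),
  {
    // Extra headers are sent with every HTTP request to the MCP endpoint.
    requestInit: {
      headers: { "x-api-key": process.env.SCRAPFLY_API_KEY ?? "" },
    },
  },
);

const client = new Client({ name: "scrapfly-auth-demo", version: "1.0.0" });
await client.connect(transport);
console.log((await client.listTools()).tools.map((t) => t.name));
await client.close();
```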


📊 Configuration Reference

| Setting | Value |
| --- | --- |
| Server Name | `scrapfly-cloud-mcp` |
| Type | Remote HTTP Server |
| URL | `https://mcp.scrapfly.io/mcp` |
| Protocol | MCP over HTTP/SSE |


🖥️ Self-Hosted / Local Deployment

You can run the Scrapfly MCP server locally or self-host it.

CLI Arguments

| Flag | Description |
| --- | --- |
| `-http <address>` | Start the HTTP server at the specified address (e.g., `:8080`). Takes precedence over the `PORT` env var. |
| `-apikey <key>` | Use this API key instead of the `SCRAPFLY_API_KEY` environment variable. |

Environment Variables

| Variable | Description |
| --- | --- |
| `PORT` | HTTP port to listen on. Used if the `-http` flag is not set. |
| `SCRAPFLY_API_KEY` | Default Scrapfly API key. Can also be passed at runtime via the `?apiKey=xxx` query parameter. |

Examples

```bash
# Start HTTP server on port 8080
./scrapfly-mcp -http :8080

# Start HTTP server using PORT env var
PORT=8080 ./scrapfly-mcp

# Start with API key
./scrapfly-mcp -http :8080 -apikey scp-live-xxxx

# Start in stdio mode (for local MCP clients)
./scrapfly-mcp
```

Docker

```bash
# Build
docker build -t scrapfly-mcp .

# Run (Smithery compatible - uses PORT env var)
docker run -p 8080:8080 scrapfly-mcp

# Run with custom port
docker run -e PORT=9000 -p 9000:9000 scrapfly-mcp
```
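To verify a self-hosted instance from code, a client can connect to it the same way as to the cloud endpoint. The sketch below assumes the local server exposes its MCP endpoint at `/mcp` (mirroring the cloud URL) and uses the `?apiKey=xxx` query parameter described in the environment-variable table above.

```typescript
// Sketch: smoke-test a self-hosted server started with `./scrapfly-mcp -http :8080`.
// Assumptions: the local endpoint path is /mcp (mirroring the cloud URL) and the
// documented ?apiKey=xxx query parameter is accepted for authentication.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const url = new URL("http://localhost:8080/mcp");
url.searchParams.set("apiKey", process.env.SCRAPFLY_API_KEY ?? "");

const client = new Client({ name: "selfhost-check", version: "1.0.0" });
await client.connect(new StreamableHTTPClientTransport(url));

// Listing tools confirms the server is reachable and the key is accepted.
console.log((await client.listTools()).tools.map((t) => t.name));
await client.close();
```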

🤝 Framework Integrations

Scrapfly MCP also works with AI frameworks and automation tools:

📖 All integrations: Integration Index


📚 Resources


💬 Need Help?

