1. Click on "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Website Scraper MCP Server scrape the main text and metadata from https://en.wikipedia.org/wiki/Web_scraping".
4. That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
# Website Scraper MCP MCP Server

This is an MCP (Model Context Protocol) server that provides access to the Website Scraper MCP API with authentication via Bearer tokens. It enables AI agents and LLMs to interact with Website Scraper MCP through standardized tools.
## Features

- 🔧 **MCP Protocol**: Built on the Model Context Protocol for seamless AI integration
- 🌐 **Full API Access**: Provides tools for interacting with Website Scraper MCP endpoints
- 🔐 **Secure Authentication**: Supports API key authentication via Bearer tokens
- 💳 **HTTP 402 Payment Protocol**: Dual-mode operation (authenticated or paid access)
- 🔗 **D402 Integration**: Uses `traia_iatp.d402` for blockchain payment verification
- 🐳 **Docker Support**: Easy deployment with Docker and Docker Compose
- ⚡ **Async Operations**: Built with FastMCP for efficient async handling
## API Documentation

- Website Scraper MCP Website: https://www.crummy.com/software/BeautifulSoup
- API Documentation: None
## Available Tools

This server provides the following tools:

- `example_tool`: Placeholder tool (to be implemented)

Note: Replace the placeholder with your actual tools once they are implemented.
## Installation

### Using Docker (Recommended)

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/website-scraper-mcp-mcp-server.git
   cd website-scraper-mcp-mcp-server
   ```

2. Set your API key:

   ```bash
   export WEB_SCRAPPING_API_KEY="your-api-key-here"
   ```

3. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```
### Using Docker Compose

Create a `.env` file with your configuration:

```bash
# Server's internal API key (for payment mode)
WEB_SCRAPPING_API_KEY=your-api-key-here

# Server payment address (for HTTP 402 protocol)
SERVER_ADDRESS=0x1234567890123456789012345678901234567890

# Operator keys (for signing settlement attestations)
MCP_OPERATOR_PRIVATE_KEY=0x1234567890abcdef...
MCP_OPERATOR_ADDRESS=0x9876543210fedcba...

# Optional: Testing mode (skip settlement for local dev)
D402_TESTING_MODE=false
PORT=8000
```
### Manual Installation

1. Install dependencies using `uv`:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   WEB_SCRAPPING_API_KEY="your-api-key-here" uv run python -m server
   ```
## Using with CrewAI
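A minimal sketch of connecting a CrewAI agent to this server via the `MCPServerAdapter` from `crewai-tools`. The URL, agent role, and task prompt are placeholders, and the imports are deferred into the function so the module loads without CrewAI installed; adapt to your deployment.

```python
def run_scrape_task(mcp_url: str, prompt: str) -> str:
    """Run a one-off scraping task against the MCP server via CrewAI.

    `mcp_url` and the agent role/goal below are hypothetical placeholders.
    """
    from crewai import Agent, Crew, Task
    from crewai_tools import MCPServerAdapter

    # Streamable HTTP transport pointing at the server's /mcp endpoint
    server_params = {"url": mcp_url, "transport": "streamable-http"}
    with MCPServerAdapter(server_params) as mcp_tools:
        agent = Agent(
            role="Web Researcher",
            goal="Scrape and summarize web pages",
            backstory="Delegates scraping to the Website Scraper MCP server.",
            tools=list(mcp_tools),
        )
        task = Task(description=prompt, expected_output="Scraped page text", agent=agent)
        return str(Crew(agents=[agent], tasks=[task]).kickoff())
```

For a local run, the URL would typically be `http://localhost:8000/mcp`; on Cloud Run, use the deployed service URL plus `/mcp`.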
## Authentication & Payment (HTTP 402 Protocol)

This server supports two modes of operation:

### Mode 1: Authenticated Access (Free)

Clients with their own Website Scraper MCP API key can use the server for free:

**Flow:**

1. Client provides their Website Scraper MCP API key
2. Server uses the client's API key to call the Website Scraper MCP API
3. No payment required
4. Client pays Website Scraper MCP directly
### Mode 2: Payment Required (Paid Access)

Clients without an API key can pay per use via the HTTP 402 protocol:

**Flow:**

1. Client makes an initial request without payment
2. Server returns HTTP 402 with `PaymentRequirements` (token, network, amount)
3. Client creates an EIP-3009 `transferWithAuthorization` payment signature
4. Client base64-encodes the payment and sends it in the `X-PAYMENT` header
5. Server verifies the payment via `traia_iatp.d402.mcp_middleware`
6. Server uses its internal Website Scraper MCP API key to call the API
7. Client receives the result
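The encode/decode step of the flow above can be sketched with the standard library. The payload field names are assumptions modeled on x402-style payment headers, not the confirmed `traia_iatp.d402` schema; consult that module for the real structure.

```python
import base64
import json


def encode_x_payment(signed_authorization: dict) -> str:
    """Client side: base64-encode a payment payload for the X-PAYMENT header.

    Field names here ("scheme", "network", "payload") are assumptions;
    the signed EIP-3009 authorization is passed through as-is.
    """
    payload = {
        "scheme": "exact",
        "network": "base-sepolia",
        "payload": signed_authorization,
    }
    return base64.b64encode(json.dumps(payload).encode()).decode()


def decode_x_payment(header_value: str) -> dict:
    """Server side: recover the payment payload from the header."""
    return json.loads(base64.b64decode(header_value))
```

A round trip (`decode_x_payment(encode_x_payment(auth))`) returns the original authorization under the `"payload"` key.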
### D402 Protocol Details

This server uses the `traia_iatp.d402` module for payment verification:

- **Payment Method**: EIP-3009 `transferWithAuthorization` (gasless)
- **Supported Tokens**: USDC, TRAIA, or any ERC20 token
- **Default Price**: $0.001 per request (configurable via `DEFAULT_PRICE_USD`)
- **Networks**: Base Sepolia, Sepolia, Polygon, etc.
- **Facilitator**: d402.org (public) or custom facilitator
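On-chain transfers use the token's base units, so the USD price above must be converted. A small sketch of that conversion, assuming a $1-pegged token such as USDC (6 decimals); the helper name is illustrative, not part of `traia_iatp.d402`:

```python
from decimal import Decimal


def usd_to_atomic(price_usd: str, decimals: int = 6) -> int:
    """Convert a USD price to ERC20 base units (assumes a $1-pegged token).

    Decimal avoids float rounding on small prices like "0.001".
    """
    return int(Decimal(price_usd) * 10**decimals)


# The default $0.001 price is 1000 atomic units of a 6-decimal token.
```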
### Environment Variables for Payment Mode

**Operator Keys:**

- `MCP_OPERATOR_PRIVATE_KEY`: Used to sign settlement attestations (proof of service completion)
- `MCP_OPERATOR_ADDRESS`: Public address corresponding to the private key
- Required for on-chain settlement via the IATP Settlement Layer
- Can be the same as `SERVER_ADDRESS` or a separate operator key
**Note on Per-Endpoint Configuration**: Each endpoint's payment requirements (token address, network, price) are embedded in the tool code. They come from the endpoint configuration when the server is generated.
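To make the note above concrete, a hypothetical shape for the requirements embedded in a generated tool; every field name and value here is illustrative, using the same placeholder-address style as the `.env` example:

```python
# Hypothetical per-endpoint payment requirements as they might be
# embedded in a generated tool; actual keys come from the endpoint
# configuration at generation time.
PAYMENT_REQUIREMENTS = {
    "token_address": "0x1234567890123456789012345678901234567890",  # placeholder ERC20
    "network": "base-sepolia",
    "price_usd": "0.001",
}
```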
### How It Works

**Client Decision:**

- Has a Website Scraper MCP API key? → Mode 1 (Authenticated)
- No API key but willing to pay? → Mode 2 (Payment)

**Server Response:**

- Mode 1: Uses the client's API key (free for the client)
- Mode 2: Uses the server's API key (client pays the server)

**Business Model:**

- Mode 1: No revenue (passthrough)
- Mode 2: Revenue from pay-per-use (monetize the server's API subscription)
## Development

### Testing the Server

1. Start the server locally
2. Run the health check:

   ```bash
   python mcp_health_check.py
   ```

3. Test individual tools using the CrewAI adapter
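A health check against an MCP server essentially boils down to a JSON-RPC `initialize` request on the `/mcp` endpoint. A sketch of building that request (the `protocolVersion` and `clientInfo` values are assumptions; check the MCP spec revision your server targets):

```python
import json


def initialize_payload(client_name: str = "health-check") -> str:
    """Build an MCP `initialize` JSON-RPC request body."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed spec revision
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.1.0"},
        },
    })
```

POSTing this body (with `Content-Type: application/json`) to the server's `/mcp` endpoint and receiving a success result is a reasonable liveness signal.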
### Adding New Tools

To add new tools, edit `server.py` and:

1. Create API client functions for Website Scraper MCP endpoints
2. Add `@mcp.tool()`-decorated functions
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the capabilities array
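A sketch of step 2, written against a FastMCP-style `@mcp.tool()` decorator. The `scrape_page` tool is hypothetical (this repo's only tool is the `example_tool` placeholder), and the registration is wrapped in a function so the shape is visible without importing FastMCP at module level:

```python
def register_tools(mcp) -> None:
    """Attach tools to a FastMCP-style instance.

    `scrape_page` is a hypothetical example tool, not part of this repo.
    """

    @mcp.tool()
    def scrape_page(url: str) -> str:
        """Fetch a page and return its raw HTML (stub; wire to the real API client)."""
        import urllib.request
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8", errors="replace")
```

After adding a tool like this, remember steps 3-4: document it in the README and list its name in the `deployment_params.json` capabilities array.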
## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server.

**Important**: Always update the capabilities array when you add or remove tools!
### Google Cloud Run

This server is designed to be deployed on Google Cloud Run. The deployment will:

1. Build a container from the Dockerfile
2. Deploy to Cloud Run with the specified configuration
3. Expose the `/mcp` endpoint for client connections
### Environment Variables

- `PORT`: Server port (default: 8000)
- `STAGE`: Environment stage (default: MAINNET; options: MAINNET, TESTNET)
- `LOG_LEVEL`: Logging level (default: INFO)
- `WEB_SCRAPPING_API_KEY`: Your Website Scraper MCP API key (required)
## Troubleshooting

1. **Server not starting**: Check Docker logs with `docker logs <container-id>`
2. **Authentication errors**: Ensure your API key is correctly set in the environment
3. **API errors**: Verify your API key has the necessary permissions
4. **Tool errors**: Check the server logs for detailed error messages
## Contributing

1. Fork the repository
2. Create a feature branch
3. Implement new tools or improvements
4. Update the README and `deployment_params.json`
5. Submit a pull request