# Crawleo MCP Server
Real-time web search and crawling capabilities for AI assistants through Model Context Protocol (MCP).
## Overview
Crawleo MCP enables AI assistants to access live web data through two tools:

- `web.search` - real-time web search with multiple output formats
- `web.crawl` - deep content extraction from any URL
## Features

- ✅ **Real-time web search** from any country/language
- ✅ **Multiple output formats** - enhanced HTML, raw HTML, Markdown, plain text
- ✅ **Device-specific results** - desktop, mobile, or tablet view
- ✅ **Deep content extraction** with JavaScript rendering
- ✅ **Zero data retention** - complete privacy
- ✅ **Auto-crawling option** for search results
## Installation

### Option 1: NPM (recommended for local usage)
Install globally via npm:
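(The package name `crawleo-mcp` below is an assumption; substitute the actual published name.)

```bash
npm install -g crawleo-mcp
```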
Or use npx without installing:
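(Again assuming the package is published as `crawleo-mcp`:)

```bash
npx -y crawleo-mcp
```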
### Option 2: Clone Repository
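A typical clone-and-build sequence (the repository URL is not given here, so substitute it; build script names are assumptions):

```bash
git clone <repository-url>   # substitute the Crawleo MCP repository URL
cd crawleo-mcp
npm install
npm run build
```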
### Option 3: Docker
Build and run using Docker:
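A sketch, assuming the repository ships a Dockerfile and reads the API key from a `CRAWLEO_API_KEY` environment variable (both assumptions):

```bash
docker build -t crawleo-mcp .
docker run -i --rm -e CRAWLEO_API_KEY=YOUR_API_KEY_HERE crawleo-mcp
```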
Docker configuration for MCP clients:
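A minimal sketch of a client config that launches the container over stdio (image name and `CRAWLEO_API_KEY` variable are assumptions):

```json
{
  "mcpServers": {
    "crawleo": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "CRAWLEO_API_KEY=YOUR_API_KEY_HERE", "crawleo-mcp"]
    }
  }
}
```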
### Option 4: Remote Server (no installation needed)
Use the hosted version at https://api.crawleo.dev/mcp - see configuration examples below.
## Getting Your API Key

1. Visit crawleo.dev
2. Sign up for a free account
3. Navigate to your dashboard
4. Copy your API key (it starts with `sk_`)
## Setup Instructions

### Using the Local MCP Server (npm package)
After installing via npm, configure your MCP client to use the local server:
Claude Desktop / Cursor / Windsurf (Local):
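A sketch of the stdio config via npx (package name `crawleo-mcp` and the `CRAWLEO_API_KEY` environment variable are assumptions):

```json
{
  "mcpServers": {
    "crawleo": {
      "command": "npx",
      "args": ["-y", "crawleo-mcp"],
      "env": {
        "CRAWLEO_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
```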
Or if installed globally:
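(Assuming the global install exposes a `crawleo-mcp` binary:)

```json
{
  "mcpServers": {
    "crawleo": {
      "command": "crawleo-mcp",
      "env": {
        "CRAWLEO_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
```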
From cloned repository:
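(The build output path below is a placeholder; point it at the entry file your build produces:)

```json
{
  "mcpServers": {
    "crawleo": {
      "command": "node",
      "args": ["/path/to/crawleo-mcp/dist/index.js"],
      "env": {
        "CRAWLEO_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
```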
### Using the Remote Server (Hosted)

#### 1. Claude Desktop
**Location of config file:**

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Configuration:
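A sketch, assuming the client supports URL-based MCP servers with custom headers (the "Bearer " prefix matches the troubleshooting notes below):

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```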
Replace `YOUR_API_KEY_HERE` with your actual API key from crawleo.dev.
**Steps:**

1. Open the config file in a text editor
2. Add the Crawleo MCP configuration
3. Save the file
4. Restart Claude Desktop completely (quit and reopen)
5. Start a new conversation and ask Claude to search the web
Example usage:
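For instance, you might ask (hypothetical prompt):

```
Search the web for the latest React release notes and summarize the breaking changes.
```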
#### 2. Cursor IDE

**Location of config file:**

- macOS: `~/.cursor/config.json` or `~/Library/Application Support/Cursor/config.json`
- Windows: `%APPDATA%\Cursor\config.json`
- Linux: `~/.config/Cursor/config.json`
Configuration:
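The same URL-based sketch applies (assuming Cursor accepts `url` plus `headers` entries):

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```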
**Steps:**

1. Locate and open your Cursor config file
2. Add the Crawleo MCP configuration
3. Save the file
4. Restart Cursor
5. The MCP tools will be available to your AI assistant
Example usage in Cursor:
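A hypothetical prompt:

```
Use web.search to find the current TypeScript release notes, then summarize what changed.
```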
#### 3. Windsurf IDE

**Location of config file:**

- macOS: `~/Library/Application Support/Windsurf/config.json`
- Windows: `%APPDATA%\Windsurf\config.json`
- Linux: `~/.config/Windsurf/config.json`
Configuration:
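Again a sketch, assuming Windsurf supports URL-based MCP servers with custom headers:

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```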
**Steps:**

1. Open the Windsurf config file
2. Add the Crawleo MCP server configuration
3. Save and restart Windsurf
4. Start using web search in your coding workflow
#### 4. GitHub Copilot
Location of config file:
For GitHub Copilot in VS Code or compatible editors, you need to configure MCP servers.
Configuration:
Create or edit your MCP config file and add:
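A sketch following VS Code's `mcp.json` shape (a `servers` map with typed entries); verify against your editor's MCP documentation:

```json
{
  "servers": {
    "crawleo": {
      "type": "http",
      "url": "https://api.crawleo.dev/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```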
Complete example with multiple servers:
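(The second entry, `another-server`, is a hypothetical placeholder showing how additional servers sit alongside Crawleo:)

```json
{
  "servers": {
    "crawleo": {
      "type": "http",
      "url": "https://api.crawleo.dev/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    },
    "another-server": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "some-other-mcp-server"]
    }
  }
}
```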
**Steps:**

1. Open your GitHub Copilot MCP configuration
2. Add the Crawleo server configuration
3. Save the file
4. Restart VS Code or your IDE
5. GitHub Copilot can now use Crawleo for web searches
Example usage:
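A hypothetical prompt:

```
Search for the latest Express.js security advisories and crawl the top result.
```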
#### 5. OpenAI Platform (Direct Integration)
OpenAI now supports MCP servers directly! Here's how to use Crawleo with OpenAI's API:
Python Example:
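A minimal sketch (not an official snippet): it builds the Responses API payload that registers Crawleo as an MCP tool, following OpenAI's MCP tool schema and the parameters listed below. The model name is illustrative; with the official `openai` package you would pass the payload to `client.responses.create(**payload)`.

```python
import json

def build_request(api_key: str, prompt: str) -> dict:
    """Build a Responses API payload that attaches Crawleo's MCP server as a tool."""
    return {
        "model": "gpt-4.1",  # illustrative model name
        "input": prompt,
        "tools": [
            {
                "type": "mcp",
                "server_label": "crawleo",
                "server_url": "https://api.crawleo.dev/mcp",
                "authorization": api_key,  # your sk_... key, no "Bearer " prefix here
                "allowed_tools": ["web.search", "web.crawl"],
                "require_approval": "never",
            }
        ],
    }

if __name__ == "__main__":
    payload = build_request("sk_YOUR_KEY", "Find the latest Node.js LTS version")
    # With the official SDK: openai.OpenAI().responses.create(**payload)
    print(json.dumps(payload, indent=2))
```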
**Key Parameters:**

- `server_url` - the Crawleo MCP endpoint
- `authorization` - your Crawleo API key
- `allowed_tools` - enable `web.search` and/or `web.crawl`
- `require_approval` - set to `"always"`, `"never"`, or `"conditional"`
Node.js Example:
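The same sketch in JavaScript, mirroring the Python example (field layout follows OpenAI's MCP tool schema; with the official `openai` package you would `await client.responses.create(buildRequest(...))`):

```javascript
// Build a Responses API payload that attaches Crawleo's MCP server as a tool.
function buildRequest(apiKey, prompt) {
  return {
    model: "gpt-4.1", // illustrative model name
    input: prompt,
    tools: [
      {
        type: "mcp",
        server_label: "crawleo",
        server_url: "https://api.crawleo.dev/mcp",
        authorization: apiKey, // your sk_... key, no "Bearer " prefix here
        allowed_tools: ["web.search", "web.crawl"],
        require_approval: "never",
      },
    ],
  };
}

// With the official SDK: await new OpenAI().responses.create(buildRequest(...))
console.log(JSON.stringify(buildRequest("sk_YOUR_KEY", "Summarize https://example.com"), null, 2));
```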
## Available Tools

### web.search

Search the web in real time with customizable parameters.

**Parameters:**

- `query` (required) - search term
- `max_pages` - number of result pages (default: 1)
- `setLang` - language code (e.g., "en", "ar")
- `cc` - country code (e.g., "US", "EG")
- `device` - device type: "desktop", "mobile", or "tablet" (default: "desktop")
- `enhanced_html` - return clean HTML (default: true)
- `raw_html` - return raw HTML (default: false)
- `markdown` - return Markdown (default: true)
- `page_text` - return plain text (default: false)
- `auto_crawling` - auto-crawl result URLs (default: false)
Example:
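A hypothetical call might pass arguments like (values are illustrative):

```json
{
  "query": "latest Node.js LTS release",
  "max_pages": 1,
  "setLang": "en",
  "cc": "US",
  "device": "desktop",
  "markdown": true
}
```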
### web.crawl

Extract content from specific URLs.

**Parameters:**

- `urls` (required) - list of URLs to crawl
- `rawHtml` - return raw HTML (default: false)
- `markdown` - convert to Markdown (default: false)
- `screenshot` - capture a screenshot (optional)
- `country` - geographic location for the request
Example:
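A hypothetical call might pass (values are illustrative):

```json
{
  "urls": ["https://example.com/docs"],
  "markdown": true,
  "country": "US"
}
```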
## Troubleshooting

### MCP server not appearing

- **Check the config file location** - make sure you're editing the correct file
- **Verify JSON syntax** - use a JSON validator to check for syntax errors
- **Restart the application** - quit completely and reopen (not just reload)
- **Check your API key** - ensure it is valid and active at crawleo.dev
### Authentication errors

- Verify your API key is correct (it should start with `sk_`)
- Make sure the key is wrapped in quotes
- Check that the "Bearer " prefix is included in the Authorization header (for Claude/Cursor/Windsurf)
- For the OpenAI Platform, use the key directly in the `authorization` field
- Confirm your account has available credits at crawleo.dev
### No results returned

- Check your internet connection
- Verify the search query is not empty
- Try a simpler search query first
- Check the API status at crawleo.dev
### Tool names not recognized

Make sure you're using the correct tool names:

- Use `web.search` (not `search_web`)
- Use `web.crawl` (not `crawl_web`)
## Usage Examples

Crawleo supports a range of workflows, including:

- Research assistance
- Content analysis
- Code documentation lookup
- News monitoring
- Market research
## Pricing

Crawleo MCP uses the same affordable pricing as our API:

- 10,000 searches → $20
- 100,000 searches → $100
- 250,000 searches → $200

Check your usage and manage your subscription at crawleo.dev.
## Privacy & Security

- ✅ **Zero data retention** - we never store your search queries or results
- ✅ **Secure authentication** - API keys are transmitted over HTTPS
- ✅ **No tracking** - your usage patterns remain private
## Support

- Documentation: crawleo.dev/docs
- API Status: crawleo.dev/status
- Contact: support@crawleo.dev
## Links

- 🌐 Website: crawleo.dev
- 📚 Documentation: crawleo.dev/docs
- 🔑 Get API Key: crawleo.dev