# MCP Webscan Server
A Model Context Protocol (MCP) server for web content scanning and analysis. This server provides tools for fetching, analyzing, and extracting information from web pages.
## Features

- **Page Fetching**: Convert web pages to Markdown for easy analysis
- **Link Extraction**: Extract and analyze links from web pages
- **Site Crawling**: Recursively crawl websites to discover content
- **Link Checking**: Identify broken links on web pages
- **Pattern Matching**: Find URLs matching specific patterns
- **Sitemap Generation**: Generate XML sitemaps for websites
## Installation
### Installing via Smithery
To install Webscan for Claude Desktop automatically via Smithery:
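The usual Smithery CLI pattern is shown below; the package identifier here is an assumption, so substitute the server's actual Smithery name if it differs:

```bash
npx -y @smithery/cli install mcp-server-webscan --client claude
```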
### Manual Installation
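A typical clone-and-build sequence looks like this (the repository URL is a placeholder):

```bash
git clone https://github.com/your-org/mcp-server-webscan.git
cd mcp-server-webscan

# Install dependencies and compile the TypeScript sources
npm install
npm run build
```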
## Usage
### Starting the Server
The server runs on stdio transport, making it compatible with MCP clients like Claude Desktop.
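Assuming the build outputs to `build/index.js` (the conventional layout for TypeScript MCP servers, though the exact path is an assumption), you can start the server directly:

```bash
node build/index.js
```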
### Available Tools
#### fetch-page

Fetches a web page and converts it to Markdown.

Parameters:

- `url` (required): URL of the page to fetch.
- `selector` (optional): CSS selector to target specific content.
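A sample `tools/call` payload (the URL and selector values are illustrative):

```json
{
  "name": "fetch-page",
  "arguments": {
    "url": "https://example.com/docs",
    "selector": "main article"
  }
}
```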
#### extract-links

Extracts all links from a web page with their text.

Parameters:

- `url` (required): URL of the page to analyze.
- `baseUrl` (optional): Base URL to filter links.
- `limit` (optional, default: 100): Maximum number of links to return.
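An illustrative invocation:

```json
{
  "name": "extract-links",
  "arguments": {
    "url": "https://example.com",
    "baseUrl": "https://example.com",
    "limit": 50
  }
}
```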
#### crawl-site

Recursively crawls a website up to a specified depth.

Parameters:

- `url` (required): Starting URL to crawl.
- `maxDepth` (optional, default: 2): Maximum crawl depth (0-5).
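For example, to crawl three levels deep (values illustrative):

```json
{
  "name": "crawl-site",
  "arguments": {
    "url": "https://example.com",
    "maxDepth": 3
  }
}
```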
#### check-links

Checks for broken links on a page.

Parameters:

- `url` (required): URL to check links for.
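For example:

```json
{
  "name": "check-links",
  "arguments": {
    "url": "https://example.com"
  }
}
```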
#### find-patterns

Finds URLs matching a specific pattern.

Parameters:

- `url` (required): URL to search in.
- `pattern` (required): JavaScript-compatible regex pattern to match URLs against.
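For example, to find links to PDF files (the pattern shown is illustrative):

```json
{
  "name": "find-patterns",
  "arguments": {
    "url": "https://example.com/downloads",
    "pattern": ".*\\.pdf$"
  }
}
```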
#### generate-site-map

Generates a simple XML sitemap by crawling.

Parameters:

- `url` (required): Root URL for sitemap crawl.
- `maxDepth` (optional, default: 2): Maximum crawl depth for discovering URLs (0-5).
- `limit` (optional, default: 1000): Maximum number of URLs to include in the sitemap.
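An illustrative invocation:

```json
{
  "name": "generate-site-map",
  "arguments": {
    "url": "https://example.com",
    "maxDepth": 2,
    "limit": 500
  }
}
```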
## Example Usage with Claude Desktop
Configure the server in your Claude Desktop settings:
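A minimal `claude_desktop_config.json` entry might look like the following (the server key and the path to the built entry point are placeholders):

```json
{
  "mcpServers": {
    "webscan": {
      "command": "node",
      "args": ["/path/to/mcp-server-webscan/build/index.js"]
    }
  }
}
```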
Use the tools in your conversations:
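For instance, an illustrative prompt might be:

```
Please fetch https://example.com, summarize the page, and then check it for broken links.
```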
## Development
### Prerequisites

- Node.js >= 18
- npm
### Project Structure (Post-Refactor)
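The original directory tree was not preserved here; the sketch below is a hypothetical layout consistent with the tool names above:

```
src/
├── index.ts      # Server entry point: registers tools and starts the stdio transport
├── tools/        # One module per tool (fetch-page, extract-links, crawl-site, ...)
└── utils/        # Shared helpers for fetching, parsing, and URL handling
```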
### Building
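In the usual npm setup (the script name is assumed):

```bash
npm run build
```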
### Development Mode
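A watch-mode script is typical for this kind of project, though the exact script name is an assumption:

```bash
npm run dev
```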
### Running evals

The evals package loads an MCP client that then runs the `index.ts` file directly, so there is no need to rebuild between tests. You can load environment variables by prefixing the `npx` command; see the evals package's own documentation for full details.
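A typical invocation looks like the following (the eval file path and the required API key are assumptions based on common mcp-evals setups):

```bash
OPENAI_API_KEY=your-key npx mcp-eval src/evals/evals.ts src/index.ts
```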
## Error Handling

The server implements comprehensive error handling:

- Invalid parameters
- Network errors
- Content parsing errors
- URL validation

All errors are properly formatted according to the MCP specification.
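Under the MCP specification, tool-level failures are returned as results with `isError` set, along the lines of this sketch (the error text is illustrative):

```json
{
  "content": [
    {
      "type": "text",
      "text": "Error: invalid URL 'htp://example.com'"
    }
  ],
  "isError": true
}
```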
## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License
MIT License - see the LICENSE file for details