# webpage_scrape

Retrieve content from any webpage by providing its URL, with optional markdown output.
## Instructions

Scrape webpage by url
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| url | Yes | The url to scrape | |
| includeMarkdown | No | Include markdown in the response (boolean value as string: 'true' or 'false') | false |
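For illustration, the arguments an MCP client might pass to this tool could look like the following (the URL is a placeholder, not from the source). Note that `includeMarkdown` is a string flag, not a boolean:

```python
# Illustrative arguments for the webpage_scrape tool; the URL is an
# example value only. includeMarkdown must be the string 'true' or
# 'false', mirroring the pattern enforced by the input schema.
import re

arguments = {
    "url": "https://example.com",
    "includeMarkdown": "true",
}

# The schema constrains includeMarkdown with the regex ^(true|false)$.
assert re.fullmatch(r"^(true|false)$", arguments["includeMarkdown"]) is not None
```

Passing a real boolean (`True`/`False`) would fail the schema's pattern check, so clients should always send the lowercase string form.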
## Implementation Reference
- **src/serper_mcp_server/core.py:20-22 (handler):** The async handler that executes the webpage_scrape tool logic. It calls the Serper scrape endpoint (https://scrape.serper.dev) with the WebpageRequest payload via fetch_json. (Note the function is named `scape`, not `scrape`, in the source.)

  ```python
  async def scape(request: WebpageRequest) -> Dict[str, Any]:
      url = "https://scrape.serper.dev"
      return await fetch_json(url, request)
  ```

- **WebpageRequest (schema):** Pydantic input validation for the tool. Defines `url` (required) and `includeMarkdown` (optional, defaulting to `'false'` and constrained to the strings `'true'` or `'false'`).

  ```python
  class WebpageRequest(BaseModel):
      url: str = Field(..., description="The url to scrape")
      includeMarkdown: Optional[str] = Field(
          "false",
          pattern=r"^(true|false)$",
          description="Include markdown in the response (boolean value as string: 'true' or 'false')",
      )
  ```

- **src/serper_mcp_server/server.py:54-58 (registration):** Registers the webpage_scrape tool in the MCP server's list_tools() function, taking the tool name from the SerperTools.WEBPAGE_SCRAPE enum and the input schema from the Pydantic model.

  ```python
  tools.append(Tool(
      name=SerperTools.WEBPAGE_SCRAPE,
      description="Scrape webpage by url",
      inputSchema=WebpageRequest.model_json_schema(),
  ))
  ```

- **src/serper_mcp_server/server.py:68-71 (dispatch):** Tool-call dispatch logic. When the name matches 'webpage_scrape', it builds a WebpageRequest from the arguments and awaits the scape() core handler.

  ```python
  if name == SerperTools.WEBPAGE_SCRAPE.value:
      request = WebpageRequest(**arguments)
      result = await scape(request)
      return [TextContent(text=json.dumps(result, indent=2), type="text")]
  ```

- **src/serper_mcp_server/enums.py:17 (helper):** Enum member mapping SerperTools.WEBPAGE_SCRAPE to the string 'webpage_scrape', used for consistent tool-name references.

  ```python
  WEBPAGE_SCRAPE = "webpage_scrape"
  ```