
Bing Webmaster Tools MCP Server

by zizzfizzix

get_crawl_stats

Get daily crawl statistics for any site by submitting its URL. Monitor Bingbot activity and crawl performance across a custom date range.

Instructions

Retrieve crawl statistics for a specific site within a date range.

Args:
    site_url: The URL of the site

Returns:
    List[CrawlStats]: A list of daily crawl statistics

Raises:
    BingWebmasterError: If statistics cannot be retrieved
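Under the hood, an MCP client reaches this tool through a standard JSON-RPC 2.0 `tools/call` request. A minimal sketch of the wire-level payload, using a placeholder site URL:

```python
import json

# Sketch of the JSON-RPC 2.0 request an MCP client sends to invoke
# this tool over stdio; the site URL below is a placeholder.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_crawl_stats",
        "arguments": {"site_url": "https://example.com"},
    },
}

print(json.dumps(request, indent=2))
```

The server responds with a matching JSON-RPC result whose content carries the serialized list of daily crawl statistics.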

Input Schema

Name      Required  Description  Default
self      Yes
site_url  Yes

Output Schema

Name    Required  Description  Default
result  Yes
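Taken together, the schemas imply that a caller supplies only `site_url`; the `self` entry is an artifact of the generated schema and is stripped by the server-side wrapper shown in the implementation below. A minimal sketch of a valid arguments payload, with a placeholder URL:

```python
# Minimal arguments payload for this tool; the URL is a placeholder.
arguments = {"site_url": "https://example.com"}

# 'self' appears in the generated input schema but is filtered out by
# the server-side wrapper, so callers should not supply it.
required = {"site_url"}
print(required <= arguments.keys())  # True
```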

Implementation Reference

  • The `wrap_service_method` helper dynamically registers service methods as MCP tools. When called for 'get_crawl_stats', it wraps the `get_crawl_stats` method from the `crawling.CrawlingService` class.
    def wrap_service_method(
        mcp: FastMCP, service: BingWebmasterService, service_attr: str, method_name: str
    ) -> Callable[..., Any]:
        """Helper function to wrap a service method with mcp.tool() while preserving its signature and docstring.
    
        Args:
            mcp: The MCP server instance
            service: The BingWebmasterService instance
            service_attr: The service attribute name (e.g., 'sites', 'submission')
            method_name: The method name to wrap
    
        Returns:
            The wrapped method as an MCP tool
        """
        # Get the service class from our mapping
        service_class = SERVICE_CLASSES[service_attr]
        # Get the original method
        original_method = getattr(service_class, method_name)
        # Get the signature
        sig = inspect.signature(original_method)
        # Remove 'self' parameter from signature
        parameters = list(sig.parameters.values())[1:]  # Skip 'self'
    
        # Create new signature without 'self'
        new_sig = sig.replace(parameters=parameters)
    
        # Create wrapper function with same signature
        @mcp.tool()
        @wraps(original_method)
        async def wrapper(*args: Any, **kwargs: Any) -> Any:
            # Filter out any 'self' arguments that might be passed by the MCP client
            kwargs = {k: v for k, v in kwargs.items() if k != "self"}
    
            async with service as s:
                service_obj = getattr(s, service_attr)
                # Get the method from the instance
                method = getattr(service_obj, method_name)
                # Call the method directly - it's already bound to the instance
                return await method(*args, **kwargs)
    
        # Copy signature and docstring
        wrapper.__signature__ = new_sig  # type: ignore
        wrapper.__doc__ = original_method.__doc__
    
        return wrapper
  • Registration of the 'get_crawl_stats' tool. It calls wrap_service_method(mcp, service, 'crawling', 'get_crawl_stats') to wrap the crawling service's get_crawl_stats method as an MCP tool.
    get_crawl_stats = wrap_service_method(mcp, service, "crawling", "get_crawl_stats")  # noqa: F841
  • The `BingWebmasterService` instantiates `crawling.CrawlingService(self.client)` as `self.crawling`, which is the service object whose `get_crawl_stats` method is ultimately called.
    self.crawling = crawling.CrawlingService(self.client)
    self.keywords = keyword_analysis.KeywordAnalysisService(self.client)
  • Mapping of 'crawling' service attribute name to `crawling.CrawlingService` class (from the external bing_webmaster_tools package), used by wrap_service_method.
    "crawling": crawling.CrawlingService,
    "keywords": keyword_analysis.KeywordAnalysisService,
  • Entry point where `add_bing_webmaster_tools` is called, which registers all tools including get_crawl_stats.
    add_bing_webmaster_tools(mcp, bing_service)
    
    
    def app() -> None:
        """Command-line interface entry point."""
        mcp.run(transport="stdio")
    
    
    if __name__ == "__main__":
        app()
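The signature-rewriting trick used by `wrap_service_method` can be reproduced in isolation. A self-contained sketch (with a toy `Service` class standing in for the real `CrawlingService`) showing how `inspect.signature(...).replace(...)` drops `self` so the exposed tool signature matches what clients should send:

```python
import asyncio
import inspect
from functools import wraps


class Service:
    """Toy stand-in for a bing_webmaster_tools service class."""

    async def get_stats(self, site_url: str) -> list:
        return [{"site": site_url}]


original = Service.get_stats
sig = inspect.signature(original)
params = list(sig.parameters.values())[1:]  # skip 'self'
new_sig = sig.replace(parameters=params)


@wraps(original)
async def wrapper(*args, **kwargs):
    # Mirror the wrapper above: drop any stray 'self' sent by a client.
    kwargs = {k: v for k, v in kwargs.items() if k != "self"}
    return await original(Service(), *args, **kwargs)


# __signature__ takes precedence over the metadata copied by @wraps,
# so introspection (and hence the MCP tool schema) no longer sees 'self'.
wrapper.__signature__ = new_sig

print(inspect.signature(wrapper))  # (site_url: str) -> list
print(asyncio.run(wrapper(site_url="https://example.com")))
```

This is why `self` shows up in the generated input schema only as a vestige: the wrapper both hides it from the signature and filters it from incoming keyword arguments.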
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations are provided, and the description does not disclose behavioral traits such as read-only nature, authentication needs, or rate limits. The only hint is the verb 'retrieve', which implies a read operation.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 3/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is short but retains raw docstring formatting (Args, Returns, Raises). Leaving the 'self' parameter unexplained reduces completeness, but the overall length is acceptable.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

An output schema exists, but the description fails to explain the vestigial 'self' parameter, the implied date range, or how site_url should be formatted. It lacks sufficient context for correct invocation.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 2/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 0%. The description explains only 'site_url' (the URL of the site) and omits the required 'self' parameter entirely. It also mentions a date range not present in the schema, causing confusion.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 3/5

Does the description clearly state what the tool does and how it differs from similar tools?

States 'Retrieve crawl statistics for a specific site', which identifies a verb and a resource, but the mention of 'within a date range' despite the absence of date parameters creates ambiguity. Sibling tools like get_crawl_issues exist, yet the description offers no differentiation.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description gives no guidance on when to prefer this tool over siblings like get_crawl_issues or get_crawl_settings, and does not specify prerequisites or alternatives.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
