search_bookmarks_extended
Search Pinboard bookmarks by title, notes, or tags, with a configurable date range and result limit, for comprehensive historical retrieval.
Instructions
Extended search for comprehensive historical results across titles, notes, and tags.
Args:
- `query`: Search query to match against bookmark titles, notes, and tags
- `days_back`: How many days back to search (1-730; default 365 = 1 year)
- `limit`: Maximum number of results to return (1-200; default 100)

Note: Provides comprehensive results while being mindful of server load. Use tag-based searches for the most efficient access to historical bookmarks.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| query | Yes | Search query matched against bookmark titles, notes, and tags | |
| days_back | No | How many days back to search (1-730) | 365 |
| limit | No | Maximum number of results to return (1-200) | 100 |
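The handler rejects out-of-range values before calling the Pinboard client. A minimal sketch of those range checks as a standalone function (the function name here is illustrative, not part of the server's API):

```python
def validate_search_args(days_back: int = 365, limit: int = 100) -> None:
    """Mirror the range checks the tool applies to its optional arguments."""
    if not 1 <= days_back <= 730:
        raise ValueError("Days back must be between 1 and 730 (2 years max)")
    if not 1 <= limit <= 200:
        raise ValueError("Limit must be between 1 and 200")

# Defaults and in-range values pass silently; anything outside raises ValueError.
validate_search_args()                        # days_back=365, limit=100: OK
validate_search_args(days_back=30, limit=25)  # OK
```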
Implementation Reference
- src/pinboard_mcp_server/main.py:42-70 (handler): the MCP tool handler that validates inputs, invokes the client method, and returns a formatted dictionary with the search results, including the bookmarks, total count, query, and days_back.

```python
@mcp.tool
async def search_bookmarks_extended(
    query: str, days_back: int = 365, limit: int = 100
) -> dict[str, Any]:
    """Extended search for comprehensive historical results across titles, notes, and tags.

    Args:
        query: Search query to match against bookmark titles, notes, and tags
        days_back: How many days back to search (1-730, default 365 = 1 year)
        limit: Maximum number of results to return (1-200, default 100)

    Note: Provides comprehensive results while being mindful of server load.
    Use tag-based searches for most efficient access to historical bookmarks.
    """
    if not 1 <= days_back <= 730:
        raise ValueError("Days back must be between 1 and 730 (2 years max)")
    if not 1 <= limit <= 200:
        raise ValueError("Limit must be between 1 and 200")

    bookmarks = await client.search_bookmarks_extended(
        query=query, days_back=days_back, limit=limit
    )
    return {
        "bookmarks": [bookmark.model_dump() for bookmark in bookmarks],
        "total": len(bookmarks),
        "query": query,
        "days_back": days_back,
    }
```
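As a rough illustration of what callers receive, a successful call returns a dictionary with the four keys built by the handler. The individual bookmark fields shown here are assumptions; the real fields come from the server's `Bookmark` model via `model_dump()`:

```python
# Illustrative response shape; bookmark fields are hypothetical examples.
example_response = {
    "bookmarks": [
        {
            "title": "Example bookmark",
            "notes": "Saved for later",
            "tags": ["python", "search"],
        }
    ],
    "total": 1,          # len(bookmarks)
    "query": "python",   # echoed back from the request
    "days_back": 365,    # echoed back from the request
}
```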
- Core implementation in `PinboardClient` that handles caching, efficient tag searches, extended API calls to Pinboard for historical data, filtering logic, and sorting results by recency.

```python
async def search_bookmarks_extended(
    self, query: str, days_back: int = 365, limit: int = 100
) -> list[Bookmark]:
    """Extended search that looks further back in time for comprehensive results.

    Args:
        query: Search query to match against bookmark titles, notes, and tags
        days_back: How many days back to search (default 1 year)
        limit: Maximum number of results to return (default 100)

    Note: This provides generous data for LLM filtering and analysis.
    Returns comprehensive results that the client can intelligently filter.
    """
    cache_key = f"extended_search:{query}:{days_back}:{limit}"
    if cache_key in self._query_cache:
        return self._query_cache[cache_key]

    query_lower = query.lower()

    # First check if this is an exact tag match - use efficient tag search
    tags = await self.get_all_tags()
    exact_tag_match = next(
        (tag.tag for tag in tags if tag.tag.lower() == query_lower), None
    )

    matches: list[Bookmark] = []
    if exact_tag_match:
        # Use efficient tag-based search for exact matches
        try:
            await self._search_by_tag_direct(
                exact_tag_match, matches, None, None, limit
            )
        except Exception:
            pass

    # If no tag match or tag search failed, do extended time-based search
    if not matches:

        def _get_extended_posts() -> Any:
            self._rate_limit_sync()
            from_date = datetime.now() - timedelta(days=days_back)
            return self._pb.posts.all(
                fromdt=from_date.strftime("%Y-%m-%dT%H:%M:%SZ")
            )

        result: Any = await self._run_in_executor(_get_extended_posts)
        posts_list = result if isinstance(result, list) else []

        # Search through the extended results
        for post in posts_list:
            if len(matches) >= limit:
                break
            bookmark = Bookmark.from_pinboard(self._convert_pinboard_bookmark(post))
            if (
                query_lower in bookmark.title.lower()
                or query_lower in bookmark.notes.lower()
                or any(query_lower in tag.lower() for tag in bookmark.tags)
            ):
                matches.append(bookmark)

    # Sort by most recent first
    matches.sort(key=lambda b: b.saved_at, reverse=True)

    # Cache the result
    self._query_cache[cache_key] = matches
    return matches
```
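When no exact tag match is found, the client falls back to a case-insensitive substring match over title, notes, and tags. A self-contained sketch of that predicate, using a stand-in dataclass in place of the server's `Bookmark` model:

```python
from dataclasses import dataclass, field

@dataclass
class Bookmark:
    """Stand-in for the server's Bookmark model (illustrative fields only)."""
    title: str
    notes: str = ""
    tags: list[str] = field(default_factory=list)

def matches_query(bookmark: Bookmark, query: str) -> bool:
    """Case-insensitive substring match over title, notes, and tags."""
    query_lower = query.lower()
    return (
        query_lower in bookmark.title.lower()
        or query_lower in bookmark.notes.lower()
        or any(query_lower in tag.lower() for tag in bookmark.tags)
    )

b = Bookmark(title="Async Python patterns", tags=["python", "asyncio"])
print(matches_query(b, "ASYNC"))  # True: matches the title, case-insensitively
print(matches_query(b, "rust"))   # False: no field contains "rust"
```

Because the match is a plain substring test, a query like "async" also hits the tag "asyncio", which is consistent with the generous, LLM-filterable results the docstring describes.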