# get_trending_models
Discover trending AI models from Hugging Face to identify popular tools for tasks like text-generation or image-classification.
## Instructions

Get trending AI models from Hugging Face
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| task | No | Filter by task (e.g., 'text-generation', 'image-classification') | |
| sort | No | Sort criterion | downloads |
| limit | No | Maximum number of results | 30 |
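To illustrate how the defaults in the table apply, the sketch below merges caller-supplied arguments with the schema's `default` values. The schema fragment is condensed from this tool's `inputSchema`; the `with_defaults` helper is illustrative and not part of the server.

```python
# Condensed copy of the tool's input schema (descriptions omitted).
INPUT_SCHEMA = {
    "type": "object",
    "properties": {
        "task": {"type": "string"},
        "sort": {"type": "string", "default": "downloads"},
        "limit": {"type": "integer", "default": 30},
    },
}

def with_defaults(args: dict) -> dict:
    """Fill in any omitted arguments from the schema defaults (hypothetical helper)."""
    merged = dict(args)
    for name, spec in INPUT_SCHEMA["properties"].items():
        if name not in merged and "default" in spec:
            merged[name] = spec["default"]
    return merged

print(with_defaults({"task": "text-generation"}))
# → {'task': 'text-generation', 'sort': 'downloads', 'limit': 30}
```

Note that `task` has no default: when omitted, the listing is not filtered by task at all.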
## Implementation Reference
- **src/ai_research_mcp/server.py:458-478 (handler)**: The primary handler for the `get_trending_models` tool. It manages caching, calls the Hugging Face client method, and formats the output using `_format_models`.

  ```python
  async def _get_trending_models(
      self,
      task: Optional[str] = None,
      sort: str = "downloads",
      limit: int = 30,
  ) -> str:
      """Get trending models from Hugging Face."""
      cache_key = f"hf_models_{task}_{sort}"
      cached = self.cache.get(cache_key, 3600)
      if cached:
          models = cached
      else:
          models = await asyncio.to_thread(
              self.huggingface.get_trending_models,
              task=task,
              sort=sort,
              limit=limit,
          )
          self.cache.set(cache_key, models)
      return self._format_models(models)
  ```
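The cache object used by the handler is not shown in this section; only its `get(key, max_age)` / `set(key, value)` shape is visible from the call sites. A minimal time-based cache matching that usage could look like the following. This is a sketch consistent with how the handler calls it, not the project's actual implementation.

```python
import time
from typing import Any, Dict, Optional, Tuple

class TTLCache:
    """Minimal sketch of a cache with the get(key, max_age)/set(key, value)
    interface the handler uses. Hypothetical, for illustration only."""

    def __init__(self) -> None:
        self._store: Dict[str, Tuple[float, Any]] = {}

    def get(self, key: str, max_age: float) -> Optional[Any]:
        entry = self._store.get(key)
        if entry is None:
            return None
        stored_at, value = entry
        if time.monotonic() - stored_at > max_age:
            # Entry is older than max_age seconds: treat as a miss.
            del self._store[key]
            return None
        return value

    def set(self, key: str, value: Any) -> None:
        self._store[key] = (time.monotonic(), value)
```

With this shape, the handler's `self.cache.get(cache_key, 3600)` returns the cached model list for up to an hour, after which the Hugging Face API is queried again.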
- **src/ai_research_mcp/server.py:159-182 (registration)**: Registration of the `get_trending_models` tool, including name, description, and input schema definition.

  ```python
  Tool(
      name="get_trending_models",
      description="Get trending AI models from Hugging Face",
      inputSchema={
          "type": "object",
          "properties": {
              "task": {
                  "type": "string",
                  "description": "Filter by task (e.g., 'text-generation', 'image-classification')",
              },
              "sort": {
                  "type": "string",
                  "enum": ["downloads", "likes", "trending", "created"],
                  "description": "Sort criterion",
                  "default": "downloads",
              },
              "limit": {
                  "type": "integer",
                  "description": "Maximum number of results",
                  "default": 30,
              },
          },
      },
  ),
  ```
- **HuggingFaceClient.get_trending_models (helper)**: Helper method in `HuggingFaceClient` that fetches and processes trending models using `HfApi.list_models`. Called by the server handler.

  ```python
  def get_trending_models(
      self,
      task: Optional[str] = None,
      library: Optional[str] = None,
      sort: str = "downloads",
      limit: int = 50,
  ) -> List[Dict]:
      """Get trending models from Hugging Face.

      Args:
          task: Filter by task (e.g., 'text-generation', 'image-classification')
          library: Filter by library (e.g., 'pytorch', 'transformers')
          sort: Sort by 'downloads', 'likes', 'trending', or 'created'
          limit: Maximum number of results

      Returns:
          List of model dictionaries
      """
      try:
          models = self.api.list_models(
              filter=task,
              library=library,
              sort=sort,
              direction=-1,
              limit=limit,
          )
          results = []
          for model in models:
              # Get model info
              model_info = {
                  "id": model.id,
                  "author": model.author if hasattr(model, "author") else model.id.split("/")[0],
                  "model_name": model.modelId if hasattr(model, "modelId") else model.id.split("/")[-1],
                  "url": f"https://huggingface.co/{model.id}",
                  "downloads": model.downloads if hasattr(model, "downloads") else 0,
                  "likes": model.likes if hasattr(model, "likes") else 0,
                  "tags": model.tags if hasattr(model, "tags") else [],
                  "pipeline_tag": model.pipeline_tag if hasattr(model, "pipeline_tag") else None,
                  "library": model.library_name if hasattr(model, "library_name") else None,
                  "created_at": model.created_at.isoformat()
                  if hasattr(model, "created_at") and model.created_at
                  else None,
                  "last_modified": model.last_modified.isoformat()
                  if hasattr(model, "last_modified") and model.last_modified
                  else None,
                  "source": "huggingface",
              }
              results.append(model_info)
          return results
      except Exception as e:
          print(f"Error fetching models: {e}")
          return []
  ```
- **_format_models (formatter)**: Helper method that formats the list of models into a markdown string for the tool response.

  ```python
  def _format_models(self, models: List[Dict]) -> str:
      """Format models as markdown."""
      if not models:
          return "*No models found.*"
      lines = []
      for i, model in enumerate(models, 1):
          model_id = model.get("id", "Unknown")
          url = model.get("url", "")
          downloads = model.get("downloads", 0)
          likes = model.get("likes", 0)
          task = model.get("pipeline_tag", "")
          lines.append(f"### {i}. [{model_id}]({url})")
          lines.append(f"📥 {downloads:,} downloads • ❤️ {likes} likes")
          if task:
              lines.append(f"Task: `{task}`")
          lines.append("")
      return "\n".join(lines)
  ```
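To show what the tool actually returns, the snippet below runs the same formatting logic as a standalone function on a hypothetical model dictionary shaped like the ones the client helper builds. The model data (`example-org/example-model` and its counts) is invented for illustration.

```python
from typing import Dict, List

def format_models(models: List[Dict]) -> str:
    """Standalone copy of the _format_models logic, for illustration."""
    if not models:
        return "*No models found.*"
    lines = []
    for i, model in enumerate(models, 1):
        model_id = model.get("id", "Unknown")
        url = model.get("url", "")
        downloads = model.get("downloads", 0)
        likes = model.get("likes", 0)
        task = model.get("pipeline_tag", "")
        lines.append(f"### {i}. [{model_id}]({url})")
        lines.append(f"📥 {downloads:,} downloads • ❤️ {likes} likes")
        if task:
            lines.append(f"Task: `{task}`")
        lines.append("")
    return "\n".join(lines)

# Hypothetical entry, shaped like the dictionaries built by get_trending_models.
sample = [{
    "id": "example-org/example-model",
    "url": "https://huggingface.co/example-org/example-model",
    "downloads": 1234567,
    "likes": 890,
    "pipeline_tag": "text-generation",
}]
print(format_models(sample))
```

For this input, each entry renders as a numbered `###` heading linking to the model page, a stats line (`📥 1,234,567 downloads • ❤️ 890 likes`), and a `Task:` line when a pipeline tag is present.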