
search-models

Find and filter models on Hugging Face Hub using queries, authors, and tags. Retrieve relevant models for tasks like text classification or translation with customizable result limits.

Instructions

Search for models on Hugging Face Hub

Input Schema

Name     Required  Description                                                     Default
query    No        Search term (e.g., 'bert', 'gpt')                               -
author   No        Filter by author/organization (e.g., 'huggingface', 'google')   -
tags     No        Filter by tags (e.g., 'text-classification', 'translation')     -
limit    No        Maximum number of results to return                             -
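Putting the parameters together, a call to search-models might pass arguments like the following. This is a hypothetical payload for illustration; only the parameter names and types come from the schema above, and every field is optional.

```python
import json

# Hypothetical arguments for a search-models call; all fields are optional.
arguments = {
    "query": "bert",                # free-text search term
    "author": "google",             # filter by author/organization
    "tags": "text-classification",  # filter by tag
    "limit": 5,                     # cap the number of results
}

print(json.dumps(arguments, indent=2))
```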

Implementation Reference

  • The handler logic for the 'search-models' tool. It parses input arguments, constructs query parameters, calls the Hugging Face API via make_hf_request, handles errors, formats model information, and returns JSON results as text content.
    if name == "search-models":
        query = arguments.get("query")
        author = arguments.get("author")
        tags = arguments.get("tags")
        limit = arguments.get("limit", 10)

        params = {"limit": limit}
        if query:
            params["search"] = query
        if author:
            params["author"] = author
        if tags:
            params["filter"] = tags

        data = await make_hf_request("models", params)

        if "error" in data:
            return [
                types.TextContent(
                    type="text", text=f"Error searching models: {data['error']}"
                )
            ]

        # Format the results
        results = []
        for model in data:
            model_info = {
                "id": model.get("id", ""),
                "name": model.get("modelId", ""),
                "author": model.get("author", ""),
                "tags": model.get("tags", []),
                "downloads": model.get("downloads", 0),
                "likes": model.get("likes", 0),
                "lastModified": model.get("lastModified", ""),
            }
            results.append(model_info)

        return [types.TextContent(type="text", text=json.dumps(results, indent=2))]
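The formatting step of the handler can be exercised on its own. The sketch below feeds a fabricated API response through the same field extraction; the sample model entry is invented for illustration and is not real Hub data.

```python
import json

# Fabricated sample of what the models endpoint might return (illustrative only).
data = [
    {
        "id": "demo-org/demo-bert",
        "modelId": "demo-bert",
        "author": "demo-org",
        "tags": ["fill-mask", "pytorch"],
        "downloads": 1000,
        "likes": 42,
        "lastModified": "2024-01-01T00:00:00.000Z",
    }
]

# Same extraction as the handler: missing fields fall back to safe defaults.
results = []
for model in data:
    results.append({
        "id": model.get("id", ""),
        "name": model.get("modelId", ""),
        "author": model.get("author", ""),
        "tags": model.get("tags", []),
        "downloads": model.get("downloads", 0),
        "likes": model.get("likes", 0),
        "lastModified": model.get("lastModified", ""),
    })

print(json.dumps(results, indent=2))
```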
  • Registration of the 'search-models' tool in the list_tools handler, including its description and input JSON schema.
    types.Tool(
        name="search-models",
        description="Search for models on Hugging Face Hub",
        inputSchema={
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "Search term (e.g., 'bert', 'gpt')",
                },
                "author": {
                    "type": "string",
                    "description": "Filter by author/organization (e.g., 'huggingface', 'google')",
                },
                "tags": {
                    "type": "string",
                    "description": "Filter by tags (e.g., 'text-classification', 'translation')",
                },
                "limit": {
                    "type": "integer",
                    "description": "Maximum number of results to return",
                },
            },
        },
    ),
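Because the registered inputSchema is ordinary JSON Schema, a client can type-check arguments before sending them. The stdlib-only sketch below implements just the "type" keyword checks this particular schema needs; it is a toy validator written for illustration, not a full JSON Schema implementation.

```python
# Minimal type map covering only the types used in the schema above.
TYPE_CHECKS = {"string": str, "integer": int}

def check_args(arguments, schema):
    """Return a list of type errors for the given arguments (toy validator)."""
    errors = []
    for name, value in arguments.items():
        prop = schema["properties"].get(name)
        if prop is None:
            errors.append(f"unknown property: {name}")
        elif not isinstance(value, TYPE_CHECKS[prop["type"]]):
            errors.append(f"{name}: expected {prop['type']}")
    return errors

# The same schema shape the tool registers (descriptions omitted).
schema = {
    "type": "object",
    "properties": {
        "query": {"type": "string"},
        "author": {"type": "string"},
        "tags": {"type": "string"},
        "limit": {"type": "integer"},
    },
}

print(check_args({"query": "bert", "limit": 10}, schema))  # valid: []
print(check_args({"limit": "ten"}, schema))                # type mismatch reported
```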
  • Helper function used by the search-models handler to make HTTP requests to the Hugging Face API endpoints.
    # Helper Functions
    async def make_hf_request(
        endpoint: str, params: Optional[Dict[str, Any]] = None
    ) -> Dict:
        """Make a request to the Hugging Face API with proper error handling."""
        url = f"{HF_API_BASE}/{endpoint}"
        try:
            response = await http_client.get(url, params=params)
            response.raise_for_status()
            return response.json()
        except Exception as e:
            return {"error": str(e)}
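The try/except pattern in the helper turns any transport failure into an {"error": ...} dict instead of raising, which is what lets the handler check "error" in data. The standalone sketch below reproduces that pattern with a stub client; FakeResponse, FakeClient, the base URL, and the sample payloads are all invented for illustration.

```python
import asyncio
from typing import Dict

class FakeResponse:
    """Stand-in for an HTTP response; raises on a non-2xx status."""
    def __init__(self, status, payload):
        self.status, self.payload = status, payload
    def raise_for_status(self):
        if self.status >= 400:
            raise RuntimeError(f"HTTP {self.status}")
    def json(self):
        return self.payload

class FakeClient:
    """Stand-in for an async HTTP client such as httpx.AsyncClient."""
    def __init__(self, responses):
        self.responses = responses
    async def get(self, url, params=None):
        return self.responses[url]

async def make_request(client, endpoint, params=None) -> Dict:
    """Same shape as make_hf_request: failures come back as a dict, not an exception."""
    url = f"https://example.invalid/api/{endpoint}"  # placeholder base URL
    try:
        response = await client.get(url, params=params)
        response.raise_for_status()
        return response.json()
    except Exception as e:
        return {"error": str(e)}

client = FakeClient({
    "https://example.invalid/api/models": FakeResponse(200, [{"id": "demo"}]),
    "https://example.invalid/api/broken": FakeResponse(503, None),
})
ok = asyncio.run(make_request(client, "models"))
bad = asyncio.run(make_request(client, "broken"))
print(ok)   # the payload on success
print(bad)  # an error dict on failure
```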


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/shreyaskarnik/huggingface-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.