
recommend_model

Find AI models for specific tasks like image generation or video creation. Describe your task to get ranked recommendations of suitable models from Fal.ai's collection.

Instructions

Get AI-powered model recommendations for a specific task. Describe what you want to do (e.g., 'generate portrait photo', 'anime style illustration', 'product photography') and get the best-suited models ranked by relevance. Featured models by Fal.ai are prioritized.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| task | Yes | Description of your task (e.g., 'generate professional headshot', 'create anime character', 'transform photo to watercolor') | |
| category | No | Optional category hint to narrow search | |
| limit | No | Maximum number of recommendations | 5 |
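Combining the table above with the constraints from the tool's input schema (required `task`, `category` limited to image/video/audio, `limit` between 1 and 10 with a default of 5), a minimal client-side validation sketch might look like this. The function name `validate_arguments` is illustrative, not part of the server:

```python
from typing import Any, Dict


def validate_arguments(arguments: Dict[str, Any]) -> Dict[str, Any]:
    """Sketch: validate a recommend_model call against the documented schema."""
    task = arguments.get("task")
    if not task:
        raise ValueError("'task' is required")

    category = arguments.get("category")
    if category is not None and category not in ("image", "video", "audio"):
        raise ValueError("'category' must be one of 'image', 'video', 'audio'")

    limit = arguments.get("limit", 5)  # schema default
    if not 1 <= limit <= 10:
        raise ValueError("'limit' must be between 1 and 10")

    return {"task": task, "category": category, "limit": limit}


print(validate_arguments({"task": "generate professional headshot"}))
```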

Implementation Reference

  • The handler function for the 'recommend_model' tool. It extracts task, category, and limit from arguments, calls ModelRegistry.recommend_models, formats the recommendations into markdown text content, and handles errors or empty results.
```python
async def handle_recommend_model(
    arguments: Dict[str, Any],
    registry: ModelRegistry,
) -> List[TextContent]:
    """Handle the recommend_model tool."""
    task = arguments.get("task")
    if not task:
        return [
            TextContent(
                type="text",
                text="❌ Please describe your task (e.g., 'generate professional headshot').",
            )
        ]

    category = arguments.get("category")
    limit = arguments.get("limit", 5)

    recommend_result = await registry.recommend_models(
        task=task, category=category, limit=limit
    )
    recommendations = recommend_result.recommendations

    if not recommendations:
        return [
            TextContent(
                type="text",
                text=f"No models found for task: '{task}'. Try a different description or remove the category filter.",
            )
        ]

    # Format output
    lines = [f'## Recommended Models for: "{task}"\n']

    if recommend_result.used_fallback:
        lines.append(
            f"⚠️ *Using cached results ({recommend_result.fallback_reason}). Results may be less relevant.*\n"
        )

    lines.append("💡 *Models are ranked by relevance. ⭐ = Featured by Fal.ai*\n")

    for i, rec in enumerate(recommendations, 1):
        # rec is a dictionary from model_registry.recommend_models
        model_id = rec.get("model_id", "unknown")
        name = rec.get("name")
        description = rec.get("description")
        highlighted = rec.get("highlighted", False)
        group = rec.get("group")
        score = rec.get("score", 0.0)

        # Badge for highlighted models
        highlighted_badge = " ⭐" if highlighted else ""

        lines.append(f"### {i}. `{model_id}`{highlighted_badge}")
        if name:
            lines.append(f"**{name}**")
        if description:
            lines.append(f"{description}")
        if group:
            lines.append(f"*Family: {group}*")
        lines.append(f"*Relevance: {score:.1%}*\n")

    return [TextContent(type="text", text="\n".join(lines))]
```
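For a single recommendation, the handler's formatting loop produces markdown along these lines. The model data below is invented for illustration; only the formatting mirrors the handler:

```python
# Hypothetical recommendation dict, shaped like the handler's input.
rec = {
    "model_id": "fal-ai/flux/dev",
    "name": "FLUX.1 [dev]",
    "description": "Text-to-image model",
    "highlighted": True,
    "group": "FLUX",
    "score": 0.92,
}

lines = ['## Recommended Models for: "generate portrait photo"\n']
badge = " ⭐" if rec["highlighted"] else ""  # featured-model badge
lines.append(f"### 1. `{rec['model_id']}`{badge}")
lines.append(f"**{rec['name']}**")
lines.append(rec["description"])
lines.append(f"*Family: {rec['group']}*")
lines.append(f"*Relevance: {rec['score']:.1%}*\n")  # 0.92 -> 92.0%

output = "\n".join(lines)
print(output)
```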
  • The Tool schema definition for 'recommend_model', including inputSchema with required 'task' parameter and optional 'category' and 'limit'.
```python
Tool(
    name="recommend_model",
    description="Get AI-powered model recommendations for a specific task. Describe what you want to do (e.g., 'generate portrait photo', 'anime style illustration', 'product photography') and get the best-suited models ranked by relevance. Featured models by Fal.ai are prioritized.",
    inputSchema={
        "type": "object",
        "properties": {
            "task": {
                "type": "string",
                "description": "Description of your task (e.g., 'generate professional headshot', 'create anime character', 'transform photo to watercolor')",
            },
            "category": {
                "type": "string",
                "enum": ["image", "video", "audio"],
                "description": "Optional category hint to narrow search",
            },
            "limit": {
                "type": "integer",
                "default": 5,
                "minimum": 1,
                "maximum": 10,
                "description": "Maximum number of recommendations",
            },
        },
        "required": ["task"],
    },
),
```
  • Registration of the 'recommend_model' handler in the TOOL_HANDLERS dictionary used by the MCP server to route tool calls. The handler is imported earlier in the file.
```python
TOOL_HANDLERS = {
    # Utility tools (no queue needed)
    "list_models": handle_list_models,
    "recommend_model": handle_recommend_model,
    "get_pricing": handle_get_pricing,
    "get_usage": handle_get_usage,
    "upload_file": handle_upload_file,
    # Image generation tools
    "generate_image": handle_generate_image,
    "generate_image_structured": handle_generate_image_structured,
    "generate_image_from_image": handle_generate_image_from_image,
    # Image editing tools
    "remove_background": handle_remove_background,
    "upscale_image": handle_upscale_image,
    "edit_image": handle_edit_image,
    "inpaint_image": handle_inpaint_image,
    "resize_image": handle_resize_image,
    "compose_images": handle_compose_images,
    # Video tools
    "generate_video": handle_generate_video,
    "generate_video_from_image": handle_generate_video_from_image,
    "generate_video_from_video": handle_generate_video_from_video,
    # Audio tools
    "generate_music": handle_generate_music,
}
```
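Routing through such a dictionary is a plain name-to-coroutine lookup. The sketch below shows the pattern with a toy handler; `route_tool_call` and the handler body are assumptions for illustration, not the server's actual dispatch code:

```python
import asyncio
from typing import Any, Awaitable, Callable, Dict


# Hypothetical stand-in for a real handler, for illustration only.
async def handle_recommend_model(arguments: Dict[str, Any]) -> str:
    return f"recommendations for {arguments['task']}"


TOOL_HANDLERS: Dict[str, Callable[..., Awaitable[str]]] = {
    "recommend_model": handle_recommend_model,
}


async def route_tool_call(name: str, arguments: Dict[str, Any]) -> str:
    """Sketch: look up the handler by tool name and await it."""
    handler = TOOL_HANDLERS.get(name)
    if handler is None:
        raise ValueError(f"Unknown tool: {name}")
    return await handler(arguments)


result = asyncio.run(route_tool_call("recommend_model", {"task": "anime character"}))
print(result)
```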
  • Core helper method in ModelRegistry that performs semantic search based on the task, builds recommendations with metadata and reasoning, and handles fallback to cached results.
```python
async def recommend_models(
    self,
    task: str,
    category: Optional[str] = None,
    limit: int = 5,
) -> RecommendationResult:
    """
    Recommend the best models for a given task.

    Uses semantic search and prioritizes highlighted (featured) models.
    Returns models with relevance reasoning.

    Args:
        task: Description of the task (e.g., "generate professional headshot")
        category: Optional category hint ("image", "video", "audio")
        limit: Maximum number of recommendations

    Returns:
        RecommendationResult with recommendations and fallback indicator
    """
    # Map simplified category to API category if provided
    api_category = None
    if category:
        # Use primary mapping for each simplified category
        category_to_api = {
            "image": "text-to-image",
            "video": "text-to-video",
            "audio": "text-to-audio",
        }
        api_category = category_to_api.get(category)

    # Search using the task as query
    search_result = await self.search_models(
        query=task, category=api_category, limit=limit * 2
    )

    # Build recommendations with reasoning
    recommendations: List[Dict[str, Any]] = []
    for model in search_result.models[:limit]:
        rec = {
            "model_id": model.id,
            "name": model.name,
            "description": model.description,
            "category": model.category,
            "highlighted": model.highlighted,
            "group": model.group_label,
            "reason": self._generate_recommendation_reason(model, task),
        }
        recommendations.append(rec)

    return RecommendationResult(
        recommendations=recommendations,
        used_fallback=search_result.used_fallback,
        fallback_reason=search_result.fallback_reason,
    )
```
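The category mapping in `recommend_models` is worth isolating: the tool accepts the simplified hints `image`, `video`, and `audio`, and each maps to one primary Fal.ai API category. A minimal sketch of just that step (the helper name `map_category` is an assumption, not part of the registry):

```python
from typing import Optional

# Simplified-to-API category mapping, as used in recommend_models above.
CATEGORY_TO_API = {
    "image": "text-to-image",
    "video": "text-to-video",
    "audio": "text-to-audio",
}


def map_category(category: Optional[str]) -> Optional[str]:
    """Sketch: return the API category for a simplified hint, or None."""
    return CATEGORY_TO_API.get(category) if category else None


print(map_category("video"))  # text-to-video
print(map_category(None))     # None (no category filter applied)
```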
