search

Search the web to get AI-generated answers with cited sources, supporting multiple focus modes and optimization settings.

Instructions

Search the web using Perplexica and get AI-generated responses with sources.

Args:
    search_request: The search request containing query, models, and options.

Returns:
    A formatted string containing the AI-generated response and source citations.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| search_request | Yes | | |

Implementation Reference

  • The MCP tool handler named 'search'. It executes the search use case via get_search_use_case(), formats the AI-generated response with source citations, and handles SearchError and general exceptions.
```python
@mcp.tool()
async def search(search_request: SearchRequestDTO) -> str:
    """Search the web using Perplexica and get AI-generated responses with sources.

    Args:
        search_request: The search request containing query, models, and options.

    Returns:
        A formatted string containing the AI-generated response and source citations.
    """
    use_case = get_search_use_case()
    try:
        result = await use_case.execute(search_request)
        response_parts = [result.message]
        if result.sources:
            response_parts.append("\n\n## Sources")
            for i, source in enumerate(result.sources, 1):
                source_line = f"{i}. [{source.title}]({source.url})"
                if source.snippet:
                    source_line += f"\n > {source.snippet}"
                response_parts.append(source_line)
        return "\n".join(response_parts)
    except SearchError as e:
        return f"Search failed: {e.message}"
    except Exception as e:
        return f"Unexpected error: {e}"
```
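The citation formatting inside the handler can be sketched in isolation. The dataclasses below are illustrative stand-ins for the result objects that `SearchUseCase.execute` returns, not the project's actual domain types:

```python
from dataclasses import dataclass, field


# Hypothetical stand-ins for the use case's result objects.
@dataclass
class Source:
    title: str
    url: str
    snippet: str = ""


@dataclass
class SearchResult:
    message: str
    sources: list[Source] = field(default_factory=list)


def format_response(result: SearchResult) -> str:
    """Mirror the handler's formatting: the answer first, then a numbered
    '## Sources' list with optional block-quoted snippets."""
    response_parts = [result.message]
    if result.sources:
        response_parts.append("\n\n## Sources")
        for i, source in enumerate(result.sources, 1):
            source_line = f"{i}. [{source.title}]({source.url})"
            if source.snippet:
                source_line += f"\n > {source.snippet}"
            response_parts.append(source_line)
    return "\n".join(response_parts)


formatted = format_response(
    SearchResult(
        message="Paris is the capital of France.",
        sources=[
            Source(
                "Paris - Wikipedia",
                "https://en.wikipedia.org/wiki/Paris",
                "Capital of France",
            )
        ],
    )
)
```

Note that errors are returned as plain strings (`"Search failed: ..."`) rather than raised, so the MCP client always receives a well-formed text response.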
  • Pydantic BaseModel defining the input schema for the 'search' tool, including nested ChatModelRequest and EmbeddingModelRequest, with validation, examples, and descriptions.
```python
class SearchRequestDTO(BaseModel):
    """Request DTO for search operations.

    Attributes:
        query: The search query string.
        chat_model: Configuration for the chat model.
        embedding_model: Configuration for the embedding model.
        focus_mode: The search focus mode.
        optimization_mode: The optimization mode for search.
        history: Conversation history as list of [role, content] tuples.
        system_instructions: Optional custom system instructions.
        stream: Whether to stream the response.
    """

    model_config = ConfigDict(
        json_schema_extra={
            "example": {
                "query": "What is the capital of France?",
                "chatModel": {
                    "providerId": "a1850332-621f-4960-b005-b005b8680328",
                    "key": "anthropic/claude-sonnet-4.5",
                },
                "embeddingModel": {
                    "providerId": "a1850332-621f-4960-b005-b005b8680328",
                    "key": "openai/text-embedding-3-small",
                },
                "focusMode": "webSearch",
                "optimizationMode": "balanced",
                "history": [
                    ["human", "Hi, how are you?"],
                    ["assistant", "I am doing well, how can I help you today?"],
                ],
                "systemInstructions": "Focus on providing accurate information",
                "stream": False,
            }
        }
    )

    query: str = Field(..., min_length=1, description="The search query string")
    chat_model: ChatModelRequest = Field(..., alias="chatModel")
    embedding_model: EmbeddingModelRequest = Field(..., alias="embeddingModel")
    focus_mode: str = Field(
        default="webSearch",
        alias="focusMode",
        description="Search focus mode",
    )
    optimization_mode: str = Field(
        default="balanced",
        alias="optimizationMode",
        description="Optimization mode for search",
    )
    history: list[list[str]] = Field(
        default_factory=list,
        description="Conversation history as list of [role, content] pairs",
    )
    system_instructions: str | None = Field(
        default=None,
        alias="systemInstructions",
        description="Custom system instructions",
    )
    stream: bool = Field(default=False, description="Whether to stream the response")
```
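Because the DTO declares camelCase aliases, a client invoking the tool sends camelCase keys. A minimal sketch of such a payload, built from the schema's own example (the provider IDs are the placeholders from that example, not working credentials):

```python
# Payload a client would send for the 'search' tool; keys use the
# camelCase aliases declared on SearchRequestDTO.
search_request = {
    "query": "What is the capital of France?",
    "chatModel": {
        "providerId": "a1850332-621f-4960-b005-b005b8680328",
        "key": "anthropic/claude-sonnet-4.5",
    },
    "embeddingModel": {
        "providerId": "a1850332-621f-4960-b005-b005b8680328",
        "key": "openai/text-embedding-3-small",
    },
    "focusMode": "webSearch",        # default
    "optimizationMode": "balanced",  # default
    "history": [],                   # no prior conversation
    "stream": False,                 # default
}

# Only 'query', 'chatModel', and 'embeddingModel' lack defaults and are
# therefore required; the rest may be omitted.
required = {"query", "chatModel", "embeddingModel"}
missing = required - search_request.keys()
```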
  • src/main.py:6-7 (registration)
    Import statement in main.py that triggers registration of the 'search' MCP tool (and others) via @mcp.tool() decorators in api.py.
```python
# Import api module to register MCP tools via decorators
import application.api  # noqa: F401
```
  • Creation of the FastMCP server instance with instructions documenting the 'search' tool.
```python
mcp = FastMCP(
    name="mcp-perplexica",
    instructions="""
    MCP server for Perplexica search API.

    This server provides web search capabilities through Perplexica,
    allowing you to search the web and get AI-generated responses
    with source citations.

    Available tools:
    - search: Perform a web search using Perplexica
    """,
    host=settings.host,
    port=settings.port,
)
```
  • Dependency injection factory function that provides the SearchUseCase instance (wired with PerplexicaAdapter) to the tool handler.
```python
def get_search_use_case() -> SearchUseCase:
    """Create SearchUseCase instance with dependencies.

    Returns:
        Configured SearchUseCase instance wired with the Perplexica adapter.
    """
    adapter = get_perplexica_adapter()
    return SearchUseCase(search_port=adapter)
```
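The factory follows a ports-and-adapters wiring: the use case depends only on an abstract search port, and the factory injects the concrete Perplexica adapter. A minimal pure-Python sketch of that pattern (every name here is an illustrative stand-in, not one of the project's real classes):

```python
import asyncio
from typing import Protocol


class SearchPort(Protocol):
    """Abstract port the use case depends on."""

    async def search(self, query: str) -> str: ...


class FakePerplexicaAdapter:
    """Stand-in adapter; the real one would call the Perplexica HTTP API."""

    async def search(self, query: str) -> str:
        return f"answer for: {query}"


class DemoSearchUseCase:
    """The use case holds the port, never a concrete adapter."""

    def __init__(self, search_port: SearchPort) -> None:
        self._port = search_port

    async def execute(self, query: str) -> str:
        return await self._port.search(query)


def get_demo_use_case() -> DemoSearchUseCase:
    # The factory wires the concrete adapter behind the port,
    # mirroring the shape of get_search_use_case() above.
    return DemoSearchUseCase(search_port=FakePerplexicaAdapter())


result = asyncio.run(get_demo_use_case().execute("capital of France"))
```

Keeping the adapter behind a port makes the use case trivial to test: swap in a fake adapter, no HTTP required.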

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Kaiohz/mcp-perplexica'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.