Perplexity MCP Server

by Rohit-Seelam

perplexity_large

Perform comprehensive research and detailed analysis for complex queries using deep reasoning and extensive search context, returning content with citations.

Instructions

Comprehensive research with maximum depth using sonar-deep-research.

Best for: deep research tasks, comprehensive analysis, complex multi-step reasoning, academic research, and detailed technical investigations. Uses high reasoning effort and a high search context size.

WARNING: This tool may take significantly longer (potentially 10-30 minutes) and may time out on very complex queries.

Args:
    query: The question or prompt to send to Perplexity
    messages: Optional conversation context (list of {"role": "user/assistant", "content": "..."})

Returns: Dictionary with content and citations
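As described above, any optional `messages` context is extended with the current `query` before the request is sent. A minimal sketch of that combination — `build_messages` is a hypothetical helper mirroring the handler's logic, not part of the server's API:

```python
def build_messages(query, messages=None):
    """Append the current query to any prior conversation context."""
    if messages is None:
        messages = []
    messages.append({"role": "user", "content": query})
    return messages

# A follow-up question carrying prior context:
context = [
    {"role": "user", "content": "What is MCP?"},
    {"role": "assistant", "content": "The Model Context Protocol is ..."},
]
sent = build_messages("How do MCP tools register themselves?", context)
```

The resulting list ends with the new user query, so the model sees the full conversation in order.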

Input Schema

| Name     | Required | Description | Default |
|----------|----------|-------------|---------|
| messages | No       |             |         |
| query    | Yes      |             |         |
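A hypothetical tool-call argument object matching this schema — the field contents are illustrative only; `query` is the sole required field:

```python
import json

args = {
    "query": "Summarize recent research on retrieval-augmented generation.",
    "messages": [  # optional prior conversation context
        {"role": "user", "content": "What is RAG?"},
        {"role": "assistant", "content": "Retrieval-augmented generation is ..."},
    ],
}

assert "query" in args  # the only required field
payload = json.dumps(args)  # what an MCP client would serialize and send
```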

Implementation Reference

  • The core handler function for the 'perplexity_large' tool. Decorated with @mcp.tool() for automatic MCP registration. Processes input query and messages, retrieves 'large' configuration, calls PerplexityClient.chat_completion, formats response, and handles errors.
    @mcp.tool()
    def perplexity_large(query: str, messages: List[Dict[str, str]] = None) -> Dict[str, Any]:
        """
        Comprehensive research with maximum depth using sonar-deep-research.

        Best for: Deep research tasks, comprehensive analysis, complex multi-step
        reasoning, academic research, detailed technical investigations.
        Uses high reasoning effort and search context size.

        WARNING: This tool may take significantly longer (potentially 10-30 minutes)
        and may timeout on very complex queries.

        Args:
            query: The question or prompt to send to Perplexity
            messages: Optional conversation context (list of {"role": "user/assistant", "content": "..."})

        Returns:
            Dictionary with content and citations
        """
        try:
            client = get_perplexity_client()

            # Prepare messages
            if messages is None:
                messages = []

            # Add the current query
            messages.append({"role": "user", "content": query})

            # Get tool configuration
            config = TOOL_CONFIGS["large"]

            # Log warning about potential timeout
            logger.warning("Starting deep research query - this may take 10-30 minutes")

            # Make API request
            response = client.chat_completion(messages=messages, **config)

            # Format and return response
            return client.format_response(response)
        except Exception as e:
            logger.exception("Error in perplexity_large")
            return {
                "error": "tool_error",
                "message": f"Failed to process query: {str(e)}"
            }
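The handler finishes with `client.format_response(response)`, whose implementation is not shown on this page. The following is a hypothetical sketch of what such a formatter might look like, assuming the Perplexity chat-completions response shape (`choices[0].message.content` plus a top-level `citations` list):

```python
def format_response(response: dict) -> dict:
    """Hypothetical: reduce a raw API response to content plus citations."""
    return {
        "content": response["choices"][0]["message"]["content"],
        "citations": response.get("citations", []),
    }

# A mock raw response, assuming the shape sketched above:
raw = {
    "choices": [{"message": {"role": "assistant", "content": "Answer text."}}],
    "citations": ["https://example.com/source"],
}
formatted = format_response(raw)
```

The actual method lives in `PerplexityClient`; this sketch only illustrates why the tool's return value is documented as a dictionary with content and citations.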
  • Schema/configuration specific to 'perplexity_large': specifies the 'sonar-deep-research' model with high reasoning effort and high search context size.
    "large": { "model": "sonar-deep-research", "reasoning_effort": "high", "web_search_options": { "search_context_size": "high" } }
  • server.py:121-121 (registration)
    The @mcp.tool() decorator registers the perplexity_large function as an MCP tool.
    @mcp.tool()
  • Helper function to lazily initialize and retrieve the shared PerplexityClient instance used by all perplexity tools including large.
    def get_perplexity_client() -> PerplexityClient:
        """Get or create the Perplexity client instance."""
        global perplexity_client
        if perplexity_client is None:
            perplexity_client = PerplexityClient()
        return perplexity_client
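This is ordinary lazy initialization of a module-level singleton: the client is constructed on first use and every tool thereafter shares the same instance. A self-contained demonstration of the pattern, with a stub class in place of `PerplexityClient`:

```python
class StubClient:
    """Stand-in for PerplexityClient; counts how often it is constructed."""
    instances = 0

    def __init__(self):
        StubClient.instances += 1

_client = None  # module-level slot, initially empty

def get_client():
    """Create the client on first call, then reuse it."""
    global _client
    if _client is None:
        _client = StubClient()
    return _client

a = get_client()
b = get_client()
```

Both calls return the same object, so expensive setup (API key lookup, session creation) happens at most once per process.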

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Rohit-Seelam/Perplexity_MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.