Perplexity AI MCP Server

by mkusaka

perplexity_search

Search the web using Perplexity AI models to get context-aware answers with citations for research, learning, and information discovery.

Instructions

Search using Perplexity AI's models with context-aware responses and citations

Input Schema

  • query (required): the search query; a non-empty string.
  • model (optional, default "sonar"): model to use; one of sonar-reasoning-pro, sonar-reasoning, sonar-pro, sonar.
  • count (optional, default 5): a number between 1 and 10; the handler uses it to scale the response token budget (max_tokens = count * 100).
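
For illustration, a client built on the official TypeScript MCP SDK could call this tool roughly as follows. This is a sketch, not code from the server's repository: the package/command used to spawn the server, the client name, and the query text are placeholder assumptions, and the exact SDK call surface can vary between versions.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Spawn the server over stdio; the command and arguments here are placeholders.
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["-y", "mcp-server-perplexity"]
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Only "query" is required; "model" defaults to "sonar" and "count" to 5.
    const result = await client.callTool({
      name: "perplexity_search",
      arguments: {
        query: "What citations does Perplexity return for recent ML papers?",
        model: "sonar-pro",
        count: 3
      }
    });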

Implementation Reference

  • The asynchronous handler function that executes the Perplexity search using the OpenAI client configured for the Perplexity API. It logs the query, makes the chat completion request, formats the response, and handles errors gracefully.
    async ({ query, model, count }) => {
      try {
        logger.info(`Performing search with model ${model}: ${query}`);
        // count (default 5) scales the completion's token budget
        const response = await client.chat.completions.create({
          model,
          messages: [{ role: "user", content: query }],
          max_tokens: count * 100
        });
        return {
          content: [
            {
              type: "text",
              text: response.choices[0]?.message.content || "No results found"
            }
          ]
        };
      } catch (error) {
        logger.error("Search error:", error);
        return {
          content: [
            {
              type: "text",
              text: `Error performing search: ${error instanceof Error ? error.message : String(error)}`
            }
          ],
          isError: true
        };
      }
    }
  • Zod schema defining the input parameters for the tool: query (required string), model (optional enum, default "sonar"), count (optional number 1-10, default 5).
    {
      query: z.string().min(1),
      model: z
        .enum(MODELS)
        .default("sonar")
        .describe("Model to use (sonar-reasoning-pro, sonar-reasoning, sonar-pro, sonar)"),
      count: z.number().min(1).max(10).optional().default(5)
    },
  • Registration of the 'perplexity_search' tool on the MCP server, including name, description, input schema, and handler function (a complete wiring sketch follows this list).
    // arguments passed to the MCP server's tool-registration call
    "perplexity_search",
    "Search using Perplexity AI's models with context-aware responses and citations",
    {
      query: z.string().min(1),
      model: z.enum(MODELS).default("sonar").describe("Model to use (sonar-reasoning-pro, sonar-reasoning, sonar-pro, sonar)"),
      count: z.number().min(1).max(10).optional().default(5)
    },
    async ({ query, model, count }) => {
      try {
        logger.info(`Performing search with model ${model}: ${query}`);
        const response = await client.chat.completions.create({
          model,
          messages: [{ role: "user", content: query }],
          max_tokens: count * 100
        });
        return {
          content: [{ type: "text", text: response.choices[0]?.message.content || "No results found" }]
        };
      } catch (error) {
        logger.error("Search error:", error);
        return {
          content: [{ type: "text", text: `Error performing search: ${error instanceof Error ? error.message : String(error)}` }],
          isError: true
        };
      }
    }
    );
  • Const array defining the supported Perplexity models used in the schema's enum validation.
    const MODELS = [
      "sonar-reasoning-pro",
      "sonar-reasoning",
      "sonar-pro",
      "sonar"
    ] as const;
  • Initialization of the OpenAI client configured for the Perplexity AI API, used by the handler.
    // Perplexity exposes an OpenAI-compatible endpoint, so the standard OpenAI SDK client is pointed at it
    const client = new OpenAI({
      apiKey: PERPLEXITY_API_KEY,
      baseURL: "https://api.perplexity.ai"
    });
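
Taken together, a minimal wiring sketch of how these pieces fit, assuming the standard @modelcontextprotocol/sdk McpServer and a stdio transport; the server name and version are assumptions, and the logging and error handling shown in the bullets above are omitted for brevity:

    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import OpenAI from "openai";
    import { z } from "zod";

    const MODELS = ["sonar-reasoning-pro", "sonar-reasoning", "sonar-pro", "sonar"] as const;

    // OpenAI-compatible client pointed at the Perplexity API.
    const client = new OpenAI({
      apiKey: process.env.PERPLEXITY_API_KEY,
      baseURL: "https://api.perplexity.ai"
    });

    // Server identity below is an assumption for this sketch.
    const server = new McpServer({ name: "perplexity", version: "1.0.0" });

    server.tool(
      "perplexity_search",
      "Search using Perplexity AI's models with context-aware responses and citations",
      {
        query: z.string().min(1),
        model: z.enum(MODELS).default("sonar"),
        count: z.number().min(1).max(10).optional().default(5)
      },
      async ({ query, model, count }) => {
        const response = await client.chat.completions.create({
          model,
          messages: [{ role: "user", content: query }],
          max_tokens: count * 100
        });
        return {
          content: [{ type: "text", text: response.choices[0]?.message.content ?? "No results found" }]
        };
      }
    );

    // Expose the tool over stdio so MCP clients can spawn and talk to the server.
    await server.connect(new StdioServerTransport());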

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mkusaka/mcp-server-perplexity'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.