
mcp-perplexity-search


⚠️ Notice

This repository is no longer maintained.

The functionality of this tool is now available in mcp-omnisearch, which combines multiple MCP tools in one unified package.

Please use mcp-omnisearch instead.


A Model Context Protocol (MCP) server for integrating Perplexity's AI API with LLMs. This server provides advanced chat completion capabilities with specialized prompt templates for various use cases.

Features

  • 🤖 Advanced chat completion using Perplexity's AI models

  • 📝 Predefined prompt templates for common scenarios:

    • Technical documentation generation

    • Security best practices analysis

    • Code review and improvements

    • API documentation in structured format

  • 🎯 Custom template support for specialized use cases

  • 📊 Multiple output formats (text, markdown, JSON)

  • 🔍 Optional source URL inclusion in responses

  • ⚙️ Configurable model parameters (temperature, max tokens)

  • 🚀 Support for various Perplexity models including Sonar and LLaMA

Configuration

This server requires configuration through your MCP client. Here are examples for different environments:

Cline Configuration

Add this to your Cline MCP settings:

{ "mcpServers": { "mcp-perplexity-search": { "command": "npx", "args": ["-y", "mcp-perplexity-search"], "env": { "PERPLEXITY_API_KEY": "your-perplexity-api-key" } } } }

Claude Desktop with WSL Configuration

For WSL environments, add this to your Claude Desktop configuration:

{ "mcpServers": { "mcp-perplexity-search": { "command": "wsl.exe", "args": [ "bash", "-c", "source ~/.nvm/nvm.sh && PERPLEXITY_API_KEY=your-perplexity-api-key /home/username/.nvm/versions/node/v20.12.1/bin/npx mcp-perplexity-search" ] } } }

Environment Variables

The server requires the following environment variable:

  • PERPLEXITY_API_KEY: Your Perplexity API key (required)
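
To confirm the server starts outside of an MCP client, you can also run it directly with the key supplied inline. This is a minimal sketch; substitute your own API key for the placeholder value:

PERPLEXITY_API_KEY=your-perplexity-api-key npx -y mcp-perplexity-search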

API

The server implements a single MCP tool with configurable parameters:

chat_completion

Generate chat completions using the Perplexity API with support for specialized prompt templates.

Parameters:

  • messages (array, required): Array of message objects with:

    • role (string): 'system', 'user', or 'assistant'

    • content (string): The message content

  • prompt_template (string, optional): Predefined template to use:

    • technical_docs: Technical documentation with code examples

    • security_practices: Security implementation guidelines

    • code_review: Code analysis and improvements

    • api_docs: API documentation in JSON format

  • custom_template (object, optional): Custom prompt template with:

    • system (string): System message for assistant behaviour

    • format (string): Output format preference

    • include_sources (boolean): Whether to include sources

  • format (string, optional): 'text', 'markdown', or 'json' (default: 'text')

  • include_sources (boolean, optional): Include source URLs (default: false)

  • model (string, optional): Perplexity model to use (default: 'sonar')

  • temperature (number, optional): Output randomness (0-1, default: 0.7)

  • max_tokens (number, optional): Maximum response length (default: 1024)
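
As an illustration, a chat_completion request combining several of these parameters might look like the following. The message content and parameter values are purely illustrative and not taken from the project:

{
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful coding assistant."
    },
    {
      "role": "user",
      "content": "Review this TypeScript function for correctness and suggest improvements."
    }
  ],
  "prompt_template": "code_review",
  "format": "markdown",
  "include_sources": true,
  "model": "sonar",
  "temperature": 0.2,
  "max_tokens": 1024
}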

Development

Setup

  1. Clone the repository

  2. Install dependencies:

pnpm install

  3. Build the project:

pnpm build

  4. Run in development mode:

pnpm dev

Publishing

The project uses changesets for version management. To publish:

  1. Create a changeset:

pnpm changeset

  2. Version the package:

pnpm changeset version

  3. Publish to npm:

pnpm release

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - see the LICENSE file for details.

Acknowledgments
