
MCP Perplexica

An MCP server proxy for the Perplexica search API.

This server allows LLMs to perform web searches through Perplexica using the Model Context Protocol (MCP).

Features

  • πŸ” Web search through Perplexica

  • πŸ“š Multiple focus modes (web, academic, YouTube, Reddit, etc.)

  • ⚑ Configurable optimization modes (speed, balanced, quality)

  • πŸ”§ Customizable model configuration

  • πŸ“– Source citations in responses

  • πŸš€ Multiple transport modes (stdio, SSE, Streamable HTTP)

Prerequisites

  • Python and the UV package manager

  • A running Perplexica instance (or use the Docker Compose setup below)

Installation

  1. Clone the repository:

git clone https://github.com/Kaiohz/mcp-perplexica.git
cd mcp-perplexica

  2. Install dependencies with UV:

uv sync

  3. Create your environment file:

cp .env.example .env

  4. Edit .env with your configuration:

# Perplexica API
PERPLEXICA_URL=http://localhost:3000

# Transport: stdio (default), sse, or streamable-http
TRANSPORT=stdio
HOST=127.0.0.1
PORT=8000

# Model configuration
DEFAULT_CHAT_MODEL_PROVIDER_ID=your-provider-id
DEFAULT_CHAT_MODEL_KEY=anthropic/claude-sonnet-4.5
DEFAULT_EMBEDDING_MODEL_PROVIDER_ID=your-provider-id
DEFAULT_EMBEDDING_MODEL_KEY=openai/text-embedding-3-small
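The Architecture section below notes that config.py uses Pydantic Settings. As a rough sketch of how these variables could be loaded (class and field names here are illustrative assumptions, not the project's actual code):

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Values are read from the environment, falling back to .env
    model_config = SettingsConfigDict(env_file=".env")

    perplexica_url: str = "http://localhost:3000"
    transport: str = "stdio"
    host: str = "127.0.0.1"
    port: int = 8000
    default_chat_model_provider_id: str = ""
    default_chat_model_key: str = ""
    default_embedding_model_provider_id: str = ""
    default_embedding_model_key: str = ""

settings = Settings()  # e.g. settings.perplexica_url -> "http://localhost:3000"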

Usage

Transport Modes

The server supports three transport modes:

Transport        | Description                                  | Use Case
stdio            | Standard input/output                        | CLI tools, Claude Desktop
sse              | Server-Sent Events over HTTP                 | Web clients
streamable-http  | Streamable HTTP (recommended for production) | Production deployments
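As a rough illustration of how the transport is selected, an entry point built on the MCP Python SDK's FastMCP could look like the following (this is a sketch under that assumption, not the project's actual main.py):

import os

from mcp.server.fastmcp import FastMCP

# Server name, HOST, and PORT are illustrative; HOST/PORT only matter
# for the sse and streamable-http transports.
mcp = FastMCP(
    "perplexica",
    host=os.getenv("HOST", "127.0.0.1"),
    port=int(os.getenv("PORT", "8000")),
)

if __name__ == "__main__":
    # FastMCP supports "stdio", "sse", and "streamable-http".
    mcp.run(transport=os.getenv("TRANSPORT", "stdio"))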

Running with Docker Compose

The easiest way to run both Perplexica and MCP Perplexica together:

# Copy and configure environment files
cp .env.example .env
cp .env.perplexica.example .env.perplexica

# Edit .env with your MCP Perplexica settings
# Edit .env.perplexica with your Perplexica settings

# Start services
docker compose up -d

This starts:

  • Perplexica on http://localhost:3000

  • MCP Perplexica connected to Perplexica

Running the MCP Server (without Docker)

Stdio mode (default)

uv run python -m main

SSE mode

TRANSPORT=sse PORT=8000 uv run python -m main

Streamable HTTP mode

TRANSPORT=streamable-http PORT=8000 uv run python -m main
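Once the server is running over HTTP, any MCP client can connect to it. A minimal sketch using the MCP Python SDK's streamable HTTP client (the /mcp endpoint path matches the Claude Code example below; this snippet is not part of this project's code):

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Connect to the MCP Perplexica server started with TRANSPORT=streamable-http
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())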

Claude Desktop Configuration

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{ "mcpServers": { "perplexica": { "command": "uv", "args": ["run", "--directory", "/path/to/mcp-perplexica", "python", "-m", "main"], "env": { "PERPLEXICA_URL": "http://localhost:3000", "TRANSPORT": "stdio", "DEFAULT_CHAT_MODEL_PROVIDER_ID": "your-provider-id", "DEFAULT_CHAT_MODEL_KEY": "anthropic/claude-sonnet-4.5", "DEFAULT_EMBEDDING_MODEL_PROVIDER_ID": "your-provider-id", "DEFAULT_EMBEDDING_MODEL_KEY": "openai/text-embedding-3-small" } } } }

Claude Code Configuration

For HTTP-based transports, you can add the server to Claude Code:

# Start the server with streamable-http transport
TRANSPORT=streamable-http PORT=8000 uv run python -m main

# Add to Claude Code
claude mcp add --transport http perplexica http://localhost:8000/mcp

Available Tools

The server exposes a search tool that performs a web search using Perplexica.

Parameters:

Parameter                    | Type   | Required | Description
query                        | string | Yes      | The search query
focus_mode                   | string | No       | Search focus: webSearch, academicSearch, writingAssistant, wolframAlphaSearch, youtubeSearch, redditSearch
optimization_mode            | string | No       | Optimization: speed, balanced, quality
system_instructions          | string | No       | Custom instructions for AI response
chat_model_provider_id       | string | No       | Override default chat model provider
chat_model_key               | string | No       | Override default chat model
embedding_model_provider_id  | string | No       | Override default embedding provider
embedding_model_key          | string | No       | Override default embedding model

Example:

Search for "latest developments in AI" using academic focus
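The same request can be issued programmatically through an MCP client session. A hedged sketch follows; the tool name "web_search" is an assumption, and list_tools() (as in the earlier client snippet) shows the name the server actually registers:

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def search_example() -> None:
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "web_search" is a placeholder for the server's actual tool name.
            result = await session.call_tool(
                "web_search",
                {
                    "query": "latest developments in AI",
                    "focus_mode": "academicSearch",
                    "optimization_mode": "balanced",
                },
            )
            print(result.content)


asyncio.run(search_example())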

Development

Install dev dependencies

uv sync --dev

Run tests

uv run pytest

Run linter

uv run ruff check .
uv run ruff format .
uv run black src/

Architecture

This project follows a hexagonal (ports-and-adapters) architecture:

src/
β”œβ”€β”€ main.py              # MCP server entry point
β”œβ”€β”€ config.py            # Pydantic Settings
β”œβ”€β”€ dependencies.py      # Dependency injection
β”œβ”€β”€ domain/              # Business core (pure Python)
β”‚   β”œβ”€β”€ entities.py      # Dataclasses
β”‚   └── ports.py         # ABC interfaces
β”œβ”€β”€ application/         # Use cases
β”‚   β”œβ”€β”€ requests.py      # Pydantic DTOs
β”‚   └── use_cases.py     # Business logic
└── infrastructure/      # External adapters
    └── perplexica/
        └── adapter.py   # HTTP client
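A minimal sketch of how this ports-and-adapters split can look in code (class and method names are illustrative, not the project's actual definitions):

from abc import ABC, abstractmethod
from dataclasses import dataclass


# domain/entities.py: plain dataclasses, no framework dependencies
@dataclass
class SearchResult:
    answer: str
    sources: list[str]


# domain/ports.py: the abstract interface the use case depends on
class SearchPort(ABC):
    @abstractmethod
    async def search(self, query: str, focus_mode: str) -> SearchResult: ...


# application/use_cases.py: business logic, coupled only to the port
class PerformSearch:
    def __init__(self, searcher: SearchPort) -> None:
        self.searcher = searcher

    async def execute(self, query: str, focus_mode: str = "webSearch") -> SearchResult:
        return await self.searcher.search(query, focus_mode)


# infrastructure/perplexica/adapter.py: HTTP adapter implementing the port
class PerplexicaAdapter(SearchPort):
    def __init__(self, base_url: str) -> None:
        self.base_url = base_url

    async def search(self, query: str, focus_mode: str) -> SearchResult:
        # The real adapter calls the Perplexica HTTP API; elided in this sketch.
        raise NotImplementedError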

License

MIT
