chat_completion
Generate structured responses for tasks like technical documentation, code review, and API references using Perplexity’s AI API. Supports text, markdown, or JSON formats with optional source URL inclusion.
Instructions
Generate chat completions using the Perplexity API
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| custom_template | No | Custom prompt template. If provided, overrides prompt_template. | |
| format | No | Response format. Use json for structured data, markdown for formatted text with code blocks. Overrides the template format if provided. | text |
| include_sources | No | Include source URLs in the response. Overrides the template setting if provided. | |
| max_tokens | No | The maximum number of tokens to generate in the response. One token is roughly 4 characters of English text. | |
| messages | Yes | | |
| model | No | Model to use for completion. Note: llama-3.1 models will be deprecated after 2/22/2025. | sonar |
| prompt_template | No | Predefined prompt template for common use cases. Available templates: technical_docs (technical documentation with code examples and source references), security_practices (security best practices and implementation guidelines with references), code_review (code analysis focusing on best practices and improvements), api_docs (API documentation in structured JSON format with examples). | |
| temperature | No | Controls randomness in the output. Higher values (e.g. 0.8) make the output more random; lower values (e.g. 0.2) make it more focused and deterministic. | |
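For illustration, a call to this tool might pass arguments like the following sketch. The message content, the role/content shape of each message entry, and this particular combination of parameters are assumptions chosen to show how the fields from the table fit together, not values taken from the server's documentation:

```json
{
  "messages": [
    {
      "role": "user",
      "content": "Review this function for error handling issues: ..."
    }
  ],
  "prompt_template": "code_review",
  "format": "markdown",
  "include_sources": true,
  "temperature": 0.2
}
```

Here the code_review template supplies the reviewing instructions, while format and include_sources override the template's own output settings as described above.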
Input Schema (JSON Schema)
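A minimal sketch of the input schema, reconstructed from the parameter table above. The enum values, the messages item shape, and the exact required fields are assumptions rather than the server's actual schema definition:

```json
{
  "type": "object",
  "properties": {
    "messages": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "role": { "type": "string" },
          "content": { "type": "string" }
        },
        "required": ["role", "content"]
      }
    },
    "model": { "type": "string", "default": "sonar" },
    "prompt_template": {
      "type": "string",
      "enum": ["technical_docs", "security_practices", "code_review", "api_docs"]
    },
    "custom_template": { "type": "string" },
    "format": {
      "type": "string",
      "enum": ["text", "markdown", "json"],
      "default": "text"
    },
    "include_sources": { "type": "boolean" },
    "max_tokens": { "type": "integer" },
    "temperature": { "type": "number" }
  },
  "required": ["messages"]
}
```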
Related Tools
- @hannesrudolph/mcp-ragdocs
- @shariqriazz/vertex-ai-mcp-server
- @delano/postman-mcp-server
- @letsbuildagent/perplexity-tool