The Higress AI-Search MCP Server enhances AI model responses with real-time search results from various sources:
Internet Search: Retrieve information from Google, Bing, and Quark
Academic Search: Access scientific papers and research from Arxiv
Internal Knowledge Search: Query company policies, product documentation, and technical specifications
Provides academic search capabilities for scientific papers and research
Allows searching for general web information through the Google search engine
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Higress AI-Search MCP Server search for recent advancements in quantum computing on Arxiv"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Higress AI-Search MCP Server
Overview
A Model Context Protocol (MCP) server that provides an AI search tool to enhance AI model responses with real-time search results from various search engines through the Higress ai-search feature.
Related MCP server: WebSearch-MCP
Demo
Cline
https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb
Claude Desktop
https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46
Features
Internet Search: Google, Bing, Quark - for general web information
Academic Search: Arxiv - for scientific papers and research
Internal Knowledge Search: Company knowledge bases - for policies, product documentation, and technical specifications
Prerequisites
Configuration
The server can be configured using environment variables:
HIGRESS_URL (optional): URL for the Higress service (default: http://localhost:8080/v1/chat/completions).
MODEL (required): LLM model to use for generating responses.
INTERNAL_KNOWLEDGE_BASES (optional): Description of internal knowledge bases.
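As a sketch of how these variables might be supplied, the env block of an MCP client entry could look like the following; the model name and knowledge-base description are placeholders, not values prescribed by the project:

```json
{
  "env": {
    "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
    "MODEL": "qwen-turbo",
    "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, product documentation"
  }
}
```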
Option 1: Using uvx
Using uvx automatically installs the package from PyPI; there is no need to clone the repository locally.
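A minimal sketch of an MCP client entry for this option (for example in Claude Desktop's claude_desktop_config.json or Cline's MCP settings) is shown below. The PyPI package name higress-ai-search-mcp-server and the MODEL value are assumptions and should be checked against the published package:

```json
{
  "mcpServers": {
    "higress-ai-search": {
      "command": "uvx",
      "args": ["higress-ai-search-mcp-server"],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, product documentation"
      }
    }
  }
}
```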
Option 2: Using uv with local development
Using uv requires cloning the repository locally and specifying the path to the source code.
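Under the same assumptions, a local-development entry might point uv at the cloned source. The directory path below is a placeholder for wherever you cloned the repository, and the run target mirrors the assumed package name:

```json
{
  "mcpServers": {
    "higress-ai-search": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/higress-ai-search-mcp-server",
        "run",
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo"
      }
    }
  }
}
```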
License
This project is licensed under the MIT License - see the LICENSE file for details.