Higress AI-Search MCP Server
Overview
A Model Context Protocol (MCP) server that provides AI search tools, enhancing AI model responses with real-time search results from multiple search engines via the Higress ai-search feature.
Related MCP server: WebSearch-MCP
Demo
Cline
https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb
Claude Desktop
https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46
Features
Internet search: Google, Bing, Quark - for general web information
Academic search: Arxiv - for scientific papers and research
Internal knowledge search
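The default endpoint path below (`/v1/chat/completions`) suggests the server forwards queries to Higress as OpenAI-compatible chat-completion requests. A minimal sketch of what such a request body could look like - the helper name `build_search_request` is hypothetical and the field names assume the standard chat-completions schema, not this project's actual source:

```python
import json


def build_search_request(model: str, question: str) -> dict:
    """Build an OpenAI-compatible chat-completions payload.

    Assumes the standard chat-completions schema implied by the
    /v1/chat/completions path; this is an illustrative sketch,
    not code from the higress-ai-search-mcp-server package.
    """
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": question},
        ],
    }


payload = build_search_request("qwen-turbo", "What is Higress?")
print(json.dumps(payload, indent=2))
```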
Prerequisites
Configuration
The server can be configured with environment variables:
HIGRESS_URL (optional): URL of the Higress service (default: http://localhost:8080/v1/chat/completions).
MODEL (required): LLM model used to generate responses.
INTERNAL_KNOWLEDGE_BASES (optional): Description of the internal knowledge bases.
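As a sketch of how these variables behave together - `load_config` is a hypothetical helper written for illustration, not part of the published package - configuration loading might look like:

```python
import os

# Default mirrors the documented fallback for HIGRESS_URL.
DEFAULT_HIGRESS_URL = "http://localhost:8080/v1/chat/completions"


def load_config() -> dict:
    """Read the three documented environment variables.

    MODEL is required, the other two fall back to a default
    or an empty string. Illustrative only.
    """
    model = os.environ.get("MODEL")
    if not model:
        raise ValueError("MODEL environment variable is required")
    return {
        "higress_url": os.environ.get("HIGRESS_URL", DEFAULT_HIGRESS_URL),
        "model": model,
        "internal_knowledge_bases": os.environ.get("INTERNAL_KNOWLEDGE_BASES", ""),
    }


os.environ["MODEL"] = "qwen-turbo"
cfg = load_config()
print(cfg)
```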
Option 1: Using uvx
Using uvx automatically installs the package from PyPI; there is no need to clone the repository locally.
{
"mcpServers": {
"higress-ai-search-mcp-server": {
"command": "uvx",
"args": [
"higress-ai-search-mcp-server"
],
"env": {
"HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
"MODEL": "qwen-turbo",
"INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
}
}
}
}
Option 2: Using uv for local development
Using uv requires cloning the repository locally and specifying the path to the source code.
{
"mcpServers": {
"higress-ai-search-mcp-server": {
"command": "uv",
"args": [
"--directory",
"path/to/src/higress-ai-search-mcp-server",
"run",
"higress-ai-search-mcp-server"
],
"env": {
"HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
"MODEL": "qwen-turbo",
"INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
}
}
}
}
License
This project is licensed under the MIT License - see the LICENSE file for details.