# mcp-oi-wiki

Give large language models the boost of OI-Wiki!
## How does it work?
We use DeepSeek-V3 to summarize the current 462 pages of OI-Wiki, embed each summary as a semantic vector, and build a vector database from those embeddings.

At query time, the server finds the vectors in the database closest to the query and returns the corresponding wiki Markdown.
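The retrieval step can be sketched as nearest-neighbor search over summary embeddings. This is a minimal illustration with toy 3-dimensional vectors and made-up page names; the actual project stores real model embeddings in Milvus Lite rather than a Python dict.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest_page(query_vec, index):
    """Return the page whose summary embedding is closest to the query.

    `index` maps a page name to its (toy) embedding vector.
    """
    return max(index, key=lambda page: cosine_similarity(query_vec, index[page]))

# Toy "embeddings" standing in for real embedding-model output.
index = {
    "graph/shortest-path.md": [0.9, 0.1, 0.0],
    "dp/knapsack.md": [0.1, 0.9, 0.1],
    "string/kmp.md": [0.0, 0.2, 0.9],
}

print(nearest_page([0.8, 0.2, 0.1], index))  # → graph/shortest-path.md
```

Once the nearest page is identified, the server returns that page's Markdown to the model.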
## Usage
Make sure you have `uv` installed.
First, clone this repository:
Then open your MCP configuration file (mcpo or Claude):
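For Claude Desktop, an entry might look like the following. The server name, install path, and entry-point script here are illustrative assumptions; adjust them to match where you cloned the repository and how its server is actually launched.

```json
{
  "mcpServers": {
    "oi-wiki": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-oi-wiki", "run", "server.py"]
    }
  }
}
```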
## Update
You can generate your own `db/oi-wiki.db`.
Put your SiliconFlow API key in the `api.key` file.
Then run:
Download the summarization results from the batch-inference page to `result.jsonl`.
Finally, run:
to generate a new `db/oi-wiki.db`.
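The rebuild step above roughly amounts to parsing the batch-inference output and embedding each summary before inserting it into the database. This is a minimal sketch under assumptions: the JSONL field names (`page`, `summary`) and the `embed_summary` stub are hypothetical, not the repository's actual code.

```python
import json

def embed_summary(text):
    """Placeholder embedder: the real pipeline calls an embedding model.

    Returns a tiny deterministic vector so the sketch stays runnable.
    """
    return [len(text) % 7, text.count(" ") % 7, len(text.split()) % 7]

def load_records(jsonl_text):
    """Parse batch-inference output (one JSON object per line) into
    (page, summary, vector) records ready to insert into the vector DB."""
    records = []
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue
        obj = json.loads(line)
        records.append({
            "page": obj["page"],
            "summary": obj["summary"],
            "vector": embed_summary(obj["summary"]),
        })
    return records

sample = '{"page": "dp/knapsack.md", "summary": "0/1 knapsack via dynamic programming."}'
print(load_records(sample)[0]["page"])  # → dp/knapsack.md
```

In the real pipeline, the resulting records would be written into `db/oi-wiki.db`.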
## Thanks