MCP Server for ML Model Integration

Local-only server

The server can only run on the client’s local machine because it depends on local resources.

Integrations

  • Integrates with a FastAPI-hosted ML server to serve a trained Random Forest model for predictions and data processing.

  • Provides integration with GitHub repositories for cloning and accessing code resources needed for the MCP server setup.

  • Integrates with Imgur for image hosting used in the demonstration of the MCP server capabilities.

Build an MCP Server

A complete walkthrough of how to build an MCP server that serves a trained Random Forest model and integrates it with the Bee Framework for ReAct interactions.

See it live and in action 📺

Startup MCP Server 🚀

  1. Clone this repo: git clone https://github.com/nicknochnack/BuildMCPServer
  2. Run the MCP server:
    cd BuildMCPServer
    uv venv
    source .venv/bin/activate
    uv add .
    uv add ".[dev]"
    uv run mcp dev server.py
  3. To run the agent, run the following in a separate terminal:
    source .venv/bin/activate
    uv run singleflowagent.py
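The server started above exposes the trained model to agents as a callable tool. As a rough, stdlib-only illustration of that decorate-and-dispatch pattern (not the actual code in server.py; the registry, tool name, and the placeholder prediction formula here are all made up), a tool can be registered and then invoked by name:

```python
from typing import Callable, Dict

# Hypothetical stand-in for an MCP-style tool registry; the real server
# uses the MCP SDK, which exposes decorated functions as tools in a
# conceptually similar way.
TOOLS: Dict[str, Callable] = {}

def tool(func: Callable) -> Callable:
    """Register a function so the server can dispatch it by name."""
    TOOLS[func.__name__] = func
    return func

@tool
def predict(years_experience: float) -> float:
    # Placeholder for the trained Random Forest model; the real server
    # would call model.predict(...) here instead of this toy formula.
    return 25_000 + 9_000 * years_experience

def dispatch(name: str, **kwargs):
    """Look up a registered tool and invoke it with the given arguments."""
    return TOOLS[name](**kwargs)

print(dispatch("predict", years_experience=2.0))  # → 43000.0
```

An agent framework (such as the Bee Framework agent in singleflowagent.py) would discover the registered tools and issue calls like the `dispatch` line above as part of its ReAct loop.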

Startup FastAPI Hosted ML Server

git clone https://github.com/nicknochnack/CodeThat-FastML
cd CodeThat-FastML
pip install -r requirements.txt
uvicorn mlapi:app --reload
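Once uvicorn is running, the MCP server (or any client) reaches the ML service over plain HTTP. A minimal standard-library sketch of building such a request follows; note that the `/predict` path and the payload field name are assumptions for illustration, not the actual schema in mlapi.py:

```python
import json
import urllib.request

# Hypothetical endpoint and payload; check mlapi.py for the real
# route and request model before relying on these names.
url = "http://127.0.0.1:8000/predict"
payload = {"YearsExperience": 2.0}  # assumed feature field

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the uvicorn server running locally, send the request:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Because the ML server binds to localhost by default, this also illustrates why the MCP server is local-only: it depends on resources reachable only from the client's machine.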
Detailed instructions on how to build it can also be found here.

Other References 🔗

  • Building an MCP Client (used for the single-flow agent)
  • My original video on building the machine learning server

Who, When, Why?

👨🏾‍💻 Author: Nick Renotte 📅 Version: 1.x 📜 License: This project is licensed under the MIT License


A server that integrates a trained Random Forest model with the Bee Framework, enabling AI tools and agents to interact via ReAct.
