Uses an OpenAI API compatible LLM server (such as vLLM) to serve the Lingshu medical AI model, providing medical image analysis, structured report generation, and medical Q&A.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Lingshu FastMCP Medical AI Service analyze this chest X-ray for pneumonia signs".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Lingshu FastMCP Medical AI Service
This project implements a FastMCP server for the Lingshu medical AI model and a corresponding client for testing and integration.
Components
mcp_server_lingshu.py: FastMCP server wrapping the Lingshu model (a minimal sketch is shown below)
mcp_client_lingshu.py: Test client demonstrating interaction with the Lingshu FastMCP server
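For orientation, here is a minimal sketch of what a FastMCP server of this kind can look like. It is not the actual contents of mcp_server_lingshu.py: the tool name analyze_medical_image, its parameters, the endpoint URL, and the model name are illustrative assumptions.

```python
# Minimal FastMCP server sketch (illustrative; not the real mcp_server_lingshu.py).
from fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("Lingshu FastMCP Medical AI Service")

# Assumed OpenAI-compatible endpoint serving the Lingshu model (e.g., via vLLM).
llm = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

@mcp.tool()
def analyze_medical_image(image_url: str, question: str) -> str:
    """Hypothetical tool: ask the Lingshu model a question about a medical image."""
    response = llm.chat.completions.create(
        model="lingshu",  # placeholder model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    mcp.run()
```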
Server Features
Medical image analysis
Structured medical report generation
Medical Q&A (a client-side usage sketch follows below)
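As a rough sketch of how these features might be exercised from the test client, the example below uses the FastMCP Client API to launch the server script over stdio and call a tool. The tool name and arguments are assumptions made for illustration, not the actual interface of mcp_client_lingshu.py.

```python
# Illustrative FastMCP client sketch; tool name and arguments are assumed.
import asyncio
from fastmcp import Client

async def main():
    # Connect to the server script over stdio.
    async with Client("mcp_server_lingshu.py") as client:
        tools = await client.list_tools()
        print("Available tools:", [t.name for t in tools])

        # Hypothetical call to a medical image analysis tool.
        result = await client.call_tool(
            "analyze_medical_image",
            {
                "image_url": "https://example.com/chest_xray.png",
                "question": "Are there signs of pneumonia?",
            },
        )
        print(result)

asyncio.run(main())
```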
Prerequisites
FastMCP framework
OpenAI API compatible LLM server (e.g., vLLM); a quick connectivity check is sketched after this list
Required Python packages (install via pip install -r requirements.txt)
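Before starting the MCP server, it can help to confirm that the OpenAI-compatible endpoint is reachable. The snippet below is a generic check; the base URL, API key, and expected model name are placeholder assumptions rather than values taken from this project.

```python
# Quick check that an OpenAI-compatible endpoint (e.g., vLLM) is up and serving models.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # assumed URL/key
for model in client.models.list():
    print(model.id)  # the Lingshu model should appear here if it is being served
```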
Setup
Clone the repository
Install dependencies:
pip install -r requirements.txt