Lingshu FastMCP Medical AI Service

This project implements a FastMCP server for the Lingshu medical AI model and a corresponding client for testing and integration.

Components

  1. mcp_server_lingshu.py: FastMCP server wrapping the Lingshu model
  2. mcp_client_lingshu.py: Test client demonstrating interaction with the Lingshu FastMCP server

Server Features

  • Medical image analysis
  • Structured medical report generation
  • Medical Q&A
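For image analysis, the server presumably forwards the image to the Lingshu model as an OpenAI-style multimodal chat message. A minimal sketch of building such a message with only the standard library (the function name and prompt wording are illustrative, not taken from the server code):

```python
import base64


def build_image_message(image_bytes: bytes, question: str,
                        mime: str = "image/png") -> dict:
    """Package an image and a question as one OpenAI-style chat message.

    The image is embedded as a base64 data URL, the format accepted by
    OpenAI-compatible multimodal endpoints such as vLLM's.
    """
    data_url = f"data:{mime};base64," + base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": question},
            {"type": "image_url", "image_url": {"url": data_url}},
        ],
    }
```

A message built this way can be appended to the `messages` list of a chat-completions request.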

Prerequisites

  • FastMCP framework
  • OpenAI API compatible LLM server (e.g., vLLM)
  • Required Python packages (install via pip install -r requirements.txt)

Setup

  1. Clone the repository
  2. Install dependencies: pip install -r requirements.txt

Usage

Use vLLM to serve the Lingshu Model

vllm serve lingshu-medical-mllm/Lingshu-7B --dtype float16 --api-key api_key --port 8000 --max-model-len 32768
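Once vLLM is running, it helps to sanity-check the OpenAI-compatible endpoint before wrapping it with FastMCP. A standard-library sketch that builds (but does not send) a chat-completions request; the URL, key, and model name mirror the command above:

```python
import json
import urllib.request


def chat_request(base_url: str, api_key: str, model: str,
                 prompt: str) -> urllib.request.Request:
    """Build a chat-completions request for an OpenAI-compatible server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = chat_request("http://localhost:8000/v1", "api_key",
                   "lingshu-medical-mllm/Lingshu-7B", "What is hypertension?")
# Send with: urllib.request.urlopen(req)
```

If the server is up, sending the request should return a JSON body with a `choices` array.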

Wrap the server with FastMCP

export LINGSHU_SERVER_URL="http://localhost:8000/v1"
export LINGSHU_SERVER_API="api_key"
export LINGSHU_MODEL="lingshu-medical-mllm/Lingshu-7B"
# the above config depends on your vLLM server config
python mcp_server_lingshu.py --host 127.0.0.1 --port 4200 --path /lingshu --log-level info
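Inside mcp_server_lingshu.py, these environment variables are presumably read at startup to locate the vLLM backend. A sketch of collecting them with fallbacks matching the commands above (the variable names come from this README; the defaults are assumptions):

```python
import os


def load_lingshu_config() -> dict:
    """Read the Lingshu backend settings exported before launching the server."""
    return {
        "base_url": os.environ.get("LINGSHU_SERVER_URL", "http://localhost:8000/v1"),
        "api_key": os.environ.get("LINGSHU_SERVER_API", "api_key"),
        "model": os.environ.get("LINGSHU_MODEL", "lingshu-medical-mllm/Lingshu-7B"),
    }
```

The returned dict has exactly the three fields an OpenAI-compatible client needs to talk to the vLLM server.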

Connect to Lingshu via MCP

export LLM_SERVER_URL="xxx"
export LLM_SERVER_API="xxx"
export LLM_MODEL="xxx"  # this is your own model
python mcp_client_lingshu.py --mcp-url http://127.0.0.1:4200/lingshu
# the mcp-url should match the MCP server you deployed in the previous step
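Under the hood, an MCP client speaks JSON-RPC 2.0 to the server URL. A sketch of the tools/call message the test client would send after initialization (the tool name analyze_medical_image and its arguments are illustrative, not confirmed from the client code):

```python
import json


def tools_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })


msg = tools_call("analyze_medical_image", {"question": "Describe this X-ray."})
```

In practice the FastMCP client library constructs and transports these messages for you; the sketch only shows the wire format being exchanged.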
