Singularity.def
Bootstrap: docker
From: nvcr.io/nvidia/pytorch:24.01-py3
%post
# Update and install system dependencies
apt-get update && apt-get install -y \
git \
wget \
curl \
python3-pip \
&& rm -rf /var/lib/apt/lists/*
# Set up Python environment
pip install --upgrade pip
# Create application directory
mkdir -p /app
cd /app
# Clone and install evo2
git clone --recurse-submodules https://github.com/ArcInstitute/evo2.git /opt/evo2
cd /opt/evo2
pip install --no-cache-dir .
# Install MCP server dependencies
cd /app
pip install --no-cache-dir "mcp>=1.1.0" "pydantic>=2.0.0" "pydantic-settings>=2.0.0" "httpx>=0.24.0" "nvidia-ml-py>=12.0.0"
# Create working directory
mkdir -p /tmp/mcp-work
chmod 777 /tmp/mcp-work
%files
src /app/src
pyproject.toml /app/
%environment
export LC_ALL=C
export PATH=/opt/conda/bin:$PATH
export PYTHONPATH=/app:$PYTHONPATH
export BIO_MCP_EVO2_TEMP_DIR=/tmp/mcp-work
export BIO_MCP_EVO2_EXECUTION_MODE=local
export PYTHONUNBUFFERED=1
%runscript
cd /app
exec python -m src.server "$@"
%help
This container provides an MCP server for the evo2 DNA language model.
Usage:
singularity run --nv evo2.sif
The --nv flag is required to enable GPU support.
Environment variables:
BIO_MCP_EVO2_EXECUTION_MODE: Execution mode (local, sbatch, api)
BIO_MCP_EVO2_MODEL_SIZE: Model size (7b or 40b)
BIO_MCP_EVO2_CUDA_DEVICE: CUDA device index
BIO_MCP_EVO2_NIM_API_KEY: NVIDIA NIM API key (required for api mode)
%labels
Author Bio-MCP Team
Version 0.1.0
Description MCP server for evo2 DNA language model with GPU support
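The definition file above can be built and launched as follows; this is a minimal sketch, assuming the file is saved as Singularity.def and the image is named evo2.sif (as in the %help text). The SINGULARITYENV_ prefix is Singularity's standard mechanism for passing environment variables into a container, which overrides the defaults set in %environment.

```shell
# Build the image (requires root, --fakeroot, or a remote build service)
singularity build evo2.sif Singularity.def

# Run the MCP server with GPU support (--nv is required, per %help);
# the variables below override the defaults from the %environment section
SINGULARITYENV_BIO_MCP_EVO2_MODEL_SIZE=7b \
SINGULARITYENV_BIO_MCP_EVO2_CUDA_DEVICE=0 \
singularity run --nv evo2.sif
```

For api mode, SINGULARITYENV_BIO_MCP_EVO2_EXECUTION_MODE=api and SINGULARITYENV_BIO_MCP_EVO2_NIM_API_KEY would be set the same way.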