Oumi MCP Server
An MCP (Model Context Protocol) server that gives AI coding assistants access to Oumi's library of ~500 ready-to-use YAML configs for fine-tuning LLMs.
When connected to Cursor, Claude Desktop, or any MCP-compatible client, the server lets the AI search for training recipes, retrieve full YAML configs, validate them, and follow guided ML engineering workflows -- all without you having to browse docs manually.
What it does
The server exposes 5 tools and 6 resources over MCP:
| Tool | Purpose |
| --- | --- |
|  | Overview of capabilities and quickstart guide |
|  | Discover available model families and config types |
| `search_configs` | Find training configs by filters |
| `get_config` | Get config details and full YAML content |
| `validate_config` | Validate a config file before running |
The six resources cover:

- End-to-end ML engineering workflow guide
- Training command usage and sizing heuristics
- Synthetic data generation guidance
- Dataset analysis and quality checks
- Evaluation strategies and benchmarks
- Inference best practices
Supported models
Llama 3.1/3.2/4, Qwen 3, Phi 4, Gemma 3, DeepSeek R1, SmolLM, and more.
Supported training techniques
SFT, DPO, GRPO, KTO, LoRA, QLoRA, full fine-tuning, pretraining, evaluation, inference.
Installation
As part of Oumi (recommended)
```bash
pip install "oumi[mcp]"
```

Standalone

```bash
pip install oumi-mcp
```

From source (development)

```bash
git clone https://github.com/oumi-ai/oumi.git
cd oumi/projects/oumi-mcp
pip install -e .
```

Running the server

```bash
oumi-mcp
```

Or run it as a Python module:

```bash
python -m oumi_mcp_server
```

Connecting to an MCP client
Cursor
Add to your Cursor MCP settings (.cursor/mcp.json):
```json
{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}
```

Claude Desktop
Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
```json
{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}
```

Any MCP client (stdio transport)
The server uses stdio transport by default. Point your MCP client to the oumi-mcp command.
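If you want to drive the server programmatically rather than from an editor, a minimal sketch using the official MCP Python SDK (`pip install mcp`) can launch `oumi-mcp` over stdio and exercise its tools; the `search_configs` arguments here are borrowed from the example workflow below:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch oumi-mcp as a subprocess and speak MCP over stdio.
    params = StdioServerParameters(command="oumi-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call one of them directly.
            result = await session.call_tool(
                "search_configs", {"model": "llama3_1", "task": "sft"}
            )
            print(result.content)

asyncio.run(main())
```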
How configs work
The server ships with a bundled snapshot of Oumi's ~500 YAML config files. On startup, it checks for a fresher cached copy and syncs from GitHub if the cache is stale (older than 24 hours). The resolution order is:
1. `OUMI_MCP_CONFIGS_DIR` environment variable (explicit override)
2. `~/.cache/oumi-mcp/configs` (synced from GitHub, refreshed every 24h)
3. Bundled configs shipped with the package (always-available fallback)
This means:
- The server works immediately after install, even offline
- Configs stay up to date automatically via lazy background sync
- You can pin a specific config directory with the env var if needed
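The resolution order amounts to something like the sketch below. This is illustrative only; the actual implementation lives in `server.py` and also performs the GitHub sync, which is elided here, and the `bundled_dir` parameter stands in for the package's bundled configs path:

```python
import os
import time
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "oumi-mcp" / "configs"
MAX_AGE_SECONDS = 24 * 60 * 60  # 24h staleness threshold

def resolve_configs_dir(bundled_dir: Path) -> Path:
    # 1. Explicit override wins.
    override = os.environ.get("OUMI_MCP_CONFIGS_DIR")
    if override:
        return Path(override)
    # 2. Use the cache if it exists and is fresh; a stale cache
    #    would trigger a re-sync from GitHub (elided in this sketch).
    if CACHE_DIR.exists():
        age = time.time() - CACHE_DIR.stat().st_mtime
        if age < MAX_AGE_SECONDS:
            return CACHE_DIR
    # 3. Fall back to the configs bundled with the package.
    return bundled_dir
```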
Force a sync
To manually refresh configs, delete the cache and restart:
```bash
rm -rf ~/.cache/oumi-mcp
oumi-mcp
```

Example workflow
Once connected, ask your AI assistant something like:
"Find me a LoRA config for fine-tuning Llama 3.1 8B on my custom dataset"
The assistant will use the MCP tools to:
1. `search_configs(model="llama3_1", query="8b_lora", task="sft")` -- find matching recipes
2. `get_config("llama3_1/sft/8b_lora", include_content=True)` -- retrieve the full YAML
3. Help you customize `model_name`, `datasets`, `output_dir`, etc.
4. `validate_config("/path/to/your/config.yaml", "training")` -- validate before running
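For reference, a LoRA SFT recipe has roughly this shape. This is a hand-written sketch, not an actual Oumi recipe; treat the YAML returned by `get_config` as authoritative, since field names may differ:

```yaml
# Hypothetical sketch -- the recipe returned by get_config is authoritative.
model:
  model_name: "meta-llama/Llama-3.1-8B-Instruct"

data:
  train:
    datasets:
      - dataset_name: "your_org/your_dataset"  # point this at your custom dataset

training:
  use_peft: true
  output_dir: "output/llama31_8b_lora"

peft:
  lora_r: 16
  lora_alpha: 32
```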
Configuration
| Environment variable | Default | Description |
| --- | --- | --- |
| `OUMI_MCP_CONFIGS_DIR` | (unset) | Override the configs directory path |
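For example, to pin the server to a local checkout of the configs (the path below is a placeholder):

```bash
export OUMI_MCP_CONFIGS_DIR="$HOME/oumi/configs"  # placeholder path
oumi-mcp
```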
Project structure
```
oumi-mcp/
  src/oumi_mcp_server/
    __init__.py          # Package metadata
    __main__.py          # python -m entry point
    server.py            # MCP server, tools, resources, config sync
    config_service.py    # Config parsing, search, metadata extraction
    constants.py         # Type definitions and constants
    models.py            # TypedDict data models
    prompts/
      mle_prompt.py      # ML engineering workflow guidance resources
    configs/             # Bundled YAML configs (~500 files)
      recipes/           # Model-specific training recipes
      apis/              # API provider configs
      examples/          # Example configs
  pyproject.toml
```

Development
```bash
# Install in development mode
pip install -e ".[dev]"

# Run the server
oumi-mcp

# Run tests
pytest
```

Versioning
This package follows semantic versioning. The version is independent of the main `oumi` package but tracks compatibility:

- `oumi-mcp` 0.x.y is compatible with `oumi >= 0.6.0`
- Configs are synced from the `oumi` `main` branch and stay current regardless of package version
- Bump the `oumi-mcp` version when the server code, tools, or resources change
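If you want to pin a known-compatible pairing explicitly, something like the following works (the upper bound shown is illustrative):

```bash
pip install "oumi>=0.6.0" "oumi-mcp<1.0"
```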
License
Apache-2.0 -- see the main Oumi repository for details.