An MCP (Model Context Protocol) server that gives AI coding assistants access to Oumi's library of ~500 ready-to-use YAML configs for fine-tuning LLMs.
When connected to Cursor, Claude Desktop, or any MCP-compatible client, the server lets the AI search for training recipes, retrieve full YAML configs, validate them, and follow guided ML engineering workflows -- all without you having to browse docs manually.
The server exposes 5 tools and 6 resources over MCP:
| Tool | Purpose |
|---|---|
| `get_started()` | Overview of capabilities and a quickstart guide |
| `list_categories()` | Discover available model families and config types |
| `search_configs(query, task, model, keyword)` | Find training configs by filters |
| `get_config(path, include_content)` | Get config details and full YAML content |
| `validate_config(config, task_type)` | Validate a config file before running |
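Conceptually, `search_configs` matches the provided filters against recipe paths like `llama3_1/sft/8b_lora`. Here is a minimal Python sketch of that matching; the logic is illustrative only (the real implementation lives in `config_service.py` and may differ):

```python
# Illustrative sketch of search_configs-style filtering over config paths.
# Matching rules here are assumptions, not the server's actual behavior.

def search_configs(paths, model=None, task=None, keyword=None):
    """Return config paths that satisfy all provided filters."""
    results = []
    for path in paths:
        if model and model not in path:
            continue
        # Treat the task as a path segment, e.g. "sft" in "llama3_1/sft/8b_lora".
        if task and f"/{task}/" not in f"/{path}/":
            continue
        if keyword and keyword not in path:
            continue
        results.append(path)
    return results

configs = [
    "llama3_1/sft/8b_lora",
    "llama3_1/dpo/8b_full",
    "qwen3/sft/4b_qlora",
]
print(search_configs(configs, model="llama3_1", task="sft"))
# -> ['llama3_1/sft/8b_lora']
```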
| Resource | Purpose |
|---|---|
| `guidance://mle-workflow` | End-to-end ML engineering workflow guide |
| `guidance://mle-train` | Training command usage and sizing heuristics |
| `guidance://mle-synth` | Synthetic data generation guidance |
| `guidance://mle-analyze` | Dataset analysis and quality checks |
| `guidance://mle-eval` | Evaluation strategies and benchmarks |
| `guidance://mle-infer` | Inference best practices |
**Supported models:** Llama 3.1/3.2/4, Qwen 3, Phi 4, Gemma 3, DeepSeek R1, SmolLM, and more.

**Supported tasks:** SFT, DPO, GRPO, KTO, LoRA, QLoRA, full fine-tuning, pretraining, evaluation, inference.
Install as an extra of the main `oumi` package:

```shell
pip install "oumi[mcp]"
```

Or install the standalone package:

```shell
pip install oumi-mcp
```
Or install from source:

```shell
git clone https://github.com/oumi-ai/oumi.git
cd oumi/projects/oumi-mcp
pip install -e .
```
Start the server:

```shell
oumi-mcp
```

Or run it as a Python module:

```shell
python -m oumi_mcp_server
```
Add to your Cursor MCP settings (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}
```
Add to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):

```json
{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}
```
The server uses stdio transport by default. Point your MCP client at the `oumi-mcp` command.
The server ships with a bundled snapshot of Oumi's ~500 YAML config files. On startup, it checks for a fresher cached copy and syncs from GitHub if the cache is stale (older than 24 hours). The resolution order is:
1. `OUMI_MCP_CONFIGS_DIR` environment variable (explicit override)
2. `~/.cache/oumi-mcp/configs` (synced from GitHub, refreshed every 24h)
3. The bundled snapshot shipped with the package (fallback)
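The resolution order can be sketched in a few lines of Python. The env-var name and cache path come from the text above; the function itself and the bundled-snapshot location are illustrative, not the server's actual code:

```python
import os
from pathlib import Path

# Fallback: the snapshot bundled with the package (location illustrative).
BUNDLED_CONFIGS = Path("configs")

def resolve_configs_dir(env=None):
    """Pick the configs directory using the documented resolution order."""
    env = os.environ if env is None else env
    override = env.get("OUMI_MCP_CONFIGS_DIR")
    if override:
        return Path(override)                          # 1. explicit override
    cache = Path.home() / ".cache" / "oumi-mcp" / "configs"
    if cache.is_dir():
        return cache                                   # 2. GitHub-synced cache
    return BUNDLED_CONFIGS                             # 3. bundled snapshot
```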
To manually refresh the configs, delete the cache and restart:

```shell
rm -rf ~/.cache/oumi-mcp
oumi-mcp
```
Once connected, ask your AI assistant something like:
"Find me a LoRA config for fine-tuning Llama 3.1 8B on my custom dataset"
The assistant will use the MCP tools to:
1. `search_configs(model="llama3_1", query="8b_lora", task="sft")` -- find matching recipes
2. `get_config("llama3_1/sft/8b_lora", include_content=True)` -- retrieve the full YAML, then adapt `model_name`, `datasets`, `output_dir`, etc. to your setup
3. `validate_config("/path/to/your/config.yaml", "training")` -- validate before running

| Environment variable | Default | Description |
|---|---|---|
| `OUMI_MCP_CONFIGS_DIR` | (unset) | Override the configs directory path |
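The `validate_config` tool checks a config before a run. The following is a hypothetical sketch of the kind of structural checks such a step might perform; the section names (`model`, `data`, `training`) and rules are assumptions, not the server's actual schema:

```python
# Hypothetical sketch of structural validation for a training config.
# Section names and rules are illustrative assumptions only.

REQUIRED_TRAINING_SECTIONS = ("model", "data", "training")

def validate_training_config(config: dict) -> list[str]:
    """Return a list of validation errors (an empty list means valid)."""
    errors = [f"missing section: {s}"
              for s in REQUIRED_TRAINING_SECTIONS if s not in config]
    # If a model section exists, require a model_name inside it.
    if "model" in config and not config["model"].get("model_name"):
        errors.append("model.model_name must be set")
    return errors

cfg = {
    "model": {"model_name": "meta-llama/Llama-3.1-8B"},
    "data": {},
    "training": {"output_dir": "out/"},
}
print(validate_training_config(cfg))  # -> []
```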
```text
oumi-mcp/
  src/oumi_mcp_server/
    __init__.py          # Package metadata
    __main__.py          # python -m entry point
    server.py            # MCP server, tools, resources, config sync
    config_service.py    # Config parsing, search, metadata extraction
    constants.py         # Type definitions and constants
    models.py            # TypedDict data models
    prompts/
      mle_prompt.py      # ML engineering workflow guidance resources
    configs/             # Bundled YAML configs (~500 files)
      recipes/           # Model-specific training recipes
      apis/              # API provider configs
      examples/          # Example configs
  pyproject.toml
```
```shell
# Install in development mode
pip install -e ".[dev]"

# Run the server
oumi-mcp

# Run tests
pytest
```
This package follows semantic versioning. Its version is independent from the main `oumi` package, but the bundled configs track the `main` branch and stay current regardless of package version.

Apache-2.0 -- see the main Oumi repository for details.