
Unsloth MCP Server

by OtotaO

export_model

Converts a fine-tuned model to a specific format (GGUF, Ollama, vLLM, or Hugging Face) for deployment or further use. Specify the input and output paths, along with quantization settings if required.

Instructions

Export a fine-tuned Unsloth model to various formats

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| export_format | Yes | Format to export to (gguf, ollama, vllm, huggingface) | |
| model_path | Yes | Path to the fine-tuned model | |
| output_path | Yes | Path to save the exported model | |
| quantization_bits | No | Bits for quantization (for GGUF export) | |

Input Schema (JSON Schema)

{
  "type": "object",
  "properties": {
    "export_format": {
      "description": "Format to export to (gguf, ollama, vllm, huggingface)",
      "enum": ["gguf", "ollama", "vllm", "huggingface"],
      "type": "string"
    },
    "model_path": {
      "description": "Path to the fine-tuned model",
      "type": "string"
    },
    "output_path": {
      "description": "Path to save the exported model",
      "type": "string"
    },
    "quantization_bits": {
      "description": "Bits for quantization (for GGUF export)",
      "type": "number"
    }
  },
  "required": ["model_path", "export_format", "output_path"]
}
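To make the schema concrete, here is a small sketch of validating an `export_model` argument payload against the required fields and enum above, using only the Python standard library. The helper name `validate_export_args` and the example paths are illustrative and not part of the Unsloth MCP server itself.

```python
# Illustrative validator for export_model tool arguments, derived from the
# JSON Schema above. Field names and constraints mirror the schema; the
# function itself is a hypothetical helper, not server code.

EXPORT_FORMATS = {"gguf", "ollama", "vllm", "huggingface"}
REQUIRED = {"model_path", "export_format", "output_path"}

def validate_export_args(args: dict) -> list:
    """Return a list of validation errors; an empty list means the call is valid."""
    errors = [f"missing required field: {k}" for k in sorted(REQUIRED - args.keys())]
    fmt = args.get("export_format")
    if fmt is not None and fmt not in EXPORT_FORMATS:
        errors.append(f"export_format must be one of {sorted(EXPORT_FORMATS)}")
    bits = args.get("quantization_bits")
    if bits is not None and not isinstance(bits, (int, float)):
        errors.append("quantization_bits must be a number")
    return errors

# Example payload for a GGUF export with 4-bit quantization
# (paths are placeholders, not paths the server requires)
example = {
    "model_path": "./outputs/my-finetuned-model",
    "export_format": "gguf",
    "output_path": "./exports/my-finetuned-model-q4",
    "quantization_bits": 4,
}
print(validate_export_args(example))  # []
```

Note that `quantization_bits` is only consulted for GGUF exports, so it can be omitted entirely for the other formats.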

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/OtotaO/unsloth-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.