# export_model
Convert a fine-tuned model to a deployment-ready format (gguf, ollama, vllm, or huggingface), specifying the model path, output path, and optional quantization bits.
## Instructions

Export a fine-tuned Unsloth model to various formats.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| export_format | Yes | Format to export to (gguf, ollama, vllm, huggingface) | |
| model_path | Yes | Path to the fine-tuned model | |
| output_path | Yes | Path to save the exported model | |
| quantization_bits | No | Bits for quantization (for GGUF export) | |
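For illustration, a well-formed argument object for this tool might look like the following; the paths are hypothetical placeholders, not outputs of any particular run:

```json
{
  "model_path": "outputs/my-finetune",
  "export_format": "gguf",
  "output_path": "outputs/my-finetune-gguf",
  "quantization_bits": 4
}
```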
## Input Schema (JSON Schema)

```json
{
  "properties": {
    "export_format": {
      "description": "Format to export to (gguf, ollama, vllm, huggingface)",
      "enum": [
        "gguf",
        "ollama",
        "vllm",
        "huggingface"
      ],
      "type": "string"
    },
    "model_path": {
      "description": "Path to the fine-tuned model",
      "type": "string"
    },
    "output_path": {
      "description": "Path to save the exported model",
      "type": "string"
    },
    "quantization_bits": {
      "description": "Bits for quantization (for GGUF export)",
      "type": "number"
    }
  },
  "required": [
    "model_path",
    "export_format",
    "output_path"
  ],
  "type": "object"
}
```
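As a sketch of what the schema above implies on the client side, the helper below checks an argument object against the required fields, the `export_format` enum, and the `quantization_bits` type before a call is sent. The `validate_args` function and the example paths are illustrative assumptions, not part of any published API:

```python
# Client-side validation mirroring the export_model JSON Schema.
# validate_args is a hypothetical helper for illustration only.

REQUIRED = {"model_path", "export_format", "output_path"}
FORMATS = {"gguf", "ollama", "vllm", "huggingface"}


def validate_args(args: dict) -> list[str]:
    """Return a list of problems; an empty list means the call is well-formed."""
    errors = [f"missing required field: {f}" for f in sorted(REQUIRED - args.keys())]
    fmt = args.get("export_format")
    if fmt is not None and fmt not in FORMATS:
        errors.append(f"unsupported export_format: {fmt}")
    bits = args.get("quantization_bits")
    if bits is not None and not isinstance(bits, (int, float)):
        errors.append("quantization_bits must be a number")
    return errors


# Hypothetical example call: 4-bit GGUF export of a local fine-tune.
args = {
    "model_path": "outputs/my-finetune",
    "export_format": "gguf",
    "output_path": "outputs/my-finetune-gguf",
    "quantization_bits": 4,
}
print(validate_args(args))  # → []
```

Note that `quantization_bits` is only meaningful for GGUF export; for the other formats it can simply be omitted, since it is not in the `required` list.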