## export_model
Converts a fine-tuned model into a specific format, such as GGUF, Ollama, vLLM, or Hugging Face, for deployment or further use. Specify the input and output paths, along with quantization details if required.
### Instructions
Export a fine-tuned Unsloth model to various formats
### Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| export_format | Yes | Format to export to (gguf, ollama, vllm, huggingface) | |
| model_path | Yes | Path to the fine-tuned model | |
| output_path | Yes | Path to save the exported model | |
| quantization_bits | No | Bits for quantization (for GGUF export) | |
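
As an illustration, a GGUF export with 4-bit quantization might be requested with an argument set like the one below. This is a minimal sketch based on the schema above; the paths and values are hypothetical, not part of the tool definition.

```python
# Hypothetical arguments for the export_model tool; paths are illustrative only.
arguments = {
    "export_format": "gguf",                        # one of: gguf, ollama, vllm, huggingface
    "model_path": "./outputs/finetuned-model",      # fine-tuned model to export
    "output_path": "./exports/finetuned-model-q4",  # where the exported model is written
    "quantization_bits": 4,                         # only consulted for GGUF export
}
```

The `quantization_bits` field can be omitted for formats other than GGUF, since it only affects GGUF quantization.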