---
license: apache-2.0
base_model: mistralai/Ministral-3-14B-Reasoning-2512
pipeline_tag: image-text-to-text
---

# Ministral-3-14B-Reasoning-2512-GGUF

This model is converted from [mistralai/Ministral-3-14B-Reasoning-2512](https://huggingface.co/mistralai/Ministral-3-14B-Reasoning-2512) to GGUF using `convert_hf_to_gguf.py`.

To use it with llama.cpp:

```
llama-server -hf ggml-org/Ministral-3-14B-Reasoning-2512-GGUF
```
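
Once `llama-server` is running, it exposes an OpenAI-compatible HTTP API. The snippet below is a minimal sketch of sending a chat request from Python, assuming the server is listening on the default `http://localhost:8080` (adjust if you passed `--host`/`--port`) and that the `openai` client package is installed; the `model` value is just a placeholder, since the server answers with whichever model it has loaded.

```python
# Minimal sketch: query the local llama-server via its OpenAI-compatible API.
# Assumes the server was started with the command above and listens on the
# default http://localhost:8080.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # llama-server's OpenAI-compatible endpoint
    api_key="no-key-required",            # llama-server does not require an API key by default
)

response = client.chat.completions.create(
    model="Ministral-3-14B-Reasoning-2512-GGUF",  # placeholder; the server uses its loaded model
    messages=[
        {"role": "user", "content": "Explain why the sky is blue in two sentences."},
    ],
)

print(response.choices[0].message.content)
```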