Update README.md
README.md CHANGED

@@ -30,7 +30,7 @@ This model is the instruct post-trained version, fine-tuned for instruction task

The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 14B can even be deployed locally, capable of fitting in 32GB of VRAM in BF16, and less than 24GB of RAM/VRAM when quantized.

-We provide a no-loss FP8 version [here](https://huggingface.co/mistralai/Ministral-3-
+We provide a no-loss FP8 version [here](https://huggingface.co/mistralai/Ministral-3-14B-Instruct-2512-FP8), you can find other formats and quantizations in the [Ministral 3 - Quants](https://huggingface.co/collections/mistralai/ministral-3-quants) collection.

## Key Features

Ministral 3 14B consists of two main architectural components:
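As a rough sanity check on the memory figures quoted above, here is a back-of-envelope sketch. It assumes a 14e9 parameter count and only counts weight memory (KV cache, activations, and runtime overhead are ignored); none of these numbers are taken from the model card.

```python
# Back-of-envelope weight-memory estimate for a 14B-parameter model.
# Assumption: 14e9 parameters; ignores KV cache, activations, and overhead.

PARAMS = 14e9  # assumed parameter count

def weight_memory_gib(params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB at a given precision."""
    return params * bytes_per_param / 1024**3

print(f"BF16 (2 bytes/param): ~{weight_memory_gib(PARAMS, 2):.1f} GiB")  # ~26 GiB
print(f"FP8  (1 byte/param):  ~{weight_memory_gib(PARAMS, 1):.1f} GiB")  # ~13 GiB
```

Under these assumptions the BF16 weights come to roughly 26 GiB, consistent with the 32GB VRAM figure, and an 8-bit quantization lands around 13 GiB, comfortably under the 24GB figure.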