pandora-s committed · verified
Commit a39d2fc · 1 Parent(s): 4abb1c0

Update README.md

Files changed (1): README.md (+4 −3)
README.md CHANGED
@@ -30,6 +30,8 @@ This model is the instruct post-trained version, fine-tuned for instruction task
 
 The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 14B can even be deployed locally, capable of fitting in 32GB of VRAM in BF16, and less than 24GB of RAM/VRAM when quantized.
 
+We provide a no-loss FP8 version [here](https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512-FP8); you can find other formats and quantizations in the [Ministral 3 - Quants](https://huggingface.co/collections/mistralai/ministral-3-quants) collection.
+
 ## Key Features
 Ministral 3 14B consists of two main architectural components:
 - **13.5B Language Model**
@@ -59,17 +61,16 @@ Bringing advanced AI capabilities to most environments.
 |--------------------------------|--------------------|-----------|------------------------------------------------------------------------------------------|
 | Ministral 3 3B Base 2512 | Base pre-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Base-2512) |
 | Ministral 3 3B Instruct 2512 | Instruct post-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512) |
-| Ministral 3 3B Instruct 2512 FP8 | Instruct post-trained | FP8 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512-FP8) |
 | Ministral 3 3B Reasoning 2512 | Reasoning capable | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-3B-Reasoning-2512) |
 | Ministral 3 8B Base 2512 | Base pre-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Base-2512) |
 | Ministral 3 8B Instruct 2512 | Instruct post-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Instruct-2512) |
-| Ministral 3 8B Instruct 2512 FP8 | Instruct post-trained | FP8 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Instruct-2512-FP8) |
 | Ministral 3 8B Reasoning 2512 | Reasoning capable | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-8B-Reasoning-2512) |
 | Ministral 3 14B Base 2512 | Base pre-trained | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-14B-Base-2512) |
 | **Ministral 3 14B Instruct 2512** | **Instruct post-trained** | **BF16** | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-14B-Instruct-2512) |
-| Ministral 3 14B Instruct 2512 FP8 | Instruct post-trained | FP8 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-14B-Instruct-2512-FP8) |
 | Ministral 3 14B Reasoning 2512 | Reasoning capable | BF16 | [Hugging Face](https://huggingface.co/mistralai/Ministral-3-14B-Reasoning-2512) |
 
+Other formats available [here](https://huggingface.co/collections/mistralai/ministral-3-quants).
+
 ## Benchmark Results
 
 We compare Ministral 3 to similar sized models.
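The VRAM figures quoted in the README (32GB in BF16, under 24GB quantized) can be sanity-checked with weights-only back-of-the-envelope arithmetic; this sketch ignores KV cache, activations, and runtime overhead, and assumes the 13.5B language-model parameter count stated above:

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight-only memory footprint in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

N_PARAMS = 13.5e9  # language-model parameter count from the README

bf16 = weight_memory_gb(N_PARAMS, 2.0)  # BF16 stores 2 bytes per parameter
fp8 = weight_memory_gb(N_PARAMS, 1.0)   # FP8 stores 1 byte per parameter

print(f"BF16 weights: {bf16:.1f} GB")  # 27.0 GB, consistent with the 32GB VRAM claim
print(f"FP8 weights:  {fp8:.1f} GB")   # 13.5 GB, well under the 24GB quantized claim
```

The gap between 27.0 GB of weights and the 32GB BF16 budget is what's left for the KV cache and activations, which is why the headroom shrinks with longer contexts.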