Fix links

README.md CHANGED
@@ -24,7 +24,7 @@ tags:
 ---
 
 # Ministral 3 14B Instruct 2512
-The largest model in the Ministral 3 family, **Ministral 3 14B** offers frontier capabilities and performance comparable to its larger [Mistral Small 3.2 24B](https://huggingface.co/mistralai/Mistral-Small-3.2-Instruct-2506) counterpart. A powerful and efficient language model with vision capabilities.
+The largest model in the Ministral 3 family, **Ministral 3 14B** offers frontier capabilities and performance comparable to its larger [Mistral Small 3.2 24B](https://huggingface.co/mistralai/Mistral-Small-3.2-24B-Instruct-2506) counterpart. A powerful and efficient language model with vision capabilities.
 
 This model is the instruct post-trained version in **FP8**, fine-tuned for instruction tasks, making it ideal for chat and instruction based use cases.
 
@@ -146,7 +146,7 @@ To check:
 python -c "import mistral_common; print(mistral_common.__version__)"
 ```
 
-You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/
+You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/tree/main/docker) or on the [docker hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images).
 
 #### Serve
 
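The `mistral_common` version check quoted in the second hunk's context lines can be turned into a small installation guard; a minimal sketch (the helper names `parse_version` and `meets_minimum` are mine, and the naive numeric compare ignores pre-release suffixes such as `rc1`):

```python
from importlib.metadata import PackageNotFoundError, version


def parse_version(ver: str) -> tuple:
    """Split '1.5.4' into (1, 5, 4); non-numeric segments are dropped."""
    return tuple(int(part) for part in ver.split(".") if part.isdigit())


def meets_minimum(package: str, minimum: str) -> bool:
    """True if `package` is installed at a version >= `minimum`."""
    try:
        installed = version(package)
    except PackageNotFoundError:
        # Package not installed at all.
        return False
    return parse_version(installed) >= parse_version(minimum)
```

Tuple comparison handles mixed-length versions naturally, e.g. `parse_version("2.0") < parse_version("2.0.1")`.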
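The docker-image link this commit repairs points at the official vLLM container (`vllm/vllm-openai` per the fixed Docker Hub URL). A rough sketch of how such an image is typically run; the model ID, port, and flags below are my assumptions for illustration, not taken from the README:

```shell
# Pull the OpenAI-compatible vLLM server image referenced in the fixed link.
docker pull vllm/vllm-openai:latest

# Serve the model (model ID and flags are illustrative assumptions;
# --tokenizer-mode mistral makes vLLM tokenize via mistral_common).
docker run --gpus all -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model mistralai/Ministral-3-14B-Instruct-2512 \
  --tokenizer-mode mistral
```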