Mixed-precision quantizations of Latitude Games' 12B models, using NVFP4+4/6 for MLP layers and FP8_DYNAMIC for self-attention.
Jesse Mitchell
DataSnake
Recent Activity
updated a collection about 2 hours ago: Roleplaying Quants