AQLM
Official AQLM quantization of `CohereForAI/c4ai-command-r-plus`.
For this quantization, we used 1 codebook of 16 bits.
Results:
| Model | Quantization | MMLU (5-shot) | Model size, GB |
|---|---|---|---|
| CohereForAI/c4ai-command-r-plus | None | 0.7425 | 208 |
| CohereForAI/c4ai-command-r-plus | 1x16 | 0.6795 | 31.9 |
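As a quick sanity check on the numbers above, a 1x16 scheme (one 16-bit codebook) works out to roughly 2 bits per weight, assuming the usual AQLM group size of 8 weights per code (an assumption here, not stated in this card):

```python
# Back-of-envelope check of the 1x16 AQLM scheme.
codebooks = 1
codebook_bits = 16
group_size = 8  # weights encoded per code; standard AQLM default (assumption)

bits_per_weight = codebooks * codebook_bits / group_size
print(bits_per_weight)  # 2.0 bits per weight

# Implied compression ratio against an fp16 checkpoint
fp16_bits = 16
print(fp16_bits / bits_per_weight)  # 8.0
```

The measured ratio in the table (208 GB / 31.9 GB ≈ 6.5x) is lower than the ideal 8x because parts of the model, such as embeddings and normalization layers, are typically kept unquantized.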