---
license: apache-2.0
---

EXL2 quants of Mistral-7B-instruct-v0.3

v0.3's vocabulary is compatible with Mistral-Large-123B, so this model also works as a draft model for Mistral-Large, e.g. for speculative decoding (see the sketch below).
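
A minimal sketch of using one of these quants as a draft model with ExLlamaV2's dynamic generator. The paths, `max_seq_len` and prompt are placeholders, and the exact API may differ between exllamav2 versions:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Target model: an EXL2 quant of Mistral-Large (placeholder path)
config = ExLlamaV2Config("/models/Mistral-Large-123B-exl2")
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, max_seq_len=8192, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

# Draft model: one of the Mistral-7B-instruct-v0.3 quants from this repo (placeholder path)
draft_config = ExLlamaV2Config("/models/Mistral-7B-instruct-v0.3-exl2")
draft_model = ExLlamaV2(draft_config)
draft_cache = ExLlamaV2Cache(draft_model, max_seq_len=8192, lazy=True)
draft_model.load_autosplit(draft_cache)

# The dynamic generator takes an optional draft model/cache pair for speculative decoding
generator = ExLlamaV2DynamicGenerator(
    model=model,
    cache=cache,
    draft_model=draft_model,
    draft_cache=draft_cache,
    tokenizer=tokenizer,
)

print(generator.generate(prompt="[INST] Write a haiku about autumn. [/INST]", max_new_tokens=128))
```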

- 2.80 bits per weight
- 3.00 bits per weight
- 4.00 bits per weight
- 4.50 bits per weight
- 5.00 bits per weight
- 6.00 bits per weight
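
Assuming each bitrate is published on its own branch of this repository (check the branch list for the exact names), a specific quant can be fetched with `huggingface_hub`; the repo id and revision below are placeholders:

```python
from huggingface_hub import snapshot_download

# Placeholder repo id and branch name; substitute the branch for the desired bitrate
local_path = snapshot_download(
    repo_id="turboderp/Mistral-7B-instruct-v0.3-exl2",
    revision="<branch-for-desired-bpw>",
    local_dir="./Mistral-7B-instruct-v0.3-exl2",
)
print(local_path)
```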

The quantization measurement file is included as measurement.json; it can be reused to create quants at other bitrates without rerunning the measurement pass.