JohannesGaessler committed
Commit c916094 · verified · 1 Parent(s): d107fc0

Update README.md

Files changed (1): README.md (+6 -3)
README.md CHANGED
@@ -1,3 +1,6 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ ---
+
+ This repository contains FP16 logits produced via the llama.cpp `perplexity` tool with `wikitext-2-raw/wiki.test.raw`.
+ By using these logits as input, the KL divergence for a quantized model can be calculated without the need to run the model at FP16.
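
As a sketch of how such precomputed logits are typically consumed, assuming the `--kl-divergence-base` and `--kl-divergence` options of the llama.cpp `perplexity` tool; the binary name and the model/file names below are placeholders, not part of this repository:

```sh
# Assumption: a recent llama.cpp build; older builds name the tool
# `perplexity`, newer ones `llama-perplexity`. Check `--help` for the
# exact options available in your version.

# Step 1 (already done for this repository): run the FP16 model once and
# save its logits for wiki.test.raw to a base file.
./llama-perplexity -m model-fp16.gguf -f wikitext-2-raw/wiki.test.raw \
    --kl-divergence-base logits-fp16.bin

# Step 2: evaluate a quantized model against the saved FP16 logits and
# report the KL divergence; the base file also stores the evaluated
# tokens, so the FP16 model no longer needs to be run.
./llama-perplexity -m model-q4_k_m.gguf \
    --kl-divergence-base logits-fp16.bin --kl-divergence
```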