jukofyork committed on
Commit
69a082a
1 Parent(s): ffcaa1d

Update README.md

Files changed (1)
  1. README.md +10 -2
README.md CHANGED
@@ -16,7 +16,7 @@ This is a fixed version of [Eurus-70b-nca](https://huggingface.co/openbmb/Eurus-
 <s>[INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
 ```
 
-This fixed version has a context length of 16k and a RoPE base frequency of 1000000:
+This version has been fixed to have a context length of 16k and a RoPE base frequency of 1000000:
 
 ```
 > ./perplexity -m eurus:70b-nca-q8_0.gguf -f wiki.test.raw -c 4096
@@ -28,7 +28,15 @@ Final estimate: PPL = 5.5200 +/- 0.03000
 Final estimate: PPL = 5.3553 +/- 0.02877
 ```
 
-I have also tested it with multi-turn conversations for 10k+ context and it has remained perfectly coherent.
+I have tested it with multi-turn conversations for 10k+ context and it has remained perfectly coherent.
+
+It even looks to be fine for use with a context length of 32k:
+
+```
+> ./perplexity -m eurus:70b-nca-q8_0.gguf -f wiki.test.raw -c 32768
+
+Final estimate: PPL = 5.1806 +/- 0.02725
+```
 
 ---
 
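
For anyone loading a GGUF that still carries the old metadata, the settings described in the diff above (16k context, RoPE base frequency 1000000) can also be forced at load time. The sketch below is only illustrative and is not taken from this commit: it reuses the README's own model filename and the standard llama.cpp options `-c`/`--ctx-size` and `--rope-freq-base`.

```
# Override context length and RoPE base frequency at load time (values from the README above):
> ./perplexity -m eurus:70b-nca-q8_0.gguf -f wiki.test.raw -c 16384 --rope-freq-base 1000000
```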