Transformers
mpt
Composer
MosaicML
llm-foundry
TheBloke committed on
Commit 161100d · 1 Parent(s): 30c843f

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -56,10 +56,10 @@ Below is an instruction that describes a task. Write a response that appropriate
 
 ## A note regarding context length: 8K
 
-It is confirmed that the 8K context of this model works in [KoboldCpp](https://github.com/LostRuins/koboldcpp), if you manually set max context to 8K by adjusting the text box above the slider:
-![.](https://s3.amazonaws.com/moonup/production/uploads/63cd4b6d1c8a5d1d7d76a778/LcoIOa7YdDZa-R-R4BWYw.png)
+The base model has an 8K context length. It is not yet confirmed if the 8K context of this model works with the quantised files.
 
-(set it to 8192 at most)
+If it does, [KoboldCpp](https://github.com/LostRuins/koboldcpp) supports 8K context if you manually set it to 8K by adjusting the text box above the slider:
+![.](https://i.imgur.com/tEbpeJq.png)
 
 It is currently unknown as to whether it is compatible with other clients.
 
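The note above describes setting the context size through the KoboldCpp UI slider. As a sketch, the same setting can be passed at launch via KoboldCpp's `--contextsize` flag; the model filename below is illustrative, not from this repo:

```shell
# Sketch: launch KoboldCpp with an 8K context window from the command line,
# instead of adjusting the text box above the slider in the UI.
# The model path is a placeholder -- substitute your downloaded quantised file.
python koboldcpp.py models/your-model.ggmlv3.q4_0.bin --contextsize 8192
```

As with the UI route, 8192 is the maximum that should be set here, matching the base model's 8K context length.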