Context size of 7B model?
#2 opened by Thomasjh487
I saw that the 32B model has a context size of 16k tokens. I'm just wondering what the context size of this version of the model is?
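In case it's useful to anyone else wondering the same thing, one way to check the configured context window yourself is to read it from the repo's config.json. A minimal sketch using transformers, where the repo id below is just a placeholder, not the actual model name:

```python
from transformers import AutoConfig

# Placeholder repo id; substitute the actual model repository.
config = AutoConfig.from_pretrained("org/model-7b")

# Most causal LMs expose the context window as max_position_embeddings.
print(getattr(config, "max_position_embeddings", "not set in config"))
```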
Also, what hardware is required to run the 32B version of the model?
I've been playing around with the 7B, and I'm very impressed: the thought process, the speed (running on a 3080 Ti 12GB), and the depth/length of the answers are all excellent.
Excellent work on this!