Smaller version for Home User GPUs
#2
by apcameron - opened
Are you planning to release a smaller version of V3 that could run on a 24GB GPU?
a 70b model would be pretty good
that's something that most home users can't run haha
maybe a 32B MoE model
+1
Would love to see DeepSeek-V3-Lite
A 16B or 27B version would be just wonderful to have
a 70b model would be pretty good
7B LLaMAFile