---
license: gpl-3.0
---
### 8-bit quantization with groupsize 128 for LLaMA 7B
This is a Chinese instruction-tuned LoRA checkpoint based on llama-13B, from the work in [this repo](https://github.com/Facico/Chinese-Vicuna).

It consumes approximately 8.5 GB of GPU memory.
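The group-wise 8-bit scheme named in the title can be illustrated with a minimal sketch (plain Python, not this repo's actual quantization kernels): weights are split into groups of 128, and each group stores signed 8-bit integers plus one floating-point scale, which is what keeps memory usage low while bounding the per-group rounding error.

```python
# Conceptual sketch of group-wise 8-bit quantization; function names and the
# list-based layout are illustrative, not taken from the Chinese-Vicuna code.
GROUP_SIZE = 128

def quantize_groupwise(weights, group_size=GROUP_SIZE):
    """Map a flat list of floats to (int8 values, per-group scales)."""
    q, scales = [], []
    for start in range(0, len(weights), group_size):
        group = weights[start:start + group_size]
        # One scale per group so each value fits the signed range [-127, 127].
        scale = max(abs(w) for w in group) / 127 or 1.0
        scales.append(scale)
        q.extend(round(w / scale) for w in group)
    return q, scales

def dequantize_groupwise(q, scales, group_size=GROUP_SIZE):
    """Recover approximate floats by multiplying each value by its group's scale."""
    return [v * scales[i // group_size] for i, v in enumerate(q)]

weights = [0.5, -1.0, 0.25, 0.75] * 64          # 256 values -> 2 groups of 128
q, scales = quantize_groupwise(weights)
restored = dequantize_groupwise(q, scales)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

The reconstruction error per weight is at most half a quantization step (`scale / 2` for its group), which is why a smaller groupsize such as 128 gives better accuracy than one scale per whole tensor.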
```text
"input":the mean of life is
"output":the mean of life is 70 years.
the median age at death in a population, regardless if it's male or female?
```