---
license: bigcode-openrail-m
datasets:
- tiiuae/falcon-refinedweb
language:
- en
---
This is a ~90M-parameter assistant model for cameloid models such as LLaMA/Alpaca/Vicuna/Guanaco that use the Llama tokenizer, enabling speedups of up to 3x with greedy sampling. It was trained on 5.5 billion tokens of RefinedWeb, uses the GPTBigCode architecture, and has a context window of 1024 tokens. For usage, see the blog post on assisted generation: https://huggingface.co/blog/assisted-generation.
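
A minimal usage sketch with the `transformers` library is shown below. The model ids are assumptions: `"<this-repo-id>"` is a placeholder for this checkpoint's repository id, and `"huggyllama/llama-7b"` stands in for any Llama-tokenizer target model. Running it requires downloading both checkpoints.

```python
# Sketch of assisted generation: a small assistant model drafts tokens
# that the larger target model then verifies in a single forward pass.
from transformers import AutoModelForCausalLM, AutoTokenizer

target_id = "huggyllama/llama-7b"  # placeholder: any cameloid model using the llama tokenizer
assistant_id = "<this-repo-id>"    # placeholder: this ~90M GPTBigCode assistant model

tokenizer = AutoTokenizer.from_pretrained(target_id)
model = AutoModelForCausalLM.from_pretrained(target_id)
assistant = AutoModelForCausalLM.from_pretrained(assistant_id)

inputs = tokenizer("The capital of France is", return_tensors="pt")

# Passing assistant_model enables assisted generation; greedy decoding
# (do_sample=False) is where the up-to-3x speedup applies.
outputs = model.generate(
    **inputs,
    assistant_model=assistant,
    do_sample=False,
    max_new_tokens=32,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the assistant only proposes candidate continuations that the target model checks, the generated text is identical to what greedy decoding on the target model alone would produce.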