---
license: apache-2.0
language:
  - en
datasets:
  - Open-Orca/FLAN
---

This is a 230M-parameter small Llama model distilled from the original Llama model. The distillation was performed on OpenOrca's FLAN dataset, using 160,000 randomly sampled FLAN examples.
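The card does not specify the distillation objective, so the following is only a generic sketch of the soft-target (logit-matching) distillation loss commonly used for this kind of setup, written in PyTorch with toy tensors standing in for real student and teacher logits:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened student and teacher distributions."""
    # Soften both distributions with the temperature.
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable to the hard-label loss.
    return F.kl_div(log_student, teacher_probs, reduction="batchmean") * temperature ** 2

# Toy example: a batch of 4 token positions over a 32-entry vocabulary.
student = torch.randn(4, 32)
teacher = torch.randn(4, 32)
loss = distillation_loss(student, teacher)
```

In practice the teacher's logits would come from the larger Llama model and the student would be the 230M model, with this loss (often combined with the standard language-modeling loss) minimized over the FLAN samples.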