---
license: apache-2.0
language:
- en
datasets:
- Open-Orca/FLAN
---
This is a 230M-parameter small Llama model distilled from the original, larger model. It was distilled on Open-Orca's FLAN dataset, using 160,000 randomly selected samples.
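
Below is a minimal sketch of how such a checkpoint could be loaded and queried with the `transformers` library. The repository ID used here is a placeholder, not the actual model path, and should be replaced accordingly.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository ID; substitute the real model path on the Hub.
model_id = "your-username/small-llama-230m-flan"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

# FLAN-style instruction prompt.
prompt = "Translate to German: How are you today?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short greedy completion from the distilled model.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```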