---
license: apache-2.0
language:
- en
datasets:
- Open-Orca/FLAN
---

This is a 230M-parameter small Llama model distilled from the original Llama model. It was distilled on Open-Orca's FLAN dataset, using 160,000 randomly sampled FLAN examples.
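
Below is a minimal usage sketch with 🤗 Transformers, assuming the model follows the standard Llama causal-LM layout; the repository ID and generation settings are placeholders, not taken from this card.

```python
# Minimal usage sketch; replace the placeholder repo ID with this model's actual Hub ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/small-llama-230m-flan"  # placeholder repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Summarize the following sentence: The quick brown fox jumps over the lazy dog."
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion; decoding settings here are illustrative only.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```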