---
datasets:
- nampdn-ai/tiny-textbooks
- nampdn-ai/tiny-lessons
language:
- en
---
# Omega 2.6B
This model was derived from phi 1.3B by layer stacking: the base model's hidden layers were duplicated to double the layer count. The stacked model was then trained for one epoch on the tiny-textbooks and tiny-lessons datasets.
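
Below is a minimal sketch of one way layer stacking can be done with the `transformers` library. It is not the exact procedure used to build Omega 2.6B (the stacking order is not specified above); it assumes a decoder-only model whose blocks live in `model.model.layers` (true for the integrated Phi implementation in recent `transformers` versions) and uses `microsoft/phi-1_5` as a stand-in for the 1.3B base checkpoint.

```python
# Layer-stacking sketch: duplicate every decoder block to double model depth.
# Attribute path `model.model.layers` and the base checkpoint are assumptions.
import copy

import torch
from transformers import AutoModelForCausalLM


def stack_layers(model):
    """Duplicate each decoder block, doubling the number of hidden layers."""
    layers = model.model.layers  # ModuleList of decoder blocks (assumed path)
    stacked = []
    for layer in layers:
        stacked.append(layer)                  # keep the original block
        stacked.append(copy.deepcopy(layer))   # insert a weight-copied duplicate
    model.model.layers = torch.nn.ModuleList(stacked)
    model.config.num_hidden_layers = len(stacked)  # keep the config consistent
    return model


if __name__ == "__main__":
    base = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5")  # assumed base
    doubled = stack_layers(base)
    print(doubled.config.num_hidden_layers)  # twice the original depth
```

Because only the transformer blocks are duplicated (embeddings and the LM head are not), the parameter count roughly doubles rather than exactly doubling, which is consistent with going from about 1.3B to about 2.6B parameters. The stacked model is then fine-tuned so the duplicated layers can diverge from their copies.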