# Sheared LLaMA on Deita-6K

This is [Sheared-LLaMA-2.7B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B) supervised fine-tuned on [Deita-6k](https://huggingface.co/datasets/hkust-nlp/deita-6k-v0) for 2 epochs.

This model is intended as a fair-comparison baseline for [LLaMA-MoE-v1-sft](https://huggingface.co/llama-moe/LLaMA-MoE-v1-3_5B-2_8-sft), which was fine-tuned with the same data.