---
license: mit
language:
- en
tags:
- IFT
---
# **Introduction**
This model is based on LLaMA 2-7B and was fine-tuned on the Alpaca-GPT-4 dataset using LoRA (Low-Rank Adaptation), with the loss computed only on the response portion of each example. The LoRA weights have been merged back into the base model.
## Details
### Used Datasets
- vicgalle/alpaca-gpt4
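
Since the model was instruction-tuned on Alpaca-style data, prompts presumably follow the standard Alpaca template. A minimal sketch of that template (assumed from the Alpaca dataset convention; this card does not specify the exact format, so verify before relying on it):

```python
def build_prompt(instruction: str, input_text: str = "") -> str:
    """Format a prompt in the Alpaca style used by datasets like
    vicgalle/alpaca-gpt4. (Template is an assumption, not confirmed
    by this model card.)"""
    if input_text:
        # Variant with additional context in an "Input" section.
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Variant without an input section.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("Give three tips for staying healthy.")
```

The generated completion would then be read from the text the model produces after the `### Response:` marker.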