---
license: cc-by-nc-4.0
language:
- en
tags: 
- text-generation
datasets: 
- sahil2801/CodeAlpaca-20k
pipeline_tag: text-generation
---
This repo contains the full weights (8-bit) for Falcon-7b
fine-tuned on the [Code Alpaca](https://huggingface.co/datasets/sahil2801/CodeAlpaca-20k) dataset.

## Reproduction
This version of the weights was trained with the following hyperparameters:

- Epochs: 6
- Batch size: 128
- Micro batch size: 8
- Learning rate: 3e-4
- LoRA _r_: 16
- LoRA target modules: query_key_value
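
A note on the batch settings: with a batch size of 128 and a micro batch size of 8, each optimizer step accumulates gradients over 16 micro batches (assuming the standard micro-batch/gradient-accumulation scheme used by Alpaca-LoRA-style training scripts):

```python
# Effective gradient accumulation implied by the hyperparameters above.
batch_size = 128       # total examples per optimizer step
micro_batch_size = 8   # examples per forward/backward pass
gradient_accumulation_steps = batch_size // micro_batch_size
print(gradient_accumulation_steps)  # → 16
```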

You can reproduce the training with this repository: 

https://github.com/jina-ai/jerboa

Make sure you install the requirements, then finetune with the following command:

```shell
python finetune.py \
    --base-model='tiiuae/falcon-7b' \
    --num-epochs=6 \
    --output-dir='./jinaai/falcon-7b' \
    --lora-target-modules=query_key_value \
    --lora-r=16 \
    --micro-batch-size=8
```