model_family: olmo-7b
LoRA:
  r: 0          # rank 0 disables the LoRA adapters (full-parameter fine-tuning, matching "noLORA" in save_dir)
  alpha: 32
  dropout: 0.05
data_path: locuslab/TOFU
split: full
batch_size: 8
gradient_accumulation_steps: 4
num_epochs: 10
lr: 2.0e-06
seed: 42
run_index: 1
save_dir: paper_models/final_ft_noLORA_${num_epochs}_epochs_inst_lr${lr}_${model_family}_${split}_seed${seed}_${run_index}/
weight_decay: 0.01
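
The ${...} references in save_dir are filled in from the other keys in this same file. Below is a minimal sketch of how that resolution could look, assuming the training script loads the config with OmegaConf; the loader and the "config.yaml" filename are assumptions, not confirmed by this file.

from omegaconf import OmegaConf

# Load the config shown above; interpolations stay lazy until a field is read.
cfg = OmegaConf.load("config.yaml")  # hypothetical path to this file

# Reading save_dir substitutes ${num_epochs}, ${lr}, ${model_family},
# ${split}, ${seed} and ${run_index} with the values defined above.
print(cfg.save_dir)
# e.g. paper_models/final_ft_noLORA_10_epochs_inst_lr2e-06_olmo-7b_full_seed42_1/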