tuhinatripathi committed
Commit e0956f9
1 Parent(s): 5edb9fb

tuhinatripathi/gemma2b-lpr-5k

README.md CHANGED
@@ -17,7 +17,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # outputs
 
-This model is a fine-tuned version of [unsloth/gemma-2b-bnb-4bit](https://huggingface.co/unsloth/gemma-2b-bnb-4bit) on the None dataset.
+This model is a fine-tuned version of [unsloth/gemma-2b-bnb-4bit](https://huggingface.co/unsloth/gemma-2b-bnb-4bit) on an unknown dataset.
 
 ## Model description
 
@@ -37,15 +37,15 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0002
-- train_batch_size: 1
+- train_batch_size: 2
 - eval_batch_size: 8
 - seed: 3407
 - gradient_accumulation_steps: 4
-- total_train_batch_size: 4
+- total_train_batch_size: 8
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- lr_scheduler_warmup_steps: 200
-- num_epochs: 1
+- lr_scheduler_warmup_steps: 5
+- training_steps: 240
 - mixed_precision_training: Native AMP
 
 ### Training results
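The training script itself is not part of this commit, but the updated hyperparameters map directly onto `transformers.TrainingArguments`. Below is a minimal sketch of that mapping; the `output_dir` value and the use of `TrainingArguments` at all are assumptions for illustration, not taken from this repo.

```python
# Sketch of the updated hyperparameters, assuming transformers.TrainingArguments;
# everything not listed in the README diff is left at library defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",               # assumed; matches the README title "# outputs"
    learning_rate=2e-4,                 # learning_rate: 0.0002
    per_device_train_batch_size=2,      # train_batch_size: 2 (was 1)
    per_device_eval_batch_size=8,       # eval_batch_size: 8
    gradient_accumulation_steps=4,      # gradient_accumulation_steps: 4
    seed=3407,                          # seed: 3407
    lr_scheduler_type="linear",         # lr_scheduler_type: linear
    warmup_steps=5,                     # lr_scheduler_warmup_steps: 5 (was 200)
    max_steps=240,                      # training_steps: 240 (replaces num_epochs: 1)
    fp16=True,                          # mixed_precision_training: Native AMP
)
```

Note that the effective batch size reported in the README follows from these values: per_device_train_batch_size × gradient_accumulation_steps = 2 × 4 = 8, which matches the new total_train_batch_size.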
adapter_config.json CHANGED
@@ -16,16 +16,16 @@
   "megatron_core": "megatron.core",
   "modules_to_save": null,
   "peft_type": "LORA",
-  "r": 8,
+  "r": 16,
   "rank_pattern": {},
   "revision": "unsloth",
   "target_modules": [
-    "q_proj",
+    "o_proj",
     "down_proj",
     "k_proj",
-    "gate_proj",
-    "o_proj",
     "up_proj",
+    "q_proj",
+    "gate_proj",
     "v_proj"
   ],
   "task_type": "CAUSAL_LM",
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8a1c76c5a586628ba2982a2ce20de8a65a2e4a0a10b4fd97ffcb6295c9e79aec
-size 39256456
+oid sha256:7e1b3fed12110bfba1668f65fb039c64f2eabb700f189820995aaf8da3573ff1
+size 78480072
runs/May13_17-09-36_d79fc51216f5/events.out.tfevents.1715620184.d79fc51216f5.1477.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aec57ec10e8e6cd960c4c064b79cf92d9d90bd615266190eeb32fb1f30d2ecf4
+size 5750
runs/May13_17-59-12_d79fc51216f5/events.out.tfevents.1715623170.d79fc51216f5.1477.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9cfdbb3b68e427109f5c2e84720bf40ac70654eb06a41280a7a64debf8f1cbf2
+size 5122
runs/May13_18-09-58_d79fc51216f5/events.out.tfevents.1715623825.d79fc51216f5.16931.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:851466dd5d4b775bf1cbc91d69712253d3a5413a698ac13aec8d539daa0ca9d3
+size 5894
tokenizer.json CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:62aac1caf8a9d4c3f0bbcea6f3b568dc4c31697217cfb0a518a27db2e4da992a
-size 17518624
+oid sha256:7da53ca29fb16f6b2489482fc0bc6a394162cdab14d12764a1755ebc583fea79
+size 17518525
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3d2f05601b021a3bd3f68a895fdf4e7a91b218c0e03fd940b4435da1bc7d1eb5
+oid sha256:ad292d6eab00c56afa98369b7f0c5d8befeb7905e17752487c16c748ed12618a
 size 4984