tsavage68 committed (verified)
Commit 9bba860 · 1 Parent(s): 7e1ca94

End of training

README.md CHANGED
@@ -1,6 +1,6 @@
  ---
  license: apache-2.0
- base_model: tsavage68/UTI_M2_1000steps_1e5rate_SFT
+ base_model: tsavage68/UTI_M2_1000steps_1e7rate_SFT
  tags:
  - trl
  - dpo
@@ -15,17 +15,17 @@ should probably proofread and complete it, then remove this comment. -->

  # UTI_M2_1000steps_1e5rate_03beta_CSFTDPO

- This model is a fine-tuned version of [tsavage68/UTI_M2_1000steps_1e5rate_SFT](https://huggingface.co/tsavage68/UTI_M2_1000steps_1e5rate_SFT) on an unknown dataset.
+ This model is a fine-tuned version of [tsavage68/UTI_M2_1000steps_1e7rate_SFT](https://huggingface.co/tsavage68/UTI_M2_1000steps_1e7rate_SFT) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0693
- - Rewards/chosen: -2.2122
- - Rewards/rejected: -25.2896
- - Rewards/accuracies: 0.9000
- - Rewards/margins: 23.0774
- - Logps/rejected: -128.4648
- - Logps/chosen: -27.6685
- - Logits/rejected: -3.7624
- - Logits/chosen: -3.6360
+ - Loss: 0.6931
+ - Rewards/chosen: 0.0
+ - Rewards/rejected: 0.0
+ - Rewards/accuracies: 0.0
+ - Rewards/margins: 0.0
+ - Logps/rejected: 0.0
+ - Logps/chosen: 0.0
+ - Logits/rejected: -2.7147
+ - Logits/chosen: -2.7147

  ## Model description

@@ -59,26 +59,46 @@ The following hyperparameters were used during training:

  | Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
  |:-------------:|:-------:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
- | 0.0253 | 0.6667 | 50 | 0.0693 | -1.0069 | -24.2702 | 0.9000 | 23.2633 | -125.0668 | -23.6509 | -3.7670 | -3.6402 |
- | 0.0173 | 1.3333 | 100 | 0.0693 | -1.0516 | -24.3164 | 0.9000 | 23.2647 | -125.2207 | -23.8000 | -3.7667 | -3.6399 |
- | 0.104 | 2.0 | 150 | 0.0693 | -1.7358 | -24.8843 | 0.9000 | 23.1485 | -127.1138 | -26.0806 | -3.7640 | -3.6373 |
- | 0.052 | 2.6667 | 200 | 0.0693 | -1.8400 | -24.9713 | 0.9000 | 23.1313 | -127.4039 | -26.4280 | -3.7637 | -3.6370 |
- | 0.052 | 3.3333 | 250 | 0.0693 | -1.9058 | -25.0293 | 0.9000 | 23.1235 | -127.5972 | -26.6473 | -3.7636 | -3.6369 |
- | 0.0866 | 4.0 | 300 | 0.0693 | -1.9126 | -25.0284 | 0.9000 | 23.1158 | -127.5940 | -26.6699 | -3.7635 | -3.6369 |
- | 0.0347 | 4.6667 | 350 | 0.0693 | -1.9285 | -25.0452 | 0.9000 | 23.1167 | -127.6501 | -26.7230 | -3.7635 | -3.6369 |
- | 0.0866 | 5.3333 | 400 | 0.0693 | -2.0129 | -25.1139 | 0.9000 | 23.1009 | -127.8790 | -27.0044 | -3.7631 | -3.6366 |
- | 0.0866 | 6.0 | 450 | 0.0693 | -2.0711 | -25.1660 | 0.9000 | 23.0949 | -128.0527 | -27.1981 | -3.7629 | -3.6364 |
- | 0.1213 | 6.6667 | 500 | 0.0693 | -2.0881 | -25.1817 | 0.9000 | 23.0936 | -128.1050 | -27.2549 | -3.7629 | -3.6363 |
- | 0.0866 | 7.3333 | 550 | 0.0693 | -2.1580 | -25.2446 | 0.9000 | 23.0865 | -128.3148 | -27.4881 | -3.7626 | -3.6360 |
- | 0.104 | 8.0 | 600 | 0.0693 | -2.1627 | -25.2452 | 0.9000 | 23.0825 | -128.3167 | -27.5036 | -3.7627 | -3.6361 |
- | 0.0173 | 8.6667 | 650 | 0.0693 | -2.1675 | -25.2546 | 0.9000 | 23.0871 | -128.3482 | -27.5197 | -3.7625 | -3.6360 |
- | 0.052 | 9.3333 | 700 | 0.0693 | -2.1889 | -25.2768 | 0.9000 | 23.0879 | -128.4222 | -27.5908 | -3.7624 | -3.6360 |
- | 0.0866 | 10.0 | 750 | 0.0693 | -2.2010 | -25.2811 | 0.9000 | 23.0801 | -128.4364 | -27.6313 | -3.7624 | -3.6360 |
- | 0.0866 | 10.6667 | 800 | 0.0693 | -2.2118 | -25.2906 | 0.9000 | 23.0788 | -128.4683 | -27.6673 | -3.7625 | -3.6360 |
- | 0.052 | 11.3333 | 850 | 0.0693 | -2.2122 | -25.2871 | 0.9000 | 23.0749 | -128.4565 | -27.6685 | -3.7625 | -3.6360 |
- | 0.0693 | 12.0 | 900 | 0.0693 | -2.2123 | -25.2854 | 0.9000 | 23.0730 | -128.4507 | -27.6690 | -3.7624 | -3.6360 |
- | 0.0693 | 12.6667 | 950 | 0.0693 | -2.2114 | -25.2887 | 0.9000 | 23.0773 | -128.4618 | -27.6660 | -3.7624 | -3.6360 |
- | 0.0866 | 13.3333 | 1000 | 0.0693 | -2.2122 | -25.2896 | 0.9000 | 23.0774 | -128.4648 | -27.6685 | -3.7624 | -3.6360 |
+ | 0.6931 | 0.3333 | 25 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 0.6667 | 50 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 1.0 | 75 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 1.3333 | 100 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 1.6667 | 125 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 2.0 | 150 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 2.3333 | 175 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 2.6667 | 200 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 3.0 | 225 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 3.3333 | 250 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 3.6667 | 275 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 4.0 | 300 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 4.3333 | 325 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 4.6667 | 350 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 5.0 | 375 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 5.3333 | 400 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 5.6667 | 425 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 6.0 | 450 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 6.3333 | 475 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 6.6667 | 500 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 7.0 | 525 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 7.3333 | 550 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 7.6667 | 575 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 8.0 | 600 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 8.3333 | 625 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 8.6667 | 650 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 9.0 | 675 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 9.3333 | 700 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 9.6667 | 725 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 10.0 | 750 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 10.3333 | 775 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 10.6667 | 800 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 11.0 | 825 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 11.3333 | 850 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 11.6667 | 875 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 12.0 | 900 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 12.3333 | 925 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 12.6667 | 950 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 13.0 | 975 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |
+ | 0.6931 | 13.3333 | 1000 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -2.7147 | -2.7147 |


  ### Framework versions
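
A note on the updated evaluation numbers: in DPO, Rewards/chosen and Rewards/rejected are the β-scaled log-probability ratios of the policy against its reference model, and the loss is the negative log-sigmoid of their margin. With every reward and margin reported as 0.0, the loss reduces to −log σ(0) = log 2 ≈ 0.6931, which is exactly the constant value in the new table, i.e. the policy scores chosen and rejected completions identically to the reference.

$$
\mathcal{L}_{\mathrm{DPO}} = -\log \sigma\!\left(\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}\right), \qquad -\log \sigma(0) = \log 2 \approx 0.6931
$$
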
config.json CHANGED
@@ -1,5 +1,5 @@
  {
- "_name_or_path": "tsavage68/UTI_M2_1000steps_1e5rate_SFT",
+ "_name_or_path": "tsavage68/UTI_M2_1000steps_1e7rate_SFT",
  "architectures": [
  "MistralForCausalLM"
  ],
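
The config change only swaps the recorded base-model path; the architecture remains MistralForCausalLM, sharded across the three safetensors files below (roughly 14.5 GB in total). A minimal loading sketch with transformers follows; the repository id is an assumption taken from the model card title, not confirmed by this commit.

```python
# Minimal sketch: load the DPO-trained checkpoint with transformers.
# repo_id is an assumption based on the model card title.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "tsavage68/UTI_M2_1000steps_1e5rate_03beta_CSFTDPO"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # the three weight shards total ~14.5 GB on disk
    device_map="auto",           # requires the accelerate package
)

# Hypothetical Mistral-style instruction prompt; the actual template used in training is not documented here.
prompt = "[INST] Describe first-line treatment considerations for an uncomplicated UTI. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
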
final_checkpoint/config.json CHANGED
@@ -1,5 +1,5 @@
  {
- "_name_or_path": "tsavage68/UTI_M2_1000steps_1e5rate_SFT",
+ "_name_or_path": "tsavage68/UTI_M2_1000steps_1e7rate_SFT",
  "architectures": [
  "MistralForCausalLM"
  ],
final_checkpoint/model-00001-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:86e97b32b9bdee2a762123ecf718420288036ef1ba23b4036f99777f56f57fd7
+ oid sha256:9aa2e9687a5e5d24a999a996e9fe4c2bc1cf34ad347da5dc5c7e0adffcb14982
  size 4943162240
final_checkpoint/model-00002-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:925d7ef4b23b4e7cf977352d95d2f880c5945a070fdfaaf70fc76f804b96a1be
+ oid sha256:268bb18cc8bbff53c912fa3961a6281dd5c163edd1b8e5c85c9b12e87e4e3a63
  size 4999819232
final_checkpoint/model-00003-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0567b754e909d6e6e784aa7eefc0bc8e1239e9a04d7a7d429d28779ad70cb5c7
+ oid sha256:bbc021dcf68d9e7ddaab0ead255721e73b7f652e3bfd34985bba6c029e0b729c
  size 4540516256
model-00001-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:86e97b32b9bdee2a762123ecf718420288036ef1ba23b4036f99777f56f57fd7
+ oid sha256:9aa2e9687a5e5d24a999a996e9fe4c2bc1cf34ad347da5dc5c7e0adffcb14982
  size 4943162240
model-00002-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:925d7ef4b23b4e7cf977352d95d2f880c5945a070fdfaaf70fc76f804b96a1be
+ oid sha256:268bb18cc8bbff53c912fa3961a6281dd5c163edd1b8e5c85c9b12e87e4e3a63
  size 4999819232
model-00003-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0567b754e909d6e6e784aa7eefc0bc8e1239e9a04d7a7d429d28779ad70cb5c7
+ oid sha256:bbc021dcf68d9e7ddaab0ead255721e73b7f652e3bfd34985bba6c029e0b729c
  size 4540516256
tokenizer_config.json CHANGED
@@ -33,7 +33,7 @@
  "clean_up_tokenization_spaces": false,
  "eos_token": "</s>",
  "legacy": true,
- "max_length": 100,
+ "max_length": 1024,
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "</s>",
  "sp_model_kwargs": {},
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ac335e67dd3438f94b550e8a71aeb31bdaea244a37bfd72922a3242deecac5a6
+ oid sha256:d69572a29a8cadaaef898f272ce9479c9a1b85857de82f7fc3678b5887084b34
  size 4667