Temo27Anas committed on
Commit
dcb4dcd
1 Parent(s): 86d231f

Model save

Files changed (2)
  1. README.md +161 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,161 @@
+ ---
+ license: cc-by-nc-4.0
+ base_model: facebook/timesformer-base-finetuned-k400
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: tsf-newDS-DRPT-r224-f90-8.8-h768-i3072-p32-b8-e100
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # tsf-newDS-DRPT-r224-f90-8.8-h768-i3072-p32-b8-e100
+
+ This model is a fine-tuned version of [facebook/timesformer-base-finetuned-k400](https://huggingface.co/facebook/timesformer-base-finetuned-k400) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.4780
+ - Accuracy: 0.7478
+
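+ For a quick smoke test, the checkpoint can be loaded for video classification roughly as in the sketch below. The repo id is an assumption inferred from the model name above, and the dummy clip follows the 8-frame, 224x224 input of the base TimeSformer checkpoint; adjust both to this fine-tune's actual setup (if the repo ships no preprocessor config, load the image processor from the base model instead).
+
+ ```python
+ import numpy as np
+ import torch
+ from transformers import AutoImageProcessor, TimesformerForVideoClassification
+
+ # Assumed Hub repo id; replace with the actual path of this checkpoint.
+ model_id = "Temo27Anas/tsf-newDS-DRPT-r224-f90-8.8-h768-i3072-p32-b8-e100"
+
+ processor = AutoImageProcessor.from_pretrained(model_id)
+ model = TimesformerForVideoClassification.from_pretrained(model_id)
+ model.eval()
+
+ # Dummy clip: 8 RGB frames of 224x224 (uint8, height x width x channels).
+ # Real inputs should use the frame count and sampling from fine-tuning.
+ video = list(np.random.randint(0, 256, (8, 224, 224, 3), dtype=np.uint8))
+
+ inputs = processor(video, return_tensors="pt")
+ with torch.no_grad():
+     logits = model(**inputs).logits
+ print(model.config.id2label[logits.argmax(-1).item()])
+ ```
+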
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.1
+ - training_steps: 13100
+ - mixed_precision_training: Native AMP
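+
+ These settings map onto `transformers.TrainingArguments` roughly as in the sketch below; the `output_dir` and anything not listed above are assumptions rather than values read from the original training script.
+
+ ```python
+ from transformers import TrainingArguments
+
+ # Approximate reconstruction of the run configuration from the list above.
+ # Adam betas/epsilon match the Trainer defaults, so they are not set here.
+ training_args = TrainingArguments(
+     output_dir="tsf-newDS-DRPT-r224-f90-8.8-h768-i3072-p32-b8-e100",  # assumed
+     learning_rate=5e-05,
+     per_device_train_batch_size=8,
+     per_device_eval_batch_size=8,
+     seed=42,
+     lr_scheduler_type="linear",
+     warmup_ratio=0.1,
+     max_steps=13100,
+     fp16=True,  # "Native AMP" mixed precision
+ )
+ ```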
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:-----:|:-----:|:---------------:|:--------:|
+ | 1.1939 | 0.01 | 131 | 1.1309 | 0.3673 |
+ | 1.1355 | 1.01 | 262 | 1.1368 | 0.3230 |
+ | 1.0944 | 2.01 | 393 | 1.1001 | 0.3628 |
+ | 1.151 | 3.01 | 524 | 1.1368 | 0.3540 |
+ | 1.1259 | 4.01 | 655 | 1.1324 | 0.3230 |
+ | 1.1406 | 5.01 | 786 | 1.0984 | 0.3540 |
+ | 1.1126 | 6.01 | 917 | 1.0994 | 0.3540 |
+ | 1.1169 | 7.01 | 1048 | 1.1456 | 0.3230 |
+ | 1.1217 | 8.01 | 1179 | 1.1333 | 0.3230 |
+ | 1.1227 | 9.01 | 1310 | 1.1024 | 0.3230 |
+ | 1.1136 | 10.01 | 1441 | 1.1115 | 0.3540 |
+ | 1.0942 | 11.01 | 1572 | 1.0910 | 0.3142 |
+ | 1.1089 | 12.01 | 1703 | 1.0973 | 0.3540 |
+ | 1.1148 | 13.01 | 1834 | 1.1086 | 0.3496 |
+ | 1.1019 | 14.01 | 1965 | 1.0919 | 0.3540 |
+ | 1.1264 | 15.01 | 2096 | 1.1035 | 0.3540 |
+ | 1.1235 | 16.01 | 2227 | 1.0961 | 0.3540 |
+ | 1.1438 | 17.01 | 2358 | 1.0864 | 0.3584 |
+ | 1.1092 | 18.01 | 2489 | 1.0938 | 0.3319 |
+ | 1.111 | 19.01 | 2620 | 1.1190 | 0.3496 |
+ | 1.0871 | 20.01 | 2751 | 1.1039 | 0.3584 |
+ | 1.0632 | 21.01 | 2882 | 1.1465 | 0.3673 |
+ | 1.0743 | 22.01 | 3013 | 1.1446 | 0.3628 |
+ | 1.0811 | 23.01 | 3144 | 1.1103 | 0.3584 |
+ | 1.1378 | 24.01 | 3275 | 1.1192 | 0.3628 |
+ | 1.0274 | 25.01 | 3406 | 1.1488 | 0.3230 |
+ | 1.0446 | 26.01 | 3537 | 1.1257 | 0.3407 |
+ | 1.1225 | 27.01 | 3668 | 1.1199 | 0.3363 |
+ | 1.0504 | 28.01 | 3799 | 1.1628 | 0.3496 |
+ | 1.1138 | 29.01 | 3930 | 1.1905 | 0.3319 |
+ | 1.066 | 30.01 | 4061 | 1.1344 | 0.3407 |
+ | 1.0567 | 31.01 | 4192 | 1.1359 | 0.4027 |
+ | 1.011 | 32.01 | 4323 | 1.1819 | 0.3628 |
+ | 1.0595 | 33.01 | 4454 | 1.1846 | 0.3761 |
+ | 1.028 | 34.01 | 4585 | 1.2150 | 0.3717 |
+ | 1.045 | 35.01 | 4716 | 1.1456 | 0.3496 |
+ | 1.0459 | 36.01 | 4847 | 1.0731 | 0.4646 |
+ | 1.0581 | 37.01 | 4978 | 1.2463 | 0.4292 |
+ | 0.9436 | 38.01 | 5109 | 1.1388 | 0.4425 |
+ | 0.9794 | 39.01 | 5240 | 1.1613 | 0.4513 |
+ | 0.8882 | 40.01 | 5371 | 1.1544 | 0.4381 |
+ | 1.0316 | 41.01 | 5502 | 1.0461 | 0.4779 |
+ | 0.8349 | 42.01 | 5633 | 1.0396 | 0.5088 |
+ | 0.8478 | 43.01 | 5764 | 1.0630 | 0.5442 |
+ | 0.8072 | 44.01 | 5895 | 1.1215 | 0.5177 |
+ | 0.7213 | 45.01 | 6026 | 1.1616 | 0.6018 |
+ | 0.7108 | 46.01 | 6157 | 1.1122 | 0.6106 |
+ | 0.6225 | 47.01 | 6288 | 1.1400 | 0.6106 |
+ | 0.5557 | 48.01 | 6419 | 0.9576 | 0.6283 |
+ | 0.4944 | 49.01 | 6550 | 1.3350 | 0.5487 |
+ | 0.7068 | 50.01 | 6681 | 0.9125 | 0.6504 |
+ | 0.5947 | 51.01 | 6812 | 2.0044 | 0.4956 |
+ | 0.645 | 52.01 | 6943 | 1.1295 | 0.5796 |
+ | 0.4251 | 53.01 | 7074 | 1.7297 | 0.5 |
+ | 0.573 | 54.01 | 7205 | 0.9968 | 0.6372 |
+ | 0.4283 | 55.01 | 7336 | 1.1135 | 0.6195 |
+ | 0.6225 | 56.01 | 7467 | 0.8792 | 0.7212 |
+ | 0.3876 | 57.01 | 7598 | 1.3363 | 0.6150 |
+ | 0.4729 | 58.01 | 7729 | 1.2033 | 0.6460 |
+ | 0.4922 | 59.01 | 7860 | 1.0137 | 0.6593 |
+ | 0.3925 | 60.01 | 7991 | 1.5002 | 0.6106 |
+ | 0.4234 | 61.01 | 8122 | 1.3914 | 0.6018 |
+ | 0.3847 | 62.01 | 8253 | 1.2090 | 0.6460 |
+ | 0.3739 | 63.01 | 8384 | 1.1537 | 0.6549 |
+ | 0.4808 | 64.01 | 8515 | 1.0365 | 0.7124 |
+ | 0.2926 | 65.01 | 8646 | 1.2063 | 0.6814 |
+ | 0.5116 | 66.01 | 8777 | 0.9150 | 0.7301 |
+ | 0.34 | 67.01 | 8908 | 1.1562 | 0.6903 |
+ | 0.452 | 68.01 | 9039 | 1.2344 | 0.6947 |
+ | 0.2936 | 69.01 | 9170 | 2.3964 | 0.5088 |
+ | 0.3911 | 70.01 | 9301 | 1.4071 | 0.6327 |
+ | 0.19 | 71.01 | 9432 | 1.3819 | 0.6991 |
+ | 0.3191 | 72.01 | 9563 | 1.7279 | 0.6460 |
+ | 0.2172 | 73.01 | 9694 | 1.2274 | 0.7257 |
+ | 0.2871 | 74.01 | 9825 | 1.4077 | 0.6947 |
+ | 0.3536 | 75.01 | 9956 | 1.2094 | 0.7301 |
+ | 0.2616 | 76.01 | 10087 | 1.7737 | 0.6372 |
+ | 0.3808 | 77.01 | 10218 | 1.7553 | 0.6549 |
+ | 0.3956 | 78.01 | 10349 | 1.3767 | 0.7035 |
+ | 0.2217 | 79.01 | 10480 | 1.2784 | 0.7035 |
+ | 0.3449 | 80.01 | 10611 | 1.0742 | 0.7611 |
+ | 0.3193 | 81.01 | 10742 | 1.1135 | 0.7566 |
+ | 0.3241 | 82.01 | 10873 | 1.3711 | 0.7345 |
+ | 0.1948 | 83.01 | 11004 | 1.1718 | 0.7389 |
+ | 0.4882 | 84.01 | 11135 | 1.1333 | 0.7655 |
+ | 0.3604 | 85.01 | 11266 | 1.1587 | 0.7566 |
+ | 0.3536 | 86.01 | 11397 | 1.4604 | 0.6947 |
+ | 0.3896 | 87.01 | 11528 | 1.7899 | 0.6770 |
+ | 0.2398 | 88.01 | 11659 | 1.3172 | 0.7566 |
+ | 0.252 | 89.01 | 11790 | 1.7039 | 0.6858 |
+ | 0.1858 | 90.01 | 11921 | 2.2136 | 0.6195 |
+ | 0.2268 | 91.01 | 12052 | 1.4825 | 0.6991 |
+ | 0.2984 | 92.01 | 12183 | 1.5829 | 0.6858 |
+ | 0.1323 | 93.01 | 12314 | 1.5580 | 0.6947 |
+ | 0.3251 | 94.01 | 12445 | 1.4773 | 0.7522 |
+ | 0.1103 | 95.01 | 12576 | 1.7728 | 0.6460 |
+ | 0.2054 | 96.01 | 12707 | 1.6074 | 0.6681 |
+ | 0.2131 | 97.01 | 12838 | 1.9007 | 0.6770 |
+ | 0.0364 | 98.01 | 12969 | 1.5574 | 0.6947 |
+ | 0.1295 | 99.01 | 13100 | 1.4780 | 0.7478 |
+
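+ The Accuracy column above is the Trainer's evaluation metric; for cards generated from a Trainer run it typically comes from a `compute_metrics` callback along the lines of the sketch below (the exact function used for this run is not part of this commit, so treat it as an illustration).
+
+ ```python
+ import numpy as np
+ import evaluate
+
+ # Typical accuracy computation for a classification Trainer run.
+ accuracy = evaluate.load("accuracy")
+
+ def compute_metrics(eval_pred):
+     logits, labels = eval_pred
+     predictions = np.argmax(logits, axis=-1)
+     return accuracy.compute(predictions=predictions, references=labels)
+ ```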
+
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - Pytorch 1.13.0+cu117
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c95979299fa6504564125973c10066398c301370513c836c026c5b02763c8faf
+ oid sha256:f75259274b1732fa2144a1e9f8698e6e03f6006a7d2cbe1d0c1c2ad8edcb68f5
  size 324974092