Temo27Anas committed on
Commit 2093040 · verified · 1 Parent(s): 7633ab0

Model save

Files changed (2):
  1. README.md +121 -0
  2. model.safetensors +1 -1

README.md ADDED
@@ -0,0 +1,121 @@
+ ---
+ license: cc-by-nc-4.0
+ base_model: facebook/timesformer-base-finetuned-k400
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: tsf-gs-rot-flip-wtoken-DRPT-r128-f150-8.8-h768-i3072-p32-b8-e60
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # tsf-gs-rot-flip-wtoken-DRPT-r128-f150-8.8-h768-i3072-p32-b8-e60
+
+ This model is a fine-tuned version of [facebook/timesformer-base-finetuned-k400](https://huggingface.co/facebook/timesformer-base-finetuned-k400) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.7740
+ - Accuracy: 0.8610
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.1
+ - training_steps: 6480
+ - mixed_precision_training: Native AMP
+
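With `lr_scheduler_type: linear`, `lr_scheduler_warmup_ratio: 0.1`, and `training_steps: 6480`, the learning rate ramps linearly from 0 to the peak 5e-05 over the first 648 steps, then decays linearly back to 0 at step 6480. A minimal sketch of that schedule, assuming the Trainer's default linear warmup/decay formula (the function name `lr_at_step` is illustrative, not part of the training code):

```python
# Sketch of the linear warmup + linear decay schedule implied by the
# hyperparameters above (assumption: HF Trainer's default linear schedule).
PEAK_LR = 5e-5
TOTAL_STEPS = 6480
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # warmup_ratio 0.1 -> 648 steps

def lr_at_step(step: int) -> float:
    """Learning rate at a given optimizer step."""
    if step < WARMUP_STEPS:
        # Linear ramp from 0 up to the peak learning rate.
        return PEAK_LR * step / WARMUP_STEPS
    # Linear decay from the peak back down to 0 at TOTAL_STEPS.
    return PEAK_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

print(lr_at_step(0))     # 0.0
print(lr_at_step(648))   # peak: 5e-05
print(lr_at_step(6480))  # 0.0
```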
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:-------:|:----:|:---------------:|:--------:|
+ | 1.1114 | 0.0168 | 109 | 1.0984 | 0.3797 |
+ | 1.1049 | 1.0168 | 218 | 1.1141 | 0.3262 |
+ | 1.1056 | 2.0168 | 327 | 1.1710 | 0.3262 |
+ | 1.1154 | 3.0168 | 436 | 1.1256 | 0.3369 |
+ | 1.1056 | 4.0168 | 545 | 1.0962 | 0.3529 |
+ | 1.1569 | 5.0168 | 654 | 1.1136 | 0.3262 |
+ | 1.036 | 6.0168 | 763 | 1.0213 | 0.4652 |
+ | 1.0506 | 7.0168 | 872 | 1.0488 | 0.4278 |
+ | 1.042 | 8.0168 | 981 | 0.9366 | 0.5561 |
+ | 0.8999 | 9.0168 | 1090 | 0.8098 | 0.6310 |
+ | 0.9432 | 10.0168 | 1199 | 0.9513 | 0.5936 |
+ | 0.7898 | 11.0168 | 1308 | 0.5836 | 0.8021 |
+ | 0.7952 | 12.0168 | 1417 | 0.5680 | 0.7647 |
+ | 0.6641 | 13.0168 | 1526 | 0.6147 | 0.7861 |
+ | 0.6901 | 14.0168 | 1635 | 0.5688 | 0.7754 |
+ | 0.4637 | 15.0168 | 1744 | 0.5834 | 0.7914 |
+ | 0.5898 | 16.0168 | 1853 | 0.6636 | 0.7326 |
+ | 0.7036 | 17.0168 | 1962 | 0.7142 | 0.7433 |
+ | 0.3946 | 18.0168 | 2071 | 0.4866 | 0.8342 |
+ | 0.5379 | 19.0168 | 2180 | 0.6641 | 0.7701 |
+ | 0.5869 | 20.0168 | 2289 | 0.4817 | 0.8289 |
+ | 0.4564 | 21.0168 | 2398 | 0.4909 | 0.8396 |
+ | 0.419 | 22.0168 | 2507 | 0.5006 | 0.8235 |
+ | 0.4989 | 23.0168 | 2616 | 0.5648 | 0.8182 |
+ | 0.2701 | 24.0168 | 2725 | 0.5963 | 0.8342 |
+ | 0.5191 | 25.0168 | 2834 | 0.5766 | 0.7914 |
+ | 0.5088 | 26.0168 | 2943 | 0.4679 | 0.8610 |
+ | 0.3828 | 27.0168 | 3052 | 0.5231 | 0.8503 |
+ | 0.4228 | 28.0168 | 3161 | 0.6142 | 0.8235 |
+ | 0.5544 | 29.0168 | 3270 | 0.6508 | 0.8289 |
+ | 0.3595 | 30.0168 | 3379 | 0.6572 | 0.7914 |
+ | 0.3117 | 31.0168 | 3488 | 0.5587 | 0.8342 |
+ | 0.3324 | 32.0168 | 3597 | 0.5021 | 0.8610 |
+ | 0.3282 | 33.0168 | 3706 | 0.7642 | 0.8235 |
+ | 0.427 | 34.0168 | 3815 | 0.5739 | 0.8663 |
+ | 0.152 | 35.0168 | 3924 | 0.6957 | 0.8610 |
+ | 0.426 | 36.0168 | 4033 | 0.6705 | 0.8342 |
+ | 0.2803 | 37.0168 | 4142 | 0.5854 | 0.8449 |
+ | 0.3198 | 38.0168 | 4251 | 0.5280 | 0.8449 |
+ | 0.4348 | 39.0168 | 4360 | 0.7755 | 0.8128 |
+ | 0.1915 | 40.0168 | 4469 | 0.6813 | 0.8503 |
+ | 0.0793 | 41.0168 | 4578 | 0.7260 | 0.8503 |
+ | 0.3902 | 42.0168 | 4687 | 0.6581 | 0.8663 |
+ | 0.3552 | 43.0168 | 4796 | 0.5732 | 0.8610 |
+ | 0.3091 | 44.0168 | 4905 | 0.7510 | 0.8396 |
+ | 0.1103 | 45.0168 | 5014 | 0.7604 | 0.8503 |
+ | 0.3362 | 46.0168 | 5123 | 0.7156 | 0.8556 |
+ | 0.1935 | 47.0168 | 5232 | 0.6882 | 0.8503 |
+ | 0.0889 | 48.0168 | 5341 | 0.7639 | 0.8663 |
+ | 0.2156 | 49.0168 | 5450 | 0.8250 | 0.8610 |
+ | 0.0949 | 50.0168 | 5559 | 0.8256 | 0.8556 |
+ | 0.1735 | 51.0168 | 5668 | 0.6839 | 0.8770 |
+ | 0.1612 | 52.0168 | 5777 | 0.9082 | 0.8663 |
+ | 0.0556 | 53.0168 | 5886 | 0.7659 | 0.8770 |
+ | 0.0997 | 54.0168 | 5995 | 0.8025 | 0.8824 |
+ | 0.1648 | 55.0168 | 6104 | 0.8449 | 0.8770 |
+ | 0.1443 | 56.0168 | 6213 | 0.7801 | 0.8770 |
+ | 0.0405 | 57.0168 | 6322 | 0.8647 | 0.8663 |
+ | 0.3323 | 58.0168 | 6431 | 0.8411 | 0.8503 |
+ | 0.1137 | 59.0076 | 6480 | 0.7740 | 0.8610 |
+
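Note that the headline numbers (loss 0.7740, accuracy 0.8610) are from the final step, not the best checkpoint: by validation accuracy the strongest row in the table is step 5995 (accuracy 0.8824). A small sketch of scanning such an eval log for the best entry, using a few (step, validation loss, accuracy) triples copied from the table above (abridged; the `history` list and tie-breaking rule are illustrative, not part of the training code):

```python
# Sketch: pick the "best" checkpoint from a Trainer eval log.
# (step, validation_loss, accuracy) triples copied from the table above.
history = [
    (5995, 0.8025, 0.8824),
    (6104, 0.8449, 0.8770),
    (6480, 0.7740, 0.8610),
]

# Highest accuracy wins; lower validation loss breaks ties.
best = max(history, key=lambda row: (row[2], -row[1]))
print(best)  # (5995, 0.8025, 0.8824)
```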
+
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - Pytorch 1.13.0+cu117
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:99f4873a6d6753662e26323988328be0b625cbd7101d331deaf170f094e20c29
+ oid sha256:09edfffc474217a7ce618f01268584d6e2c07da61883df9ef73ef07f500aec3a
  size 325158412