---
library_name: transformers
license: mit
base_model: google/vivit-b-16x2-kinetics400
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: ViViT_WLASL_250_epochs
  results: []
---

# ViViT_WLASL_250_epochs

This model is a fine-tuned version of [google/vivit-b-16x2-kinetics400](https://huggingface.co/google/vivit-b-16x2-kinetics400) on the WLASL (Word-Level American Sign Language) dataset.
It achieves the following results on the evaluation set:
- Loss: 4.0544
- Top 1 Accuracy: 0.2617
- Top 5 Accuracy: 0.5577
- Top 10 Accuracy: 0.6670
- Accuracy: 0.2617
- Precision: 0.2325
- Recall: 0.2617
- F1: 0.2253
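
The Top-k Accuracy figures count a prediction as correct when the true gloss appears among the k highest-scoring classes. A minimal sketch of that computation, assuming `logits` and `labels` tensors (these names are illustrative, not taken from the original evaluation code):

```python
import torch

def top_k_accuracy(logits: torch.Tensor, labels: torch.Tensor, k: int) -> float:
    """Fraction of samples whose true label is among the k highest logits."""
    topk = logits.topk(k, dim=-1).indices              # (num_samples, k)
    hits = (topk == labels.unsqueeze(-1)).any(dim=-1)  # (num_samples,)
    return hits.float().mean().item()

# Example: Top-1 reduces to ordinary accuracy
# acc = top_k_accuracy(all_logits, all_labels, k=1)
```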

## Model description

ViViT (Video Vision Transformer) is a pure-transformer architecture for video classification that extends the Vision Transformer to video by embedding spatio-temporal tubelets of frames. This checkpoint starts from ViViT-B/16x2 pretrained on Kinetics-400 and is fine-tuned here for word-level sign language recognition.

## Intended uses & limitations

The model classifies short video clips of isolated American Sign Language signs into word-level glosses. Given the evaluation scores above (Top-1 accuracy of about 26%, Top-5 of about 56%), it is better suited as a research baseline than for production use, and it is not intended for continuous sign language translation. A minimal inference sketch follows.
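
The snippet below shows how such a checkpoint can be loaded with the standard `transformers` video-classification API. The repo id `your-username/ViViT_WLASL_250_epochs` is a placeholder, and the random array stands in for a real 32-frame clip (the frame count ViViT-B/16x2 expects).

```python
import numpy as np
import torch
from transformers import VivitImageProcessor, VivitForVideoClassification

model_id = "your-username/ViViT_WLASL_250_epochs"  # placeholder repo id
processor = VivitImageProcessor.from_pretrained(model_id)
model = VivitForVideoClassification.from_pretrained(model_id)
model.eval()

# video: list of 32 RGB frames (H, W, 3); random data stands in for a real clip
video = list(np.random.randint(0, 256, (32, 224, 224, 3), dtype=np.uint8))
inputs = processor(video, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Report the top-5 predicted glosses, mirroring the Top 5 Accuracy metric above
top5 = logits.squeeze(0).topk(5).indices.tolist()
print([model.config.id2label[i] for i in top5])
```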

## Training and evaluation data

Judging by the model name, training used WLASL (Word-Level American Sign Language), a public video dataset of isolated ASL signs. The exact subset (e.g. WLASL-100/300/1000/2000) and the train/validation split are not documented here.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 893000
- mixed_precision_training: Native AMP
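
For reproducibility, these settings map onto `transformers.TrainingArguments` roughly as follows. This is a sketch only; `output_dir` and the surrounding `Trainer` setup are assumptions, not taken from the original run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ViViT_WLASL_250_epochs",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,        # effective train batch size of 8
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=893000,
    fp16=True,                            # native AMP mixed precision
)
```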

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Top 1 Accuracy | Top 5 Accuracy | Top 10 Accuracy | Accuracy | Precision | Recall | F1     |
|:-------------:|:-------:|:-----:|:---------------:|:--------------:|:--------------:|:---------------:|:--------:|:---------:|:------:|:------:|
| 30.5598       | 0.004   | 3572  | 7.6528          | 0.0010         | 0.0038         | 0.0064          | 0.0010   | 0.0008    | 0.0010 | 0.0004 |
| 29.9841       | 1.0040  | 7144  | 7.5548          | 0.0046         | 0.0120         | 0.0176          | 0.0046   | 0.0006    | 0.0046 | 0.0009 |
| 28.2597       | 2.0040  | 10716 | 7.2959          | 0.0125         | 0.0337         | 0.0495          | 0.0125   | 0.0053    | 0.0125 | 0.0048 |
| 26.1127       | 3.0040  | 14289 | 6.9165          | 0.0304         | 0.0748         | 0.1223          | 0.0301   | 0.0108    | 0.0301 | 0.0120 |
| 23.7044       | 4.004   | 17861 | 6.4996          | 0.0447         | 0.1407         | 0.2102          | 0.0447   | 0.0182    | 0.0447 | 0.0196 |
| 20.6604       | 5.0040  | 21433 | 6.0328          | 0.0822         | 0.2288         | 0.3121          | 0.0822   | 0.0421    | 0.0822 | 0.0434 |
| 17.6287       | 6.0040  | 25005 | 5.5622          | 0.1210         | 0.3041         | 0.4213          | 0.1210   | 0.0714    | 0.1210 | 0.0742 |
| 14.3215       | 7.0040  | 28578 | 5.0794          | 0.1576         | 0.3797         | 0.4951          | 0.1573   | 0.0998    | 0.1573 | 0.1038 |
| 10.5032       | 8.004   | 32150 | 4.6439          | 0.1915         | 0.4494         | 0.5695          | 0.1915   | 0.1353    | 0.1915 | 0.1386 |
| 7.2387        | 9.0040  | 35722 | 4.2461          | 0.2247         | 0.5123         | 0.6297          | 0.2255   | 0.1676    | 0.2255 | 0.1721 |
| 3.9708        | 10.0040 | 39294 | 3.9632          | 0.2485         | 0.5587         | 0.6701          | 0.2487   | 0.2034    | 0.2487 | 0.2046 |
| 2.1244        | 11.0040 | 42867 | 3.7748          | 0.2587         | 0.5753         | 0.6872          | 0.2587   | 0.2258    | 0.2587 | 0.2220 |
| 1.3992        | 12.004  | 46439 | 3.6907          | 0.2543         | 0.5794         | 0.6885          | 0.2543   | 0.2279    | 0.2543 | 0.2210 |
| 1.0175        | 13.0040 | 50011 | 3.7060          | 0.2503         | 0.5738         | 0.6874          | 0.2503   | 0.2176    | 0.2503 | 0.2142 |
| 0.914         | 14.0040 | 53583 | 3.6819          | 0.2648         | 0.5804         | 0.6915          | 0.2648   | 0.2380    | 0.2648 | 0.2311 |
| 0.7522        | 15.0040 | 57156 | 3.7360          | 0.2561         | 0.5758         | 0.6969          | 0.2564   | 0.2325    | 0.2564 | 0.2235 |
| 1.045         | 16.004  | 60728 | 3.7846          | 0.2638         | 0.5723         | 0.6877          | 0.2635   | 0.2470    | 0.2635 | 0.2327 |
| 0.8234        | 17.0040 | 64300 | 3.8910          | 0.2574         | 0.5692         | 0.6724          | 0.2572   | 0.2386    | 0.2572 | 0.2261 |
| 0.7311        | 18.0040 | 67872 | 4.0142          | 0.2561         | 0.5585         | 0.6680          | 0.2561   | 0.2402    | 0.2561 | 0.2262 |
| 1.0981        | 19.0040 | 71445 | 4.0544          | 0.2617         | 0.5577         | 0.6670          | 0.2617   | 0.2325    | 0.2617 | 0.2253 |


### Framework versions

- Transformers 4.46.1
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1