finetuned-Accident-SingleLabel-Final-v2
This model is a fine-tuned version of MCG-NJU/videomae-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.2768
- Accuracy: 0.5588
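
Below is a minimal inference sketch, assuming the checkpoint is published on the Hub as `pavitemple/finetuned-Accident-SingleLabel-Final-v2` and that clips use the VideoMAE default of 16 frames; the dummy input and frame count are placeholders, not part of this card.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Assumed Hub repo id for this fine-tuned checkpoint.
ckpt = "pavitemple/finetuned-Accident-SingleLabel-Final-v2"
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# Dummy clip: 16 RGB frames of 224x224 (replace with frames decoded from a real video).
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name.
predicted_label = model.config.id2label[logits.argmax(-1).item()]
print(predicted_label)
```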
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 16
- total_eval_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 50
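
As a rough guide, the hyperparameters above map onto the `Trainer` API roughly as in the sketch below; the output directory is hypothetical, the exact training script is not provided with this card, and anything not listed follows the Trainer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finetuned-Accident-SingleLabel-Final-v2",  # hypothetical output dir
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # 2 GPUs -> total train batch size 16
    per_device_eval_batch_size=8,    # 2 GPUs -> total eval batch size 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=50,
    # Optimizer: Adam/AdamW with betas=(0.9, 0.999) and epsilon=1e-08 (the default).
)
```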
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.08  | 4    | 1.8602          | 0.1304   |
| No log        | 1.08  | 8    | 1.6211          | 0.4783   |
| 1.7588        | 2.08  | 12   | 1.4450          | 0.5652   |
| 1.7588        | 3.08  | 16   | 1.2476          | 0.6087   |
| 1.0616        | 4.08  | 20   | 1.1461          | 0.6087   |
| 1.0616        | 5.08  | 24   | 1.0131          | 0.5217   |
| 1.0616        | 6.08  | 28   | 0.9023          | 0.6087   |
| 0.9537        | 7.08  | 32   | 0.9932          | 0.6087   |
| 0.9537        | 8.08  | 36   | 0.9547          | 0.6087   |
| 0.7395        | 9.08  | 40   | 1.0174          | 0.6087   |
| 0.7395        | 10.08 | 44   | 0.9636          | 0.6087   |
| 0.7395        | 11.08 | 48   | 0.9022          | 0.5652   |
| 0.7531        | 12.04 | 50   | 0.9311          | 0.5652   |
Framework versions
- Transformers 4.35.0
- Pytorch 2.1.0
- Datasets 2.14.6
- Tokenizers 0.14.1