---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- wer
model-index:
- name: model_broadclass_onSet0.1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# model_broadclass_onSet0.1

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1129
- 0 Precision: 1.0
- 0 Recall: 1.0
- 0 F1-score: 1.0
- 0 Support: 31
- 1 Precision: 0.9259
- 1 Recall: 1.0
- 1 F1-score: 0.9615
- 1 Support: 25
- 2 Precision: 1.0
- 2 Recall: 0.9259
- 2 F1-score: 0.9615
- 2 Support: 27
- 3 Precision: 1.0
- 3 Recall: 1.0
- 3 F1-score: 1.0
- 3 Support: 15
- Accuracy: 0.9796
- Macro avg Precision: 0.9815
- Macro avg Recall: 0.9815
- Macro avg F1-score: 0.9808
- Macro avg Support: 98
- Weighted avg Precision: 0.9811
- Weighted avg Recall: 0.9796
- Weighted avg F1-score: 0.9796
- Weighted avg Support: 98
- Wer: 0.0859
- Confusion matrix (rows = true class, columns = predicted class, class order 0-3): [[31, 0, 0, 0], [0, 25, 0, 0], [0, 2, 25, 0], [0, 0, 0, 15]]
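
The reported WER alongside the per-class scores suggests the checkpoint decodes audio into a sequence of broad phonetic classes via a CTC head. Below is a minimal inference sketch under that assumption; the repository id, the use of `Wav2Vec2ForCTC`/`Wav2Vec2Processor`, and the 16 kHz input rate are assumptions inherited from the base XLSR-53 model rather than facts stated in this card.

```python
# Minimal inference sketch. Assumptions (not confirmed by this card):
# - the checkpoint ships a CTC head and a processor
# - audio is expected at 16 kHz, as for facebook/wav2vec2-large-xlsr-53
# - "model_broadclass_onSet0.1" is a placeholder for wherever this checkpoint is hosted
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "model_broadclass_onSet0.1"  # placeholder repository id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an utterance and resample it to the assumed 16 kHz sampling rate.
waveform, sample_rate = torchaudio.load("utterance.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# Run the model and greedily decode the predicted broad-class sequence.
inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```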

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 80
- mixed_precision_training: Native AMP
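
For reference, these values correspond roughly to the `transformers.TrainingArguments` sketch below. This is an assumption about how the run was configured (use of the `Trainer` API, placeholder `output_dir`); the Adam betas and epsilon listed above are the library defaults.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# Assumptions: the Hugging Face Trainer was used; output_dir is a placeholder;
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, so they are not set here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="model_broadclass_onSet0.1",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=80,
    fp16=True,                       # "Native AMP" mixed precision
)
```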

### Training results

| Training Loss | Epoch | Step | Validation Loss | 0 Precision | 0 Recall | 0 F1-score | 0 Support | 1 Precision | 1 Recall | 1 F1-score | 1 Support | 2 Precision | 2 Recall | 2 F1-score | 2 Support | 3 Precision | 3 Recall | 3 F1-score | 3 Support | Accuracy | Macro avg Precision | Macro avg Recall | Macro avg F1-score | Macro avg Support | Weighted avg Precision | Weighted avg Recall | Weighted avg F1-score | Weighted avg Support | Wer    | Confusion Matrix (first row = column labels, first entry of each row = true class)     |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|:--------:|:----------:|:---------:|:-----------:|:--------:|:----------:|:---------:|:-----------:|:--------:|:----------:|:---------:|:-----------:|:--------:|:----------:|:---------:|:--------:|:-------------------:|:----------------:|:------------------:|:-----------------:|:----------------------:|:-------------------:|:---------------------:|:--------------------:|:------:|:---------------------------------------------------------------------------------------:|
| 2.343         | 4.16  | 100  | 2.2083          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 2.2769        | 8.33  | 200  | 2.1649          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 1.9687        | 12.49 | 300  | 1.8723          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 1.8046        | 16.65 | 400  | 1.6982          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 1.5645        | 20.82 | 500  | 1.5862          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 1.5322        | 24.98 | 600  | 1.5736          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 1.5468        | 29.16 | 700  | 1.4736          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 1.0542        | 33.33 | 800  | 1.0068          | 0.3163      | 1.0      | 0.4806     | 31        | 0.0         | 0.0      | 0.0        | 25        | 0.0         | 0.0      | 0.0        | 27        | 0.0         | 0.0      | 0.0        | 15        | 0.3163   | 0.0791              | 0.25             | 0.1202             | 98                | 0.1001                 | 0.3163              | 0.1520                | 98                   | 0.9847 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 25, 0, 0, 0], [2, 27, 0, 0, 0], [3, 15, 0, 0, 0]]  |
| 0.9664        | 37.49 | 900  | 0.9831          | 0.3483      | 1.0      | 0.5167     | 31        | 1.0         | 0.12     | 0.2143     | 25        | 1.0         | 0.0370   | 0.0714     | 27        | 0.8         | 0.2667   | 0.4        | 15        | 0.3980   | 0.7871              | 0.3559           | 0.3006             | 98                | 0.7632                 | 0.3980              | 0.2990                | 98                   | 0.9758 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 21, 3, 0, 1], [2, 26, 0, 1, 0], [3, 11, 0, 0, 4]]  |
| 0.9405        | 41.65 | 1000 | 0.9402          | 0.3827      | 1.0      | 0.5536     | 31        | 1.0         | 0.04     | 0.0769     | 25        | 1.0         | 0.4815   | 0.65       | 27        | 1.0         | 0.2      | 0.3333     | 15        | 0.4898   | 0.8457              | 0.4304           | 0.4035             | 98                | 0.8047                 | 0.4898              | 0.4248                | 98                   | 0.9630 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 24, 1, 0, 0], [2, 14, 0, 13, 0], [3, 12, 0, 0, 3]] |
| 0.9341        | 45.82 | 1100 | 0.9330          | 0.5082      | 1.0      | 0.6739     | 31        | 0.9231      | 0.48     | 0.6316     | 25        | 1.0         | 0.6296   | 0.7727     | 27        | 0.8571      | 0.4      | 0.5455     | 15        | 0.6735   | 0.8221              | 0.6274           | 0.6559             | 98                | 0.8029                 | 0.6735              | 0.6707                | 98                   | 0.9497 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 12, 12, 0, 1], [2, 9, 1, 17, 0], [3, 9, 0, 0, 6]]  |
| 0.8769        | 49.98 | 1200 | 0.8662          | 0.6327      | 1.0      | 0.775      | 31        | 0.9565      | 0.88     | 0.9167     | 25        | 1.0         | 0.6296   | 0.7727     | 27        | 0.8889      | 0.5333   | 0.6667     | 15        | 0.7959   | 0.8695              | 0.7607           | 0.7828             | 98                | 0.8557                 | 0.7959              | 0.7939                | 98                   | 0.9442 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 2, 22, 0, 1], [2, 9, 1, 17, 0], [3, 7, 0, 0, 8]]   |
| 0.8122        | 54.16 | 1300 | 0.7951          | 0.9062      | 0.9355   | 0.9206     | 31        | 0.8519      | 0.92     | 0.8846     | 25        | 1.0         | 0.8519   | 0.92       | 27        | 0.9375      | 1.0      | 0.9677     | 15        | 0.9184   | 0.9239              | 0.9268           | 0.9232             | 98                | 0.9230                 | 0.9184              | 0.9185                | 98                   | 0.9348 | [[0, 1, 2, 3], [0, 29, 2, 0, 0], [1, 1, 23, 0, 1], [2, 2, 2, 23, 0], [3, 0, 0, 0, 15]]  |
| 0.5747        | 58.33 | 1400 | 0.4843          | 1.0         | 1.0      | 1.0        | 31        | 0.96        | 0.96     | 0.96       | 25        | 1.0         | 0.9630   | 0.9811     | 27        | 0.9375      | 1.0      | 0.9677     | 15        | 0.9796   | 0.9744              | 0.9807           | 0.9772             | 98                | 0.9802                 | 0.9796              | 0.9797                | 98                   | 0.6732 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 0, 24, 0, 1], [2, 0, 1, 26, 0], [3, 0, 0, 0, 15]]  |
| 0.2794        | 62.49 | 1500 | 0.2062          | 1.0         | 1.0      | 1.0        | 31        | 0.96        | 0.96     | 0.96       | 25        | 1.0         | 0.9630   | 0.9811     | 27        | 0.9375      | 1.0      | 0.9677     | 15        | 0.9796   | 0.9744              | 0.9807           | 0.9772             | 98                | 0.9802                 | 0.9796              | 0.9797                | 98                   | 0.2236 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 0, 24, 0, 1], [2, 0, 1, 26, 0], [3, 0, 0, 0, 15]]  |
| 0.1654        | 66.65 | 1600 | 0.1573          | 1.0         | 0.9677   | 0.9836     | 31        | 0.9259      | 1.0      | 0.9615     | 25        | 1.0         | 0.9630   | 0.9811     | 27        | 1.0         | 1.0      | 1.0        | 15        | 0.9796   | 0.9815              | 0.9827           | 0.9816             | 98                | 0.9811                 | 0.9796              | 0.9798                | 98                   | 0.1303 | [[0, 1, 2, 3], [0, 30, 1, 0, 0], [1, 0, 25, 0, 0], [2, 0, 1, 26, 0], [3, 0, 0, 0, 15]]  |
| 0.1092        | 70.82 | 1700 | 0.1451          | 1.0         | 0.9677   | 0.9836     | 31        | 0.8889      | 0.96     | 0.9231     | 25        | 1.0         | 0.9259   | 0.9615     | 27        | 0.9375      | 1.0      | 0.9677     | 15        | 0.9592   | 0.9566              | 0.9634           | 0.9590             | 98                | 0.9621                 | 0.9592              | 0.9597                | 98                   | 0.1056 | [[0, 1, 2, 3], [0, 30, 1, 0, 0], [1, 0, 24, 0, 1], [2, 0, 2, 25, 0], [3, 0, 0, 0, 15]]  |
| 0.085         | 74.98 | 1800 | 0.1126          | 1.0         | 1.0      | 1.0        | 31        | 0.9259      | 1.0      | 0.9615     | 25        | 1.0         | 0.9259   | 0.9615     | 27        | 1.0         | 1.0      | 1.0        | 15        | 0.9796   | 0.9815              | 0.9815           | 0.9808             | 98                | 0.9811                 | 0.9796              | 0.9796                | 98                   | 0.0938 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 0, 25, 0, 0], [2, 0, 2, 25, 0], [3, 0, 0, 0, 15]]  |
| 0.0824        | 79.16 | 1900 | 0.1118          | 1.0         | 1.0      | 1.0        | 31        | 0.9259      | 1.0      | 0.9615     | 25        | 1.0         | 0.9259   | 0.9615     | 27        | 1.0         | 1.0      | 1.0        | 15        | 0.9796   | 0.9815              | 0.9815           | 0.9808             | 98                | 0.9811                 | 0.9796              | 0.9796                | 98                   | 0.0859 | [[0, 1, 2, 3], [0, 31, 0, 0, 0], [1, 0, 25, 0, 0], [2, 0, 2, 25, 0], [3, 0, 0, 0, 15]]  |
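
The per-class, macro-averaged, and weighted scores above are standard classification metrics over the 98 evaluation examples. A brief sketch of how such numbers can be reproduced is shown below; the use of scikit-learn and the toy label lists are illustrative assumptions, since the card does not state how the metrics were computed.

```python
# Sketch: computing per-class / macro / weighted precision, recall, F1 and the
# confusion matrix from reference vs. predicted class labels.
# The label lists are toy examples, not data from this evaluation set.
from sklearn.metrics import classification_report, confusion_matrix

y_true = [0, 0, 1, 2, 2, 3]  # reference broad-class labels (toy)
y_pred = [0, 0, 1, 1, 2, 3]  # model predictions (toy)

print(classification_report(y_true, y_pred, digits=4))  # per-class and averaged scores
print(confusion_matrix(y_true, y_pred))                 # rows = true, columns = predicted
```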


### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2