---

license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: wav2vec2-classifier-aug-large
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-classifier-aug-large

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4814
- Accuracy: 0.8666
- Precision: 0.8790
- Recall: 0.8666
- F1: 0.8664
- Binary: 0.9089

## Model description

More information needed

## Intended uses & limitations

More information needed

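Pending a fuller description, a minimal inference sketch follows. The checkpoint path, the audio file name, and the `top_k` value are placeholders rather than details taken from this card, and the `pipeline` call needs `ffmpeg` available to decode audio files:

```python
from transformers import pipeline

# Placeholder checkpoint location: substitute the actual hub id or local
# directory where this model's weights are stored.
classifier = pipeline(
    "audio-classification",
    model="path/to/wav2vec2-classifier-aug-large",
)

# wav2vec2-base operates on 16 kHz mono audio; the pipeline resamples
# decoded files to the feature extractor's sampling rate automatically.
predictions = classifier("example.wav", top_k=5)
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```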
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

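A minimal sketch of these settings as a `TrainingArguments` configuration, with the stated base model loaded for classification. `num_labels`, `output_dir`, `eval_steps`, and `logging_steps` are assumptions, not values stated in this card (the latter two are inferred from the 50-step evaluation cadence and the step-500 first loss log in the results table below); anything not listed above keeps its Transformers default:

```python
from transformers import (
    AutoFeatureExtractor,
    AutoModelForAudioClassification,
    TrainingArguments,
)

# The card does not document the label set; num_labels is a placeholder.
feature_extractor = AutoFeatureExtractor.from_pretrained("facebook/wav2vec2-base")
model = AutoModelForAudioClassification.from_pretrained(
    "facebook/wav2vec2-base",
    num_labels=10,  # placeholder
)

# 32 per device x 4 accumulation steps = the reported total batch size of 128.
training_args = TrainingArguments(
    output_dir="wav2vec2-classifier-aug-large",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=50,      # inferred from the results table
    logging_steps=500,  # inferred: training loss first appears after step 500
)
```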


### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log        | 0.1   | 50   | 3.8510          | 0.0606   | 0.0171    | 0.0606 | 0.0152 | 0.3357 |
| No log        | 0.2   | 100  | 3.4088          | 0.0863   | 0.0528    | 0.0863 | 0.0366 | 0.3563 |
| No log        | 0.29  | 150  | 3.1643          | 0.1105   | 0.0472    | 0.1105 | 0.0408 | 0.3740 |
| No log        | 0.39  | 200  | 2.8623          | 0.2453   | 0.1221    | 0.2453 | 0.1447 | 0.4687 |
| No log        | 0.49  | 250  | 2.5829          | 0.2763   | 0.1792    | 0.2763 | 0.1862 | 0.4920 |
| No log        | 0.59  | 300  | 2.3355          | 0.3787   | 0.3219    | 0.3787 | 0.3034 | 0.5642 |
| No log        | 0.69  | 350  | 2.0865          | 0.4744   | 0.4303    | 0.4744 | 0.4021 | 0.6301 |
| No log        | 0.78  | 400  | 1.9045          | 0.5296   | 0.4587    | 0.5296 | 0.4649 | 0.6687 |
| No log        | 0.88  | 450  | 1.6752          | 0.5472   | 0.5374    | 0.5472 | 0.4900 | 0.6829 |
| No log        | 0.98  | 500  | 1.5273          | 0.6105   | 0.6019    | 0.6105 | 0.5709 | 0.7276 |
| 2.9976        | 1.08  | 550  | 1.3712          | 0.6536   | 0.6358    | 0.6536 | 0.6128 | 0.7567 |
| 2.9976        | 1.18  | 600  | 1.3239          | 0.6725   | 0.6797    | 0.6725 | 0.6389 | 0.7702 |
| 2.9976        | 1.27  | 650  | 1.1953          | 0.7170   | 0.7116    | 0.7170 | 0.6878 | 0.8024 |
| 2.9976        | 1.37  | 700  | 1.1213          | 0.7170   | 0.7116    | 0.7170 | 0.6834 | 0.8020 |
| 2.9976        | 1.47  | 750  | 1.0287          | 0.7291   | 0.7293    | 0.7291 | 0.7043 | 0.8106 |
| 2.9976        | 1.57  | 800  | 0.9258          | 0.7722   | 0.7818    | 0.7722 | 0.7572 | 0.8414 |
| 2.9976        | 1.67  | 850  | 0.8634          | 0.7722   | 0.7967    | 0.7722 | 0.7574 | 0.8415 |
| 2.9976        | 1.76  | 900  | 0.7849          | 0.7938   | 0.8226    | 0.7938 | 0.7874 | 0.8546 |
| 2.9976        | 1.86  | 950  | 0.8423          | 0.7601   | 0.7760    | 0.7601 | 0.7480 | 0.8321 |
| 2.9976        | 1.96  | 1000 | 0.7670          | 0.7830   | 0.8069    | 0.7830 | 0.7704 | 0.8488 |
| 1.686         | 2.06  | 1050 | 0.7352          | 0.7951   | 0.8024    | 0.7951 | 0.7832 | 0.8566 |
| 1.686         | 2.16  | 1100 | 0.7278          | 0.8019   | 0.8234    | 0.8019 | 0.7951 | 0.8627 |
| 1.686         | 2.25  | 1150 | 0.6867          | 0.8113   | 0.8241    | 0.8113 | 0.8059 | 0.8683 |
| 1.686         | 2.35  | 1200 | 0.6489          | 0.8167   | 0.8343    | 0.8167 | 0.8093 | 0.8722 |
| 1.686         | 2.45  | 1250 | 0.6217          | 0.8288   | 0.8454    | 0.8288 | 0.8242 | 0.8811 |
| 1.686         | 2.55  | 1300 | 0.6416          | 0.8113   | 0.8320    | 0.8113 | 0.8050 | 0.8678 |
| 1.686         | 2.65  | 1350 | 0.6517          | 0.8113   | 0.8254    | 0.8113 | 0.8055 | 0.8693 |
| 1.686         | 2.75  | 1400 | 0.6330          | 0.8140   | 0.8313    | 0.8140 | 0.8092 | 0.8710 |
| 1.686         | 2.84  | 1450 | 0.5905          | 0.8329   | 0.8575    | 0.8329 | 0.8339 | 0.8844 |
| 1.686         | 2.94  | 1500 | 0.5974          | 0.8329   | 0.8480    | 0.8329 | 0.8291 | 0.8840 |
| 1.2582        | 3.04  | 1550 | 0.6449          | 0.8235   | 0.8430    | 0.8235 | 0.8192 | 0.8774 |
| 1.2582        | 3.14  | 1600 | 0.5734          | 0.8464   | 0.8633    | 0.8464 | 0.8449 | 0.8933 |
| 1.2582        | 3.24  | 1650 | 0.5771          | 0.8450   | 0.8641    | 0.8450 | 0.8440 | 0.8910 |
| 1.2582        | 3.33  | 1700 | 0.5133          | 0.8491   | 0.8619    | 0.8491 | 0.8466 | 0.8942 |
| 1.2582        | 3.43  | 1750 | 0.5608          | 0.8437   | 0.8621    | 0.8437 | 0.8419 | 0.8906 |
| 1.2582        | 3.53  | 1800 | 0.6194          | 0.8221   | 0.8446    | 0.8221 | 0.8197 | 0.8759 |
| 1.2582        | 3.63  | 1850 | 0.5060          | 0.8410   | 0.8527    | 0.8410 | 0.8381 | 0.8899 |
| 1.2582        | 3.73  | 1900 | 0.6035          | 0.8315   | 0.8528    | 0.8315 | 0.8262 | 0.8829 |
| 1.2582        | 3.82  | 1950 | 0.5269          | 0.8396   | 0.8542    | 0.8396 | 0.8376 | 0.8891 |
| 1.2582        | 3.92  | 2000 | 0.5115          | 0.8531   | 0.8638    | 0.8531 | 0.8489 | 0.8982 |
| 1.0473        | 4.02  | 2050 | 0.5209          | 0.8518   | 0.8688    | 0.8518 | 0.8497 | 0.8969 |
| 1.0473        | 4.12  | 2100 | 0.5327          | 0.8342   | 0.8530    | 0.8342 | 0.8326 | 0.8844 |
| 1.0473        | 4.22  | 2150 | 0.4859          | 0.8544   | 0.8694    | 0.8544 | 0.8527 | 0.8980 |
| 1.0473        | 4.31  | 2200 | 0.5414          | 0.8450   | 0.8648    | 0.8450 | 0.8402 | 0.8918 |
| 1.0473        | 4.41  | 2250 | 0.5982          | 0.8383   | 0.8545    | 0.8383 | 0.8355 | 0.8871 |
| 1.0473        | 4.51  | 2300 | 0.5458          | 0.8450   | 0.8562    | 0.8450 | 0.8421 | 0.8934 |
| 1.0473        | 4.61  | 2350 | 0.5115          | 0.8625   | 0.8753    | 0.8625 | 0.8601 | 0.9042 |
| 1.0473        | 4.71  | 2400 | 0.5226          | 0.8518   | 0.8671    | 0.8518 | 0.8491 | 0.8961 |
| 1.0473        | 4.8   | 2450 | 0.5058          | 0.8679   | 0.8807    | 0.8679 | 0.8661 | 0.9082 |
| 1.0473        | 4.9   | 2500 | 0.5442          | 0.8491   | 0.8647    | 0.8491 | 0.8461 | 0.8957 |
| 1.0473        | 5.0   | 2550 | 0.4810          | 0.8693   | 0.8816    | 0.8693 | 0.8680 | 0.9089 |
| 0.9144        | 5.1   | 2600 | 0.4729          | 0.8787   | 0.8918    | 0.8787 | 0.8769 | 0.9159 |
| 0.9144        | 5.2   | 2650 | 0.4981          | 0.8585   | 0.8686    | 0.8585 | 0.8564 | 0.9019 |
| 0.9144        | 5.29  | 2700 | 0.5505          | 0.8477   | 0.8629    | 0.8477 | 0.8464 | 0.8937 |
| 0.9144        | 5.39  | 2750 | 0.4829          | 0.8706   | 0.8859    | 0.8706 | 0.8701 | 0.9111 |
| 0.9144        | 5.49  | 2800 | 0.5203          | 0.8544   | 0.8690    | 0.8544 | 0.8520 | 0.9003 |
| 0.9144        | 5.59  | 2850 | 0.4907          | 0.8585   | 0.8730    | 0.8585 | 0.8568 | 0.9027 |
| 0.9144        | 5.69  | 2900 | 0.4710          | 0.8706   | 0.8801    | 0.8706 | 0.8686 | 0.9096 |
| 0.9144        | 5.78  | 2950 | 0.5062          | 0.8504   | 0.8647    | 0.8504 | 0.8490 | 0.8965 |
| 0.9144        | 5.88  | 3000 | 0.4455          | 0.8774   | 0.8914    | 0.8774 | 0.8777 | 0.9163 |
| 0.9144        | 5.98  | 3050 | 0.5032          | 0.8544   | 0.8718    | 0.8544 | 0.8541 | 0.8985 |
| 0.8213        | 6.08  | 3100 | 0.4735          | 0.8733   | 0.8895    | 0.8733 | 0.8718 | 0.9135 |
| 0.8213        | 6.18  | 3150 | 0.4743          | 0.8693   | 0.8880    | 0.8693 | 0.8679 | 0.9102 |
| 0.8213        | 6.27  | 3200 | 0.5357          | 0.8531   | 0.8720    | 0.8531 | 0.8492 | 0.8984 |
| 0.8213        | 6.37  | 3250 | 0.4820          | 0.8625   | 0.8783    | 0.8625 | 0.8601 | 0.9059 |
| 0.8213        | 6.47  | 3300 | 0.4732          | 0.8760   | 0.8897    | 0.8760 | 0.8755 | 0.9159 |
| 0.8213        | 6.57  | 3350 | 0.4814          | 0.8666   | 0.8790    | 0.8666 | 0.8664 | 0.9089 |

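For reference, a sketch of a `compute_metrics` function consistent with the columns above. The weighted averaging is an inference from the Recall column tracking Accuracy exactly, not something this card states, and the undocumented "Binary" column is omitted:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
precision = evaluate.load("precision")
recall = evaluate.load("recall")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging makes recall coincide with accuracy, as in the table.
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "precision": precision.compute(predictions=preds, references=labels, average="weighted")["precision"],
        "recall": recall.compute(predictions=preds, references=labels, average="weighted")["recall"],
        "f1": f1.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```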

### Framework versions

- Transformers 4.38.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1