---
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: final_V4_resized_balanced_Bert_balanced_dataset-after-adding-new-words-text-classification-model
  results: []
---


# final_V4_resized_balanced_Bert_balanced_dataset-after-adding-new-words-text-classification-model

This model is a BERT-style text-classification model (judging by its name; the base checkpoint and dataset are not recorded in the card) trained on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.7809
- Accuracy: 0.5419
- F1: 0.5
- Precision: 0.5
- Recall: 0.5

Note that these final metrics are substantially worse than the validation metrics logged during training (accuracy ≈ 0.87, F1 ≈ 0.81 at the last steps; see the table below), which suggests the final evaluation may have been run on a different split or with a different metric configuration.
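
As a sanity check, the reported F1 of 0.5 is exactly what the harmonic-mean formula gives for the reported precision and recall:

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 as the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.5, 0.5))  # 0.5, matching the reported F1
```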

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 5
- mixed_precision_training: Native AMP
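
The schedule above (linear with a 100-step warmup) ramps the learning rate from 0 to 5e-05 over the first 100 steps, then decays it linearly toward zero. A minimal sketch of that curve, assuming `total_steps=4050` (the last step logged in the table below; the card does not state the true total):

```python
def linear_schedule_lr(step: int, base_lr: float = 5e-5,
                       warmup_steps: int = 100, total_steps: int = 4050) -> float:
    """Linear warmup followed by linear decay to zero, as in
    transformers' get_linear_schedule_with_warmup."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule_lr(50))   # halfway through warmup: 2.5e-05
print(linear_schedule_lr(100))  # peak learning rate: 5e-05
```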

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.8374        | 0.06  | 50   | 1.8580          | 0.1350   | 0.0735 | 0.0910    | 0.1771 |
| 1.1067        | 0.12  | 100  | 1.7265          | 0.4003   | 0.2554 | 0.3101    | 0.3385 |
| 0.5553        | 0.18  | 150  | 0.9466          | 0.7571   | 0.5029 | 0.5522    | 0.5495 |
| 0.2362        | 0.25  | 200  | 0.8960          | 0.8595   | 0.7592 | 0.7701    | 0.7770 |
| 0.2549        | 0.31  | 250  | 0.9180          | 0.8623   | 0.7585 | 0.7737    | 0.7785 |
| 0.1716        | 0.37  | 300  | 0.9975          | 0.8646   | 0.7807 | 0.7630    | 0.8138 |
| 0.2299        | 0.43  | 350  | 0.8119          | 0.8614   | 0.7843 | 0.7481    | 0.8354 |
| 0.1657        | 0.49  | 400  | 0.9501          | 0.8657   | 0.7878 | 0.7724    | 0.8215 |
| 0.1585        | 0.55  | 450  | 1.0274          | 0.8661   | 0.7962 | 0.7708    | 0.8367 |
| 0.1856        | 0.61  | 500  | 1.0357          | 0.8675   | 0.7948 | 0.7554    | 0.8510 |
| 0.1002        | 0.67  | 550  | 1.1383          | 0.8657   | 0.7978 | 0.7673    | 0.8423 |
| 0.1505        | 0.74  | 600  | 1.0459          | 0.8678   | 0.7981 | 0.7646    | 0.8477 |
| 0.1264        | 0.8   | 650  | 0.9859          | 0.8692   | 0.8048 | 0.7781    | 0.8467 |
| 0.13          | 0.86  | 700  | 1.0246          | 0.8678   | 0.7947 | 0.7569    | 0.8496 |
| 0.1151        | 0.92  | 750  | 0.5834          | 0.8764   | 0.8223 | 0.8837    | 0.8621 |
| 0.1776        | 0.98  | 800  | 1.0297          | 0.8675   | 0.7903 | 0.7773    | 0.8202 |
| 0.0488        | 1.04  | 850  | 1.0348          | 0.8700   | 0.8038 | 0.7724    | 0.8504 |
| 0.0752        | 1.1   | 900  | 1.1004          | 0.8675   | 0.7839 | 0.7725    | 0.8136 |
| 0.0696        | 1.17  | 950  | 1.1802          | 0.8701   | 0.8020 | 0.7756    | 0.8438 |
| 0.0743        | 1.23  | 1000 | 1.1167          | 0.8695   | 0.8065 | 0.7791    | 0.8490 |
| 0.0757        | 1.29  | 1050 | 1.1188          | 0.8704   | 0.8078 | 0.7795    | 0.8516 |
| 0.0757        | 1.35  | 1100 | 0.7870          | 0.8692   | 0.8062 | 0.7816    | 0.8460 |
| 0.0847        | 1.41  | 1150 | 1.0518          | 0.8698   | 0.8045 | 0.7791    | 0.8450 |
| 0.0502        | 1.47  | 1200 | 1.1333          | 0.8689   | 0.7993 | 0.7679    | 0.8469 |
| 0.0516        | 1.53  | 1250 | 1.2185          | 0.8626   | 0.7707 | 0.7217    | 0.8429 |
| 0.0841        | 1.6   | 1300 | 1.2722          | 0.8689   | 0.8026 | 0.7798    | 0.8404 |
| 0.1063        | 1.66  | 1350 | 1.2437          | 0.8690   | 0.8008 | 0.7808    | 0.8362 |
| 0.097         | 1.72  | 1400 | 1.1243          | 0.8684   | 0.7930 | 0.7836    | 0.8201 |
| 0.0746        | 1.78  | 1450 | 1.2221          | 0.8701   | 0.8072 | 0.7801    | 0.8498 |
| 0.0726        | 1.84  | 1500 | 0.7919          | 0.8676   | 0.8076 | 0.7799    | 0.8500 |
| 0.0779        | 1.9   | 1550 | 1.1613          | 0.8704   | 0.8092 | 0.7837    | 0.8500 |
| 0.0895        | 1.96  | 1600 | 1.0377          | 0.8704   | 0.8060 | 0.7788    | 0.8485 |
| 0.0481        | 2.02  | 1650 | 1.1583          | 0.8710   | 0.8087 | 0.7810    | 0.8520 |
| 0.0266        | 2.09  | 1700 | 1.1655          | 0.8687   | 0.8020 | 0.7734    | 0.8452 |
| 0.0403        | 2.15  | 1750 | 1.2421          | 0.8707   | 0.8066 | 0.7777    | 0.8509 |
| 0.0116        | 2.21  | 1800 | 1.2306          | 0.8701   | 0.8048 | 0.7778    | 0.8474 |
| 0.0287        | 2.27  | 1850 | 1.2461          | 0.8700   | 0.8057 | 0.7784    | 0.8487 |
| 0.0197        | 2.33  | 1900 | 1.2199          | 0.8612   | 0.7937 | 0.7568    | 0.8451 |
| 0.0325        | 2.39  | 1950 | 1.3021          | 0.8703   | 0.8051 | 0.7785    | 0.8472 |
| 0.0443        | 2.45  | 2000 | 1.2395          | 0.8703   | 0.8061 | 0.7771    | 0.8503 |
| 0.0189        | 2.52  | 2050 | 1.2496          | 0.8704   | 0.8052 | 0.7812    | 0.8449 |
| 0.0056        | 2.58  | 2100 | 1.2561          | 0.8706   | 0.8073 | 0.7772    | 0.8527 |
| 0.0188        | 2.64  | 2150 | 1.2711          | 0.8706   | 0.8053 | 0.7818    | 0.8443 |
| 0.0287        | 2.7   | 2200 | 1.2728          | 0.8706   | 0.8068 | 0.7781    | 0.8504 |
| 0.0487        | 2.76  | 2250 | 1.1602          | 0.8710   | 0.8074 | 0.7802    | 0.8499 |
| 0.0409        | 2.82  | 2300 | 1.0628          | 0.8706   | 0.8061 | 0.7760    | 0.8510 |
| 0.053         | 2.88  | 2350 | 1.1891          | 0.8707   | 0.8076 | 0.7779    | 0.8526 |
| 0.0109        | 2.94  | 2400 | 1.2429          | 0.8700   | 0.8065 | 0.7811    | 0.8476 |
| 0.0392        | 3.01  | 2450 | 1.2635          | 0.8709   | 0.8058 | 0.7759    | 0.8509 |
| 0.0237        | 3.07  | 2500 | 1.2678          | 0.8703   | 0.8023 | 0.7817    | 0.8386 |
| 0.007         | 3.13  | 2550 | 1.2495          | 0.8709   | 0.8077 | 0.7812    | 0.8497 |
| 0.009         | 3.19  | 2600 | 1.2368          | 0.8715   | 0.8091 | 0.7807    | 0.8529 |
| 0.0022        | 3.25  | 2650 | 1.2436          | 0.8707   | 0.8055 | 0.7829    | 0.8435 |
| 0.0175        | 3.31  | 2700 | 1.2469          | 0.8712   | 0.8079 | 0.7818    | 0.8493 |
| 0.0037        | 3.37  | 2750 | 1.2342          | 0.8710   | 0.8068 | 0.7781    | 0.8510 |
| 0.0091        | 3.44  | 2800 | 1.2489          | 0.8701   | 0.8041 | 0.7780    | 0.8459 |
| 0.0008        | 3.5   | 2850 | 1.2252          | 0.8709   | 0.8059 | 0.7742    | 0.8532 |
| 0.005         | 3.56  | 2900 | 1.2281          | 0.8696   | 0.8030 | 0.7693    | 0.8522 |
| 0.0004        | 3.62  | 2950 | 1.2746          | 0.8704   | 0.8052 | 0.7760    | 0.8501 |
| 0.0009        | 3.68  | 3000 | 1.2903          | 0.8706   | 0.8054 | 0.7760    | 0.8504 |
| 0.001         | 3.74  | 3050 | 1.2960          | 0.8712   | 0.8060 | 0.7780    | 0.8492 |
| 0.0002        | 3.8   | 3100 | 1.3036          | 0.8712   | 0.8060 | 0.7780    | 0.8492 |
| 0.0185        | 3.87  | 3150 | 1.3224          | 0.8701   | 0.8048 | 0.7797    | 0.8453 |
| 0.0013        | 3.93  | 3200 | 1.3236          | 0.8707   | 0.8064 | 0.7780    | 0.8503 |
| 0.0003        | 3.99  | 3250 | 1.3241          | 0.8710   | 0.8068 | 0.7780    | 0.8509 |
| 0.0128        | 4.05  | 3300 | 1.3175          | 0.8704   | 0.8075 | 0.7794    | 0.8504 |
| 0.0005        | 4.11  | 3350 | 1.3160          | 0.8709   | 0.8078 | 0.7797    | 0.8508 |
| 0.0147        | 4.17  | 3400 | 1.3180          | 0.8712   | 0.8078 | 0.7803    | 0.8505 |
| 0.0064        | 4.23  | 3450 | 1.3197          | 0.8710   | 0.8076 | 0.7793    | 0.8510 |
| 0.0009        | 4.29  | 3500 | 1.3245          | 0.8712   | 0.8080 | 0.7800    | 0.8512 |
| 0.0002        | 4.36  | 3550 | 1.3336          | 0.8712   | 0.8079 | 0.7820    | 0.8491 |
| 0.0131        | 4.42  | 3600 | 1.3113          | 0.8710   | 0.8076 | 0.7794    | 0.8511 |
| 0.0003        | 4.48  | 3650 | 1.3200          | 0.8712   | 0.8079 | 0.7820    | 0.8491 |
| 0.0005        | 4.54  | 3700 | 1.3258          | 0.8712   | 0.8080 | 0.7841    | 0.8472 |
| 0.0102        | 4.6   | 3750 | 1.3177          | 0.8712   | 0.8079 | 0.7797    | 0.8512 |
| 0.0161        | 4.66  | 3800 | 1.3042          | 0.8712   | 0.8077 | 0.7794    | 0.8512 |
| 0.0178        | 4.72  | 3850 | 1.3133          | 0.8710   | 0.8076 | 0.7794    | 0.8511 |
| 0.0067        | 4.79  | 3900 | 1.3154          | 0.8709   | 0.8076 | 0.7793    | 0.8510 |
| 0.0191        | 4.85  | 3950 | 1.3187          | 0.8715   | 0.8103 | 0.7843    | 0.8516 |
| 0.0048        | 4.91  | 4000 | 1.3218          | 0.8713   | 0.8091 | 0.7842    | 0.8492 |
| 0.0046        | 4.97  | 4050 | 1.3220          | 0.8715   | 0.8103 | 0.7843    | 0.8516 |
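
When choosing a checkpoint from a log like this, it is usually better to select by validation F1 than to take the last step. A small sketch over a subset of rows from the table above (note that the step-750 row is a clear outlier relative to its neighbours, so its high F1 may reflect evaluation noise rather than a genuinely better checkpoint):

```python
# (step, validation F1) pairs copied from a few rows of the table above
log = [(750, 0.8223), (1650, 0.8087), (2600, 0.8091),
       (3950, 0.8103), (4050, 0.8103)]

# Pick the step with the highest validation F1
best_step, best_f1 = max(log, key=lambda row: row[1])
print(best_step, best_f1)  # step 750 has the highest F1 among these rows
```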


### Framework versions

- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2