---
library_name: transformers
license: apache-2.0
base_model: latterworks/highlightedreport-classifier-test
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: highlightedreport-classifier-test
  results: []
---


# highlightedreport-classifier-test

This model is a fine-tuned version of [latterworks/highlightedreport-classifier-test](https://huggingface.co/latterworks/highlightedreport-classifier-test) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Accuracy: 0.7840
- Loss: 0.6084
- F1: 0.7701
- Precision: 0.7694
- Recall: 0.7708
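
The reported precision, recall, and F1 are mutually consistent with binary averaging (the F1 equals the harmonic mean of the reported precision and recall). As a hedged illustration only, a typical `compute_metrics` callback that produces these fields is sketched below; the `binary` averaging mode is an inference from the numbers, not something the card confirms:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Typical Trainer metrics callback (sketch, not the verified original)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "binary" averaging is an assumption inferred from the reported values.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```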

## Model description

More information needed

## Intended uses & limitations

More information needed
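
Pending proper documentation, the sketch below shows one plausible way to run inference, assuming the checkpoint is a standard `transformers` sequence-classification model; the example input and the label semantics are placeholders, not documented behavior:

```python
from transformers import pipeline

# Assumption: the checkpoint exposes a standard text-classification head.
classifier = pipeline(
    "text-classification",
    model="latterworks/highlightedreport-classifier-test",
)

# Placeholder input; the expected input format is not documented in this card.
print(classifier("Example report text to classify."))
```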

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
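
As a non-authoritative reconstruction, these settings map onto `TrainingArguments` roughly as follows; model and dataset wiring are omitted, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Sketch reconstructed from the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="highlightedreport-classifier-test",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas/epsilon
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    fp16=True,                    # "Native AMP" mixed precision
)
```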

### Training results

| Training Loss | Epoch  | Step | Accuracy | Validation Loss | F1     | Precision | Recall |
|:-------------:|:------:|:----:|:--------:|:---------------:|:------:|:---------:|:------:|
| 0.6461        | 0.2375 | 100  | 0.7873   | 0.5155          | 0.7772 | 0.7647    | 0.7900 |
| 0.4988        | 0.4751 | 200  | 0.8014   | 0.4455          | 0.7839 | 0.8012    | 0.7673 |
| 0.4784        | 0.7126 | 300  | 0.7944   | 0.4508          | 0.7914 | 0.7554    | 0.8310 |
| 0.4827        | 0.9501 | 400  | 0.7793   | 0.4632          | 0.7836 | 0.7259    | 0.8512 |
| 0.4628        | 1.1876 | 500  | 0.7944   | 0.4459          | 0.7819 | 0.7787    | 0.7851 |
| 0.4622        | 1.4252 | 600  | 0.7907   | 0.4591          | 0.7890 | 0.7488    | 0.8338 |
| 0.4508        | 1.6627 | 700  | 0.7927   | 0.4550          | 0.7875 | 0.7590    | 0.8181 |
| 0.4572        | 1.9002 | 800  | 0.7903   | 0.4516          | 0.7639 | 0.8104    | 0.7224 |
| 0.4359        | 2.1378 | 900  | 0.7917   | 0.4772          | 0.7874 | 0.7558    | 0.8217 |
| 0.3967        | 2.3753 | 1000 | 0.7984   | 0.4567          | 0.7797 | 0.8003    | 0.7601 |
| 0.4150        | 2.6128 | 1100 | 0.7852   | 0.4792          | 0.7847 | 0.7408    | 0.8342 |
| 0.4097        | 2.8504 | 1200 | 0.7965   | 0.4661          | 0.7847 | 0.7795    | 0.7900 |
| 0.3962        | 3.0879 | 1300 | 0.7972   | 0.4655          | 0.7738 | 0.8120    | 0.7391 |
| 0.3738        | 3.3254 | 1400 | 0.7887   | 0.4740          | 0.7806 | 0.7612    | 0.8011 |
| 0.3618        | 3.5629 | 1500 | 0.7935   | 0.4706          | 0.7720 | 0.8015    | 0.7445 |
| 0.3604        | 3.8005 | 1600 | 0.7942   | 0.4779          | 0.7800 | 0.7828    | 0.7772 |
| 0.3533        | 4.0380 | 1700 | 0.7892   | 0.4899          | 0.7752 | 0.7760    | 0.7744 |
| 0.3194        | 4.2755 | 1800 | 0.7902   | 0.5034          | 0.7785 | 0.7717    | 0.7854 |
| 0.3285        | 4.5131 | 1900 | 0.7893   | 0.4958          | 0.7767 | 0.7730    | 0.7804 |
| 0.3256        | 4.7506 | 2000 | 0.7908   | 0.4952          | 0.7720 | 0.7905    | 0.7544 |
| 0.3210        | 4.9881 | 2100 | 0.7873   | 0.5050          | 0.7760 | 0.7675    | 0.7847 |
| 0.2915        | 5.2257 | 2200 | 0.7872   | 0.5167          | 0.7722 | 0.7761    | 0.7683 |
| 0.2819        | 5.4632 | 2300 | 0.7828   | 0.5344          | 0.7745 | 0.7554    | 0.7947 |
| 0.2839        | 5.7007 | 2400 | 0.7812   | 0.5529          | 0.7761 | 0.7465    | 0.8082 |
| 0.2836        | 5.9382 | 2500 | 0.7771   | 0.5433          | 0.7741 | 0.7384    | 0.8135 |
| 0.2686        | 6.1758 | 2600 | 0.7832   | 0.5545          | 0.7692 | 0.7687    | 0.7698 |
| 0.2559        | 6.4133 | 2700 | 0.7820   | 0.5578          | 0.7728 | 0.7564    | 0.7900 |
| 0.2525        | 6.6508 | 2800 | 0.7837   | 0.5647          | 0.7703 | 0.7678    | 0.7730 |
| 0.2510        | 6.8884 | 2900 | 0.7867   | 0.5588          | 0.7713 | 0.7764    | 0.7662 |
| 0.2463        | 7.1259 | 3000 | 0.7877   | 0.5754          | 0.7738 | 0.7739    | 0.7737 |
| 0.2284        | 7.3634 | 3100 | 0.7842   | 0.5907          | 0.7758 | 0.7571    | 0.7954 |
| 0.2295        | 7.6010 | 3200 | 0.7835   | 0.5832          | 0.7654 | 0.7789    | 0.7523 |
| 0.234         | 7.8385 | 3300 | 0.7807   | 0.5821          | 0.7670 | 0.7650    | 0.7690 |
| 0.2296        | 8.0760 | 3400 | 0.7850   | 0.5823          | 0.7667 | 0.7813    | 0.7527 |
| 0.2161        | 8.3135 | 3500 | 0.7837   | 0.5908          | 0.7694 | 0.7699    | 0.7690 |
| 0.2253        | 8.5511 | 3600 | 0.7857   | 0.5907          | 0.7648 | 0.7887    | 0.7423 |
| 0.2100        | 8.7886 | 3700 | 0.7835   | 0.6021          | 0.7719 | 0.7636    | 0.7804 |
| 0.2123        | 9.0261 | 3800 | 0.7840   | 0.6025          | 0.7691 | 0.7720    | 0.7662 |
| 0.1977        | 9.2637 | 3900 | 0.7827   | 0.6081          | 0.7655 | 0.7755    | 0.7559 |
| 0.2061        | 9.5012 | 4000 | 0.7838   | 0.6090          | 0.7715 | 0.7656    | 0.7776 |
| 0.2032        | 9.7387 | 4100 | 0.7850   | 0.6081          | 0.7718 | 0.7690    | 0.7747 |
| 0.2077        | 9.9762 | 4200 | 0.7838   | 0.6084          | 0.7699 | 0.7694    | 0.7705 |


### Framework versions

- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1