---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/BartIndo2Bali
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# pijarcandra22/BartIndo2Bali

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an undocumented dataset, presumably an Indonesian-to-Balinese parallel corpus given the model name.
It achieves the following results on the evaluation set:
- Train Loss: 0.3826
- Validation Loss: 2.3379
- Epoch: 63
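
Based on the repository name, the model appears to target Indonesian-to-Balinese translation. A minimal, hypothetical inference sketch follows; the `translate` helper and the beam-search settings are assumptions for illustration, not part of the documented training setup. Weights are downloaded from the Hub on first call, and `transformers` plus `tensorflow` must be installed.

```python
# Hypothetical usage sketch for pijarcandra22/BartIndo2Bali (assumed task:
# Indonesian -> Balinese translation). Requires `transformers` and `tensorflow`.

MODEL_ID = "pijarcandra22/BartIndo2Bali"

def translate(text: str, max_length: int = 128) -> str:
    """Translate an Indonesian sentence to Balinese (assumed task)."""
    # Imports are local so this helper can be defined without TensorFlow installed.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="tf")
    output_ids = model.generate(inputs.input_ids, max_length=max_length, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For example, `translate("Selamat pagi")` should return the model's Balinese rendering of the input.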

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 2e-05, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, weight_decay_rate: 0.01, decay: 0.0, amsgrad: False)
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 4.3767     | 3.6194          | 0     |
| 3.5364     | 3.1996          | 1     |
| 3.1525     | 2.9458          | 2     |
| 2.8777     | 2.8118          | 3     |
| 2.6993     | 2.6979          | 4     |
| 2.5550     | 2.6071          | 5     |
| 2.4536     | 2.5362          | 6     |
| 2.3338     | 2.4572          | 7     |
| 2.2394     | 2.3878          | 8     |
| 2.1466     | 2.3692          | 9     |
| 2.0795     | 2.3189          | 10    |
| 2.0061     | 2.2674          | 11    |
| 1.9321     | 2.2393          | 12    |
| 1.8837     | 2.2181          | 13    |
| 1.8224     | 2.2002          | 14    |
| 1.7626     | 2.1671          | 15    |
| 1.7251     | 2.1386          | 16    |
| 1.6624     | 2.1245          | 17    |
| 1.6191     | 2.1134          | 18    |
| 1.6177     | 2.1061          | 19    |
| 1.5524     | 2.0845          | 20    |
| 1.4965     | 2.0750          | 21    |
| 1.4618     | 2.0527          | 22    |
| 1.4188     | 2.0584          | 23    |
| 1.3774     | 2.0359          | 24    |
| 1.3469     | 2.0567          | 25    |
| 1.3113     | 2.0295          | 26    |
| 1.2791     | 2.0134          | 27    |
| 1.2436     | 2.0431          | 28    |
| 1.1915     | 2.0201          | 29    |
| 1.1815     | 2.0283          | 30    |
| 1.1314     | 2.0230          | 31    |
| 1.1071     | 2.0424          | 32    |
| 1.0781     | 2.0357          | 33    |
| 1.0429     | 2.0208          | 34    |
| 1.0134     | 2.0458          | 35    |
| 0.9799     | 2.0466          | 36    |
| 0.9567     | 2.0592          | 37    |
| 0.9261     | 2.0278          | 38    |
| 0.8931     | 2.0641          | 39    |
| 0.8742     | 2.0783          | 40    |
| 0.8397     | 2.0781          | 41    |
| 0.8228     | 2.1010          | 42    |
| 0.7819     | 2.1042          | 43    |
| 0.7667     | 2.1302          | 44    |
| 0.7508     | 2.1193          | 45    |
| 0.7136     | 2.1372          | 46    |
| 0.6849     | 2.1513          | 47    |
| 0.6625     | 2.1747          | 48    |
| 0.6451     | 2.1936          | 49    |
| 0.6114     | 2.1650          | 50    |
| 0.5907     | 2.2176          | 51    |
| 0.5781     | 2.2313          | 52    |
| 0.5594     | 2.2287          | 53    |
| 0.5361     | 2.2260          | 54    |
| 0.5168     | 2.2444          | 55    |
| 0.5022     | 2.2660          | 56    |
| 0.4826     | 2.2912          | 57    |
| 0.4607     | 2.2922          | 58    |
| 0.4442     | 2.2912          | 59    |
| 0.4262     | 2.3032          | 60    |
| 0.4050     | 2.3335          | 61    |
| 0.4005     | 2.3327          | 62    |
| 0.3826     | 2.3379          | 63    |

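
Note that train loss falls monotonically while validation loss bottoms out near 2.01 around epoch 27 and then climbs, a classic overfitting pattern, so the final checkpoint (epoch 63) is unlikely to be the best one. A quick sketch for picking the best epoch from a validation-loss log (values excerpted from the table above):

```python
# (epoch, validation loss) pairs excerpted from the training results table.
val_loss = {24: 2.0359, 25: 2.0567, 26: 2.0295, 27: 2.0134,
            28: 2.0431, 29: 2.0201, 34: 2.0208, 63: 2.3379}

# The best checkpoint is the epoch with the lowest validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # 27 2.0134
```

In practice this is what Keras's `EarlyStopping`/`ModelCheckpoint` callbacks with `monitor="val_loss"` automate during training.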

### Framework versions

- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0