Gigatrue Finetunes
This model is a fine-tuned version of google/flan-t5-small on an unknown dataset. It reaches a final validation loss of 2.1229 on the evaluation set (see the training results table below).
Model description
More information needed

Intended uses & limitations
More information needed

Training and evaluation data
More information needed
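A minimal loading sketch using the standard transformers Seq2Seq API is shown below. The repository id is a placeholder, since this card does not state the fine-tuned checkpoint's actual name; substitute the id of this repository.

```python
# Minimal loading sketch. The repository id below is a placeholder (this card
# does not name the fine-tuned checkpoint); replace it with the actual model id.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "your-namespace/flan-t5-small-gigatrue"  # placeholder, assumed name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "summarize: The quick brown fox jumped over the lazy dog near the river bank."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```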
Training hyperparameters
More information needed

Training results
Validation loss was evaluated every 3000 steps over nearly three epochs; a hedged sketch of a comparable fine-tuning setup follows the table.
| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 2.5659        | 0.2030 | 3000  | 2.1724          |
| 2.4802        | 0.4059 | 6000  | 2.1500          |
| 2.4581        | 0.6089 | 9000  | 2.1390          |
| 2.4493        | 0.8119 | 12000 | 2.1325          |
| 2.4436        | 1.0148 | 15000 | 2.1282          |
| 2.4398        | 1.2178 | 18000 | 2.1255          |
| 2.437         | 1.4207 | 21000 | 2.1246          |
| 2.434         | 1.6237 | 24000 | 2.1246          |
| 2.4337        | 1.8267 | 27000 | 2.1226          |
| 2.4337        | 2.0296 | 30000 | 2.1228          |
| 2.4314        | 2.2326 | 33000 | 2.1234          |
| 2.4332        | 2.4356 | 36000 | 2.1232          |
| 2.4327        | 2.6385 | 39000 | 2.1230          |
| 2.4329        | 2.8415 | 42000 | 2.1229          |
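The table above is the kind of log produced by a Seq2SeqTrainer configured to evaluate every 3000 steps. The sketch below reconstructs such a setup under stated assumptions: the actual hyperparameters, dataset, and preprocessing are not recorded in this card, so every value here (learning rate, batch size, toy data, output path) is an assumption rather than the configuration that produced these numbers.

```python
# Hedged sketch of a Seq2SeqTrainer setup that evaluates every 3000 steps,
# mirroring the cadence of the results table. All hyperparameters and the toy
# dataset below are assumptions; the card does not record the real values.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_id = "google/flan-t5-small"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

# Toy stand-in for the unknown training/evaluation data.
raw = Dataset.from_dict({
    "source": ["summarize: The cat sat on the mat all afternoon.",
               "translate English to German: Good morning."],
    "target": ["A cat sat on a mat.", "Guten Morgen."],
})

def preprocess(batch):
    inputs = tokenizer(batch["source"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["target"], truncation=True, max_length=128)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-small-finetuned",  # assumed output path
    eval_strategy="steps",                 # evaluate on a fixed step interval
    eval_steps=3000,                       # matches the 3000-step cadence in the table
    logging_steps=3000,
    num_train_epochs=3,                    # the table ends near epoch 2.84, so ~3 epochs assumed
    learning_rate=5e-5,                    # assumed; not stated in the card
    per_device_train_batch_size=8,         # assumed; not stated in the card
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,                # a real run would use a held-out validation split
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```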