gemma2b-summarize-gpt4o-32k / train_results.json
Commit: Model save (88160fb, verified)
{
"epoch": 10.0,
"total_flos": 4.287825372721971e+17,
"train_loss": 1.1969801510850044,
"train_runtime": 3891.9077,
"train_samples": 32305,
"train_samples_per_second": 9.003,
"train_steps_per_second": 0.188
}
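
A minimal sketch of loading and inspecting these metrics, assuming the file has been downloaded locally as train_results.json (the field names match the JSON above; everything else is illustrative):

import json

# Read the Trainer-produced metrics file from disk.
with open("train_results.json") as f:
    metrics = json.load(f)

# Print a few headline numbers from the training run.
print(f"Epochs completed:      {metrics['epoch']}")
print(f"Final training loss:   {metrics['train_loss']:.4f}")
print(f"Training samples:      {metrics['train_samples']}")
print(f"Runtime (seconds):     {metrics['train_runtime']:.1f}")
print(f"Samples per second:    {metrics['train_samples_per_second']}")

For example, dividing train_samples by train_samples_per_second gives roughly the wall-clock time spent per epoch, which is consistent with the reported train_runtime over 10 epochs.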