gemma2b-summarize-gpt4o-4k / train_results.json
{
"epoch": 9.473684210526315,
"total_flos": 5.286360054444851e+16,
"train_loss": 1.6763220760557387,
"train_runtime": 490.2927,
"train_samples": 4038,
"train_samples_per_second": 8.933,
"train_steps_per_second": 0.184
}
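
This file follows the summary format the Hugging Face `Trainer` writes at the end of training. A minimal sketch of reading it back, assuming the file sits at `train_results.json` in the working directory (the path is an assumption, adjust as needed):

```python
import json

# Load the training summary produced at the end of the run.
with open("train_results.json") as f:
    results = json.load(f)

# Print the headline metrics; runtime is stored in seconds.
print(f"Final train loss : {results['train_loss']:.4f}")
print(f"Epochs completed : {results['epoch']:.2f}")
print(f"Runtime          : {results['train_runtime'] / 60:.1f} min")
print(f"Throughput       : {results['train_samples_per_second']:.2f} samples/s")
```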