gemma2b-summarize-gpt4o-256k / train_results.json
{
"epoch": 9.974380871050384,
"total_flos": 3.4809256003093135e+18,
"train_loss": 0.9919237802289936,
"train_runtime": 34991.5416,
"train_samples": 258442,
"train_samples_per_second": 8.027,
"train_steps_per_second": 0.083
}
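A few secondary figures can be derived from these metrics. As a minimal sketch (the inline JSON literal simply repeats the metrics above; the derived values are approximations from the reported rates):

```python
import json

# The metrics from train_results.json, inlined for a self-contained example.
metrics = json.loads("""{
  "epoch": 9.974380871050384,
  "total_flos": 3.4809256003093135e+18,
  "train_loss": 0.9919237802289936,
  "train_runtime": 34991.5416,
  "train_samples": 258442,
  "train_samples_per_second": 8.027,
  "train_steps_per_second": 0.083
}""")

# Approximate total optimizer steps: steps/sec * wall-clock seconds.
total_steps = metrics["train_steps_per_second"] * metrics["train_runtime"]

# Wall-clock training time in hours.
hours = metrics["train_runtime"] / 3600

print(round(total_steps))  # ≈ 2904 steps
print(round(hours, 2))     # ≈ 9.72 hours
```

Note these are back-of-the-envelope values: `train_steps_per_second` is reported rounded, so the step count is only approximate.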