calculator

This model is a fine-tuned version of microsoft/Phi-3.5-mini-instruct on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6422

Model description

More information needed

Intended uses & limitations

More information needed
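
Pending fuller documentation, the sketch below shows one way to load the adapter for inference with `peft` and `transformers`. The adapter repo id `aisuko/calculator` and the chat-style prompt are assumptions based on this card, not confirmed usage instructions.

```python
# Minimal sketch: load the LoRA adapter on top of the base model and run one prompt.
# The adapter repo id and the prompt format are assumptions; adjust to match your setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "microsoft/Phi-3.5-mini-instruct"
adapter_id = "aisuko/calculator"  # assumed adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# The training data is undocumented, so this arithmetic-style prompt is only a guess.
messages = [{"role": "user", "content": "What is 17 * 23?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```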

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 24
  • eval_batch_size: 24
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 5
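
For reference, a minimal `TrainingArguments` configuration that mirrors the values above might look like the following. The output directory is a placeholder, and settings not listed on this card (warmup, weight decay, gradient accumulation, etc.) are left at library defaults.

```python
# Sketch of a Trainer configuration mirroring the hyperparameters listed above.
# Output directory and any unlisted settings are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phi-3.5-mini-calculator",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="epoch",     # the results table reports validation loss once per epoch
    logging_strategy="epoch",
)
```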

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.7655        | 1.0   | 167  | 0.6422          |
| 0.6496        | 2.0   | 334  | 0.6717          |
| 0.6479        | 3.0   | 501  | 0.6706          |
| 0.8804        | 4.0   | 668  | 0.8621          |
| 0.8283        | 5.0   | 835  | 0.7388          |

Framework versions

  • PEFT 0.14.0
  • Transformers 4.47.0
  • Pytorch 2.3.1.post300
  • Datasets 2.2.1
  • Tokenizers 0.21.0