017-microsoft-MiniLM-finetuned-yahoo-800_200

This model is a fine-tuned version of microsoft/MiniLM-L12-H384-uncased. The dataset was not recorded in the training metadata (hence "None"); judging by the model name, it was a Yahoo topic-classification set with an 800-example train / 200-example evaluation split. It achieves the following results on the evaluation set:

  • Loss: 1.4048
  • F1: 0.6237
  • Accuracy: 0.63
  • Precision: 0.6273
  • Recall: 0.63
  • System RAM used: 3.8778 GB
  • System RAM total: 83.4807 GB
  • GPU RAM allocated: 0.3903 GB
  • GPU RAM cached: 12.8340 GB
  • GPU RAM total: 39.5640 GB
  • GPU utilization: 32%
  • Disk space used: 25.4337 GB
  • Disk space total: 78.1898 GB
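A minimal usage sketch with the standard transformers sequence-classification API (the input text is illustrative, and the printed label depends on whether a human-readable id2label mapping was saved in the config):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

model_id = "diogopaes10/017-microsoft-MiniLM-finetuned-yahoo-800_200"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Illustrative input; any short question/text works for topic classification.
text = "What is the best way to start learning to play guitar?"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
# Prints "LABEL_<n>" if readable labels were not stored in the config.
print(model.config.id2label[pred])
```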

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
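As a purely hypothetical reconstruction: the model name suggests the Hugging Face yahoo_answers_topics dataset with an 800/200 split. The dataset id, split sizes, and shuffling seed below are all assumptions, not recorded facts:

```python
from datasets import load_dataset

# Hypothetical: dataset id and split sizes inferred from the model name only.
dataset = load_dataset("yahoo_answers_topics")
train_ds = dataset["train"].shuffle(seed=42).select(range(800))
eval_ds = dataset["test"].shuffle(seed=42).select(range(200))
```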

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 25
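A minimal reconstruction of these settings as transformers TrainingArguments; the output_dir and anything not listed above are assumptions, and the recorded Adam betas and epsilon are simply the library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="017-microsoft-MiniLM-finetuned-yahoo-800_200",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the defaults, so nothing to set.
)
```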

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.3021 | 1.28 | 32 | 2.2975 | 0.0519 | 0.12 | 0.1102 | 0.12 | 3.8424 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 29 | 24.5606 | 78.1898 |
| 2.2615 | 2.56 | 64 | 2.1926 | 0.2339 | 0.31 | 0.4649 | 0.31 | 3.8514 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 30 | 24.5606 | 78.1898 |
| 2.0677 | 3.84 | 96 | 1.9658 | 0.4301 | 0.51 | 0.3950 | 0.51 | 3.8537 | 83.4807 | 0.3905 | 12.8340 | 39.5640 | 22 | 24.5606 | 78.1898 |
| 1.8562 | 5.12 | 128 | 1.8383 | 0.4655 | 0.545 | 0.4587 | 0.545 | 3.8574 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 41 | 24.5606 | 78.1898 |
| 1.6929 | 6.4 | 160 | 1.7403 | 0.4942 | 0.555 | 0.5261 | 0.555 | 3.8549 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 29 | 24.5607 | 78.1898 |
| 1.5569 | 7.68 | 192 | 1.6663 | 0.5467 | 0.585 | 0.6496 | 0.585 | 3.8549 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 37 | 24.5607 | 78.1898 |
| 1.4636 | 8.96 | 224 | 1.6123 | 0.5475 | 0.58 | 0.5539 | 0.58 | 3.8539 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 30 | 24.5607 | 78.1898 |
| 1.3683 | 10.24 | 256 | 1.5615 | 0.5829 | 0.595 | 0.6016 | 0.595 | 3.8527 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 41 | 24.5607 | 78.1898 |
| 1.2649 | 11.52 | 288 | 1.5261 | 0.5904 | 0.61 | 0.6243 | 0.61 | 3.8646 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 30 | 24.5607 | 78.1898 |
| 1.1968 | 12.8 | 320 | 1.4976 | 0.6012 | 0.615 | 0.6070 | 0.615 | 3.8766 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 45 | 24.5607 | 78.1898 |
| 1.1291 | 14.08 | 352 | 1.4756 | 0.5983 | 0.615 | 0.6164 | 0.615 | 3.8749 | 83.4807 | 0.3905 | 12.8340 | 39.5640 | 47 | 24.5607 | 78.1898 |
| 1.0673 | 15.36 | 384 | 1.4660 | 0.6064 | 0.62 | 0.6258 | 0.62 | 3.8752 | 83.4807 | 0.3907 | 12.8340 | 39.5640 | 35 | 24.5607 | 78.1898 |
| 0.9884 | 16.64 | 416 | 1.4410 | 0.6135 | 0.625 | 0.6204 | 0.625 | 3.8757 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 33 | 24.5608 | 78.1898 |
| 0.9743 | 17.92 | 448 | 1.4328 | 0.6233 | 0.635 | 0.6343 | 0.635 | 3.8747 | 83.4807 | 0.3905 | 12.8340 | 39.5640 | 44 | 24.5608 | 78.1898 |
| 0.926 | 19.2 | 480 | 1.4344 | 0.6088 | 0.615 | 0.6238 | 0.615 | 3.8742 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 31 | 24.5608 | 78.1898 |
| 0.8815 | 20.48 | 512 | 1.4282 | 0.6235 | 0.625 | 0.6350 | 0.625 | 4.0591 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 43 | 25.4337 | 78.1898 |
| 0.8613 | 21.76 | 544 | 1.4146 | 0.6329 | 0.635 | 0.6408 | 0.635 | 4.0655 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 26 | 25.4337 | 78.1898 |
| 0.8466 | 23.04 | 576 | 1.4086 | 0.6318 | 0.635 | 0.6415 | 0.635 | 4.0544 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 35 | 25.4337 | 78.1898 |
| 0.8282 | 24.32 | 608 | 1.4058 | 0.6243 | 0.63 | 0.6319 | 0.63 | 3.8886 | 83.4807 | 0.3904 | 12.8340 | 39.5640 | 27 | 25.4337 | 78.1898 |
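Recall equals accuracy at every checkpoint above, which is consistent with weighted averaging over classes. A sketch of a compute_metrics function that would produce the classification columns (the function name and weighted averaging are assumptions; the resource columns would come from separate system monitoring such as psutil/NVML, not shown here):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging makes recall coincide with accuracy, as in the table.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "f1": f1,
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
    }
```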

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3
