019-microsoft-MiniLM-finetuned-yahoo-80000_20000

This model is a fine-tuned version of microsoft/MiniLM-L12-H384-uncased, trained, per the model name, on a Yahoo Answers topic-classification split of 80,000 training and 20,000 evaluation examples. It achieves the following results on the evaluation set:

  • Loss: 0.8508
  • F1: 0.7322
  • Accuracy: 0.7357
  • Precision: 0.7318
  • Recall: 0.7357
  • System RAM used: 4.0900 GB
  • System RAM total: 83.4807 GB
  • GPU RAM allocated: 0.3934 GB
  • GPU RAM cached: 16.0508 GB
  • GPU RAM total: 39.5640 GB
  • GPU utilization: 31%
  • Disk space used: 26.4706 GB
  • Disk space total: 78.1898 GB
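For reference, the model can be loaded with the standard Transformers text-classification pipeline. This is a minimal usage sketch, assuming the repository id diogopaes10/019-microsoft-MiniLM-finetuned-yahoo-80000_20000 and that topic label names are stored in the model config; the example question is illustrative only.

```python
from transformers import pipeline

# Repository id assumed from this card; adjust if the model lives elsewhere.
clf = pipeline(
    "text-classification",
    model="diogopaes10/019-microsoft-MiniLM-finetuned-yahoo-80000_20000",
)

# Illustrative Yahoo Answers-style question; prints the predicted topic
# label and its score, e.g. [{'label': '...', 'score': 0.93}].
print(clf("What is the best way to start learning to play guitar?"))
```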

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
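The hyperparameters above map directly onto Transformers TrainingArguments. The sketch below is a hypothetical reconstruction, not the author's actual training script: output_dir and the evaluation cadence are assumptions (the 625-step cadence is inferred from the results table); the remaining values mirror the list.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the reported hyperparameters.
# output_dir and the evaluation settings are assumptions; everything
# else mirrors the list above.
training_args = TrainingArguments(
    output_dir="019-microsoft-MiniLM-finetuned-yahoo-80000_20000",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=625,  # the results table logs metrics every 625 steps
)
```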

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.5316 | 0.25 | 625 | 1.1302 | 0.6824 | 0.6928 | 0.6859 | 0.6928 | 4.1089 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 33 | 25.7180 | 78.1898 |
| 1.0615 | 0.5 | 1250 | 1.0022 | 0.7011 | 0.7049 | 0.7065 | 0.7049 | 3.8585 | 83.4807 | 0.3936 | 16.0508 | 39.5640 | 33 | 26.0913 | 78.1898 |
| 0.9804 | 0.75 | 1875 | 0.9258 | 0.7158 | 0.7191 | 0.7201 | 0.7191 | 3.8640 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 33 | 26.4646 | 78.1898 |
| 0.9244 | 1.0 | 2500 | 0.8795 | 0.7219 | 0.7286 | 0.7266 | 0.7286 | 3.8815 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 32 | 26.4649 | 78.1898 |
| 0.8471 | 1.25 | 3125 | 0.8886 | 0.7243 | 0.7305 | 0.7280 | 0.7305 | 4.0318 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 31 | 26.4653 | 78.1898 |
| 0.8294 | 1.5 | 3750 | 0.8648 | 0.7285 | 0.7303 | 0.7304 | 0.7303 | 3.8228 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 33 | 26.4656 | 78.1898 |
| 0.8229 | 1.75 | 4375 | 0.8477 | 0.7306 | 0.7347 | 0.7314 | 0.7347 | 3.8704 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 32 | 26.4658 | 78.1898 |
| 0.8227 | 2.0 | 5000 | 0.8514 | 0.7300 | 0.7321 | 0.7343 | 0.7321 | 3.8656 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 34 | 26.4661 | 78.1898 |
| 0.7515 | 2.25 | 5625 | 0.8580 | 0.7286 | 0.7327 | 0.7324 | 0.7327 | 4.0576 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 32 | 26.4664 | 78.1898 |
| 0.7523 | 2.5 | 6250 | 0.8498 | 0.7296 | 0.7340 | 0.7314 | 0.7340 | 3.8656 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 32 | 26.4666 | 78.1898 |
| 0.7396 | 2.75 | 6875 | 0.8403 | 0.7326 | 0.7365 | 0.7323 | 0.7365 | 3.8686 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 33 | 26.4669 | 78.1898 |
| 0.7308 | 3.0 | 7500 | 0.8414 | 0.7348 | 0.7378 | 0.7339 | 0.7378 | 3.8611 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 26 | 26.4671 | 78.1898 |
| 0.6929 | 3.25 | 8125 | 0.8551 | 0.7322 | 0.7350 | 0.7376 | 0.7350 | 4.0565 | 83.4807 | 0.3936 | 16.0508 | 39.5640 | 29 | 26.4680 | 78.1898 |
| 0.6772 | 3.5 | 8750 | 0.8471 | 0.7335 | 0.7380 | 0.7327 | 0.7380 | 3.8351 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 31 | 26.4684 | 78.1898 |
| 0.6820 | 3.75 | 9375 | 0.8460 | 0.7311 | 0.7350 | 0.7311 | 0.7350 | 3.8782 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 34 | 26.4686 | 78.1898 |
| 0.6741 | 4.0 | 10000 | 0.8409 | 0.7335 | 0.7376 | 0.7330 | 0.7376 | 3.8848 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 31 | 26.4690 | 78.1898 |
| 0.6247 | 4.25 | 10625 | 0.8500 | 0.7332 | 0.7360 | 0.7324 | 0.7360 | 4.0838 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 32 | 26.4694 | 78.1898 |
| 0.6446 | 4.5 | 11250 | 0.8464 | 0.7323 | 0.7358 | 0.7320 | 0.7358 | 3.8687 | 83.4807 | 0.3936 | 16.0508 | 39.5640 | 31 | 26.4697 | 78.1898 |
| 0.6355 | 4.75 | 11875 | 0.8503 | 0.7311 | 0.7349 | 0.7308 | 0.7349 | 3.8853 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 30 | 26.4700 | 78.1898 |
| 0.6396 | 5.0 | 12500 | 0.8508 | 0.7322 | 0.7357 | 0.7318 | 0.7357 | 3.8995 | 83.4807 | 0.3935 | 16.0508 | 39.5640 | 33 | 26.4704 | 78.1898 |
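The per-step metrics above are the kind produced by a compute_metrics callback passed to the Trainer. The card does not show the callback, so the following is a minimal sketch, assuming weighted averaging (consistent with F1, precision, and recall tracking accuracy so closely) and scikit-learn:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Hypothetical metrics callback; the weighted averaging mode is an assumption."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "f1": f1,
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
    }
```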

Framework versions

  • Transformers 4.31.0
  • PyTorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3