---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: bert-reg-crossencoder-contrastive
  results: []
---

# bert-reg-crossencoder-contrastive

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0001
- MSE: 0.2717
- MAE: 0.4451
- Pearson Corr: -0.2034
- Spearman Corr: -0.1953
- Cosine Sim: 0.9027

## Model description

The author provided no description. Judging from the model name and base checkpoint alone, this appears to be a cross-encoder regression model: a `bert-base-uncased` encoder with a single-output head, fine-tuned with a contrastive objective. No further details are documented.

## Intended uses & limitations

More information needed
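
Since the intended use is undocumented, the following is a minimal inference sketch under stated assumptions: it presumes the checkpoint loads as a single-logit sequence-classification (regression) model and that, as a cross-encoder, it scores a text pair encoded jointly. The repo id and example sentences are placeholders, not from this card.

```python
# Minimal inference sketch, not from the original card. Assumptions:
# - the checkpoint loads as a single-logit regression model via
#   AutoModelForSequenceClassification;
# - as a cross-encoder, it scores a sentence pair encoded jointly.
# The repo id and example texts below are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "bert-reg-crossencoder-contrastive"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# A cross-encoder concatenates both texts into one input:
# [CLS] text_a [SEP] text_b [SEP]
inputs = tokenizer(
    "How do I reset my password?",
    "Steps to recover account access.",
    truncation=True,
    return_tensors="pt",
)

with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1).item()

print(f"predicted score: {score:.4f}")
```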

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 7
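
For readers reproducing the setup, the list above maps directly onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction: the `output_dir` is a placeholder, and the Adam betas/epsilon shown are simply the `Trainer` defaults that the list makes explicit.

```python
# Hedged reconstruction of the configuration listed above; output_dir
# is a placeholder, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-reg-crossencoder-contrastive",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=7,
    adam_beta1=0.9,     # Trainer default, as listed above
    adam_beta2=0.999,   # Trainer default, as listed above
    adam_epsilon=1e-8,  # Trainer default, as listed above
)
```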

### Training results

| Training Loss | Epoch | Step | Validation Loss | MSE    | MAE    | Pearson Corr | Spearman Corr | Cosine Sim |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------------:|:-------------:|:----------:|
| 0.0148        | 1.0   | 41   | 0.0014          | 0.2889 | 0.4614 | -0.1243      | -0.0625       | 0.9003     |
| 0.0096        | 2.0   | 82   | 0.0074          | 0.3706 | 0.5451 | -0.0433      | -0.0347       | 0.9030     |
| 0.0059        | 3.0   | 123  | 0.0001          | 0.2549 | 0.4285 | -0.0372      | -0.0585       | 0.9032     |
| 0.004         | 4.0   | 164  | 0.0023          | 0.3175 | 0.4940 | -0.0783      | -0.0715       | 0.9029     |
| 0.0026        | 5.0   | 205  | 0.0003          | 0.2770 | 0.4519 | -0.0308      | -0.0070       | 0.9033     |
| 0.0019        | 6.0   | 246  | 0.0002          | 0.2771 | 0.4512 | -0.1884      | -0.1805       | 0.9028     |
| 0.0018        | 7.0   | 287  | 0.0001          | 0.2717 | 0.4451 | -0.2034      | -0.1953       | 0.9027     |
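
The card does not include the evaluation code, but the metric columns above are straightforward to reproduce in a `Trainer` `compute_metrics` callback. The sketch below is one plausible implementation; in particular, reading "Cosine Sim" as the cosine between the prediction and label vectors is an assumption.

```python
# One plausible compute_metrics implementation; the original is not in
# this card. Treating "Cosine Sim" as the cosine between the prediction
# and label vectors is an assumption.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = np.squeeze(np.asarray(preds))
    labels = np.squeeze(np.asarray(labels))
    cosine = np.dot(preds, labels) / (
        np.linalg.norm(preds) * np.linalg.norm(labels)
    )
    return {
        "mse": float(np.mean((preds - labels) ** 2)),
        "mae": float(np.mean(np.abs(preds - labels))),
        "pearson_corr": float(pearsonr(preds, labels)[0]),
        "spearman_corr": float(spearmanr(preds, labels)[0]),
        "cosine_sim": float(cosine),
    }
```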


### Framework versions

- Transformers 4.44.2
- PyTorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1