SetFit with sentence-transformers/all-mpnet-base-v2
This is a SetFit model for text classification. It uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.
The model was trained with an efficient few-shot learning technique that involves two phases (sketched below):
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
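The sketch below illustrates only the second phase, i.e. how a LogisticRegression head is fitted on embeddings from the (fine-tuned) Sentence Transformer body. It is a conceptual sketch with made-up texts and labels, not the training script used for this model.

```python
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# In the real pipeline this body has already been fine-tuned with contrastive learning.
body = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# Hypothetical few-shot training texts and integer labels.
texts = [
    "i was on hold for an hour and nobody picked up",
    "the agent resolved my billing question quickly",
    "i still have not received my refund",
    "thanks, that answered everything i needed",
]
labels = [0, 1, 2, 1]

embeddings = body.encode(texts)                      # features from the Sentence Transformer
head = LogisticRegression().fit(embeddings, labels)  # the classification head used by SetFit
```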
Model Details
Model Description
- Model Type: SetFit
- Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
- Classification head: a LogisticRegression instance
- Maximum Sequence Length: 384 tokens
- Number of Classes: 64 classes
Model Sources
- Repository: SetFit on GitHub
- Paper: Efficient Few-Shot Learning Without Prompts
- Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts
Model Labels
The labels are the integers 0 through 63; the per-label example texts are omitted here.
Evaluation
Metrics
Label | Accuracy |
---|---|
all | 0.5463 |
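The 0.5463 accuracy above is the card's reported evaluation result. A minimal sketch of recomputing a comparable accuracy on a held-out labeled split (the dataset name and column names are hypothetical):

```python
from datasets import load_dataset
from setfit import SetFitModel
from sklearn.metrics import accuracy_score

model = SetFitModel.from_pretrained("Jalajkx/all_mpnetcric-setfit-model")

# Hypothetical evaluation split with "text" and "label" columns.
eval_ds = load_dataset("your-namespace/your-eval-dataset", split="test")

preds = model.predict(eval_ds["text"])        # one predicted label per text
print(accuracy_score(eval_ds["label"], preds))  # accuracy over all labels ("all")
```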
Uses
Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("Jalajkx/all_mpnetcric-setfit-model")

# Run inference
preds = model("and then when i asked i didn't even get to speak to the supervisor provider i just got hung up on")
```
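The model also accepts a batch of texts, and the LogisticRegression head can return per-class probabilities. A short, hedged example (the second text is made up, and predict_proba availability is assumed from SetFit 1.0):

```python
# Batch inference: pass a list of texts to get one predicted label per text.
texts = [
    "and then when i asked i didn't even get to speak to the supervisor provider i just got hung up on",
    "thank you so much you have been very helpful",  # hypothetical second utterance
]
preds = model(texts)

# Per-class probabilities from the classification head, shape (len(texts), 64).
probs = model.predict_proba(texts)
```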
Training Details
Training Set Metrics
Training set | Min | Median | Max |
---|---|---|---|
Word count | 1 | 32.4224 | 283 |
Label | Training Sample Count |
---|---|
0 | 36 |
1 | 36 |
2 | 36 |
3 | 36 |
4 | 36 |
5 | 36 |
6 | 36 |
7 | 36 |
8 | 36 |
9 | 6 |
10 | 36 |
11 | 36 |
12 | 36 |
13 | 9 |
14 | 36 |
15 | 36 |
16 | 17 |
17 | 36 |
18 | 4 |
19 | 29 |
20 | 30 |
21 | 36 |
22 | 25 |
23 | 36 |
24 | 36 |
25 | 36 |
26 | 4 |
27 | 36 |
28 | 36 |
29 | 4 |
30 | 8 |
31 | 36 |
32 | 4 |
33 | 36 |
34 | 11 |
35 | 36 |
36 | 36 |
37 | 36 |
38 | 10 |
39 | 13 |
40 | 2 |
41 | 36 |
42 | 9 |
43 | 36 |
44 | 10 |
45 | 36 |
46 | 36 |
47 | 14 |
48 | 36 |
49 | 36 |
50 | 36 |
51 | 36 |
52 | 36 |
53 | 36 |
54 | 36 |
55 | 36 |
56 | 36 |
57 | 36 |
58 | 36 |
59 | 8 |
60 | 36 |
61 | 36 |
62 | 36 |
63 | 36 |
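Most labels contribute 36 training examples while rarer labels contribute fewer, which is consistent with sampling up to a fixed number of examples per label. A hedged sketch of drawing such a split with setfit's sample_dataset helper (dataset name and columns are hypothetical):

```python
from datasets import load_dataset
from setfit import sample_dataset

# Hypothetical labeled dataset with "text" and "label" columns.
full_train = load_dataset("your-namespace/your-dataset", split="train")

# Sample up to 36 examples per label; labels with fewer examples keep what they have.
few_shot_train = sample_dataset(full_train, label_column="label", num_samples=36)
```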
Training Hyperparameters
- batch_size: (4, 4)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 25
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
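A minimal sketch of how the hyperparameters listed above could be passed to the SetFit trainer (the training dataset is the hypothetical few_shot_train from the sampling sketch above, and the argument names follow SetFit 1.0):

```python
from setfit import SetFitModel, Trainer, TrainingArguments

model = SetFitModel.from_pretrained("sentence-transformers/all-mpnet-base-v2")

# A subset of the hyperparameters reported above, mapped onto TrainingArguments.
args = TrainingArguments(
    batch_size=(4, 4),                 # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    num_iterations=25,                 # contrastive pair generation
    body_learning_rate=(2e-5, 2e-5),
    head_learning_rate=2e-5,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    end_to_end=False,
    use_amp=False,
    seed=42,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=few_shot_train,      # hypothetical Dataset with "text" and "label" columns
)
trainer.train()
```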
Training Results
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.0000 | 1 | 0.2196 | - |
0.0022 | 50 | 0.2183 | - |
0.0044 | 100 | 0.3574 | - |
0.0065 | 150 | 0.1756 | - |
0.0087 | 200 | 0.1396 | - |
0.0109 | 250 | 0.2875 | - |
0.0131 | 300 | 0.1307 | - |
0.0152 | 350 | 0.1465 | - |
0.0174 | 400 | 0.1503 | - |
0.0196 | 450 | 0.1579 | - |
0.0218 | 500 | 0.3216 | - |
0.0240 | 550 | 0.2399 | - |
0.0261 | 600 | 0.2824 | - |
0.0283 | 650 | 0.1217 | - |
0.0305 | 700 | 0.0647 | - |
0.0327 | 750 | 0.2651 | - |
0.0348 | 800 | 0.1792 | - |
0.0370 | 850 | 0.1461 | - |
0.0392 | 900 | 0.0256 | - |
0.0414 | 950 | 0.1175 | - |
0.0435 | 1000 | 0.2394 | - |
0.0457 | 1050 | 0.1582 | - |
0.0479 | 1100 | 0.2785 | - |
0.0501 | 1150 | 0.0611 | - |
0.0523 | 1200 | 0.1937 | - |
0.0544 | 1250 | 0.0804 | - |
0.0566 | 1300 | 0.0811 | - |
0.0588 | 1350 | 0.0663 | - |
0.0610 | 1400 | 0.2148 | - |
0.0631 | 1450 | 0.0428 | - |
0.0653 | 1500 | 0.0083 | - |
0.0675 | 1550 | 0.0884 | - |
0.0697 | 1600 | 0.1341 | - |
0.0719 | 1650 | 0.0949 | - |
0.0740 | 1700 | 0.1839 | - |
0.0762 | 1750 | 0.2244 | - |
0.0784 | 1800 | 0.0309 | - |
0.0806 | 1850 | 0.0277 | - |
0.0827 | 1900 | 0.2016 | - |
0.0849 | 1950 | 0.1174 | - |
0.0871 | 2000 | 0.0942 | - |
0.0893 | 2050 | 0.0483 | - |
0.0915 | 2100 | 0.2057 | - |
0.0936 | 2150 | 0.0151 | - |
0.0958 | 2200 | 0.023 | - |
0.0980 | 2250 | 0.0514 | - |
0.1002 | 2300 | 0.1541 | - |
0.1023 | 2350 | 0.1426 | - |
0.1045 | 2400 | 0.0187 | - |
0.1067 | 2450 | 0.0386 | - |
0.1089 | 2500 | 0.274 | - |
0.1110 | 2550 | 0.0723 | - |
0.1132 | 2600 | 0.0115 | - |
0.1154 | 2650 | 0.053 | - |
0.1176 | 2700 | 0.2371 | - |
0.1198 | 2750 | 0.2472 | - |
0.1219 | 2800 | 0.0386 | - |
0.1241 | 2850 | 0.0159 | - |
0.1263 | 2900 | 0.0276 | - |
0.1285 | 2950 | 0.1229 | - |
0.1306 | 3000 | 0.0037 | - |
0.1328 | 3050 | 0.0029 | - |
0.1350 | 3100 | 0.0037 | - |
0.1372 | 3150 | 0.022 | - |
0.1394 | 3200 | 0.0389 | - |
0.1415 | 3250 | 0.0146 | - |
0.1437 | 3300 | 0.0034 | - |
0.1459 | 3350 | 0.0721 | - |
0.1481 | 3400 | 0.0462 | - |
0.1502 | 3450 | 0.0039 | - |
0.1524 | 3500 | 0.1225 | - |
0.1546 | 3550 | 0.0009 | - |
0.1568 | 3600 | 0.1005 | - |
0.1590 | 3650 | 0.008 | - |
0.1611 | 3700 | 0.121 | - |
0.1633 | 3750 | 0.2982 | - |
0.1655 | 3800 | 0.008 | - |
0.1677 | 3850 | 0.001 | - |
0.1698 | 3900 | 0.216 | - |
0.1720 | 3950 | 0.0458 | - |
0.1742 | 4000 | 0.0155 | - |
0.1764 | 4050 | 0.1235 | - |
0.1785 | 4100 | 0.0059 | - |
0.1807 | 4150 | 0.2421 | - |
0.1829 | 4200 | 0.2232 | - |
0.1851 | 4250 | 0.0396 | - |
0.1873 | 4300 | 0.2164 | - |
0.1894 | 4350 | 0.0839 | - |
0.1916 | 4400 | 0.0116 | - |
0.1938 | 4450 | 0.2666 | - |
0.1960 | 4500 | 0.0648 | - |
0.1981 | 4550 | 0.074 | - |
0.2003 | 4600 | 0.077 | - |
0.2025 | 4650 | 0.0739 | - |
0.2047 | 4700 | 0.0029 | - |
0.2069 | 4750 | 0.0679 | - |
0.2090 | 4800 | 0.0049 | - |
0.2112 | 4850 | 0.0281 | - |
0.2134 | 4900 | 0.049 | - |
0.2156 | 4950 | 0.0052 | - |
0.2177 | 5000 | 0.1657 | - |
0.2199 | 5050 | 0.0005 | - |
0.2221 | 5100 | 0.0041 | - |
0.2243 | 5150 | 0.0008 | - |
0.2265 | 5200 | 0.0587 | - |
0.2286 | 5250 | 0.0753 | - |
0.2308 | 5300 | 0.1744 | - |
0.2330 | 5350 | 0.0055 | - |
0.2352 | 5400 | 0.0023 | - |
0.2373 | 5450 | 0.0002 | - |
0.2395 | 5500 | 0.0472 | - |
0.2417 | 5550 | 0.0042 | - |
0.2439 | 5600 | 0.0137 | - |
0.2460 | 5650 | 0.1646 | - |
0.2482 | 5700 | 0.0509 | - |
0.2504 | 5750 | 0.0062 | - |
0.2526 | 5800 | 0.0019 | - |
0.2548 | 5850 | 0.0048 | - |
0.2569 | 5900 | 0.0031 | - |
0.2591 | 5950 | 0.0011 | - |
0.2613 | 6000 | 0.004 | - |
0.2635 | 6050 | 0.0498 | - |
0.2656 | 6100 | 0.0042 | - |
0.2678 | 6150 | 0.0018 | - |
0.2700 | 6200 | 0.0061 | - |
0.2722 | 6250 | 0.1355 | - |
0.2744 | 6300 | 0.0039 | - |
0.2765 | 6350 | 0.0044 | - |
0.2787 | 6400 | 0.001 | - |
0.2809 | 6450 | 0.0011 | - |
0.2831 | 6500 | 0.0302 | - |
0.2852 | 6550 | 0.1502 | - |
0.2874 | 6600 | 0.0029 | - |
0.2896 | 6650 | 0.0016 | - |
0.2918 | 6700 | 0.0232 | - |
0.2940 | 6750 | 0.176 | - |
0.2961 | 6800 | 0.0323 | - |
0.2983 | 6850 | 0.0818 | - |
0.3005 | 6900 | 0.0427 | - |
0.3027 | 6950 | 0.1716 | - |
0.3048 | 7000 | 0.0137 | - |
0.3070 | 7050 | 0.0032 | - |
0.3092 | 7100 | 0.0095 | - |
0.3114 | 7150 | 0.177 | - |
0.3135 | 7200 | 0.0005 | - |
0.3157 | 7250 | 0.0157 | - |
0.3179 | 7300 | 0.0012 | - |
0.3201 | 7350 | 0.0027 | - |
0.3223 | 7400 | 0.1351 | - |
0.3244 | 7450 | 0.0019 | - |
0.3266 | 7500 | 0.0009 | - |
0.3288 | 7550 | 0.2017 | - |
0.3310 | 7600 | 0.0059 | - |
0.3331 | 7650 | 0.0013 | - |
0.3353 | 7700 | 0.0377 | - |
0.3375 | 7750 | 0.0056 | - |
0.3397 | 7800 | 0.0055 | - |
0.3419 | 7850 | 0.0745 | - |
0.3440 | 7900 | 0.0046 | - |
0.3462 | 7950 | 0.002 | - |
0.3484 | 8000 | 0.0355 | - |
0.3506 | 8050 | 0.0004 | - |
0.3527 | 8100 | 0.0004 | - |
0.3549 | 8150 | 0.0072 | - |
0.3571 | 8200 | 0.0013 | - |
0.3593 | 8250 | 0.0032 | - |
0.3615 | 8300 | 0.0006 | - |
0.3636 | 8350 | 0.0095 | - |
0.3658 | 8400 | 0.0006 | - |
0.3680 | 8450 | 0.0005 | - |
0.3702 | 8500 | 0.0004 | - |
0.3723 | 8550 | 0.0019 | - |
0.3745 | 8600 | 0.0002 | - |
0.3767 | 8650 | 0.0015 | - |
0.3789 | 8700 | 0.0117 | - |
0.3810 | 8750 | 0.002 | - |
0.3832 | 8800 | 0.0005 | - |
0.3854 | 8850 | 0.0009 | - |
0.3876 | 8900 | 0.0041 | - |
0.3898 | 8950 | 0.0484 | - |
0.3919 | 9000 | 0.0058 | - |
0.3941 | 9050 | 0.0027 | - |
0.3963 | 9100 | 0.0002 | - |
0.3985 | 9150 | 0.2323 | - |
0.4006 | 9200 | 0.0163 | - |
0.4028 | 9250 | 0.0333 | - |
0.4050 | 9300 | 0.0033 | - |
0.4072 | 9350 | 0.0023 | - |
0.4094 | 9400 | 0.0044 | - |
0.4115 | 9450 | 0.0142 | - |
0.4137 | 9500 | 0.0261 | - |
0.4159 | 9550 | 0.004 | - |
0.4181 | 9600 | 0.027 | - |
0.4202 | 9650 | 0.0104 | - |
0.4224 | 9700 | 0.0005 | - |
0.4246 | 9750 | 0.2452 | - |
0.4268 | 9800 | 0.0069 | - |
0.4290 | 9850 | 0.0245 | - |
0.4311 | 9900 | 0.0005 | - |
0.4333 | 9950 | 0.0041 | - |
0.4355 | 10000 | 0.1058 | - |
0.4377 | 10050 | 0.0009 | - |
0.4398 | 10100 | 0.0067 | - |
0.4420 | 10150 | 0.0832 | - |
0.4442 | 10200 | 0.0016 | - |
0.4464 | 10250 | 0.039 | - |
0.4485 | 10300 | 0.0078 | - |
0.4507 | 10350 | 0.0013 | - |
0.4529 | 10400 | 0.0003 | - |
0.4551 | 10450 | 0.0259 | - |
0.4573 | 10500 | 0.008 | - |
0.4594 | 10550 | 0.2137 | - |
0.4616 | 10600 | 0.0083 | - |
0.4638 | 10650 | 0.0206 | - |
0.4660 | 10700 | 0.0039 | - |
0.4681 | 10750 | 0.2205 | - |
0.4703 | 10800 | 0.0072 | - |
0.4725 | 10850 | 0.0436 | - |
0.4747 | 10900 | 0.071 | - |
0.4769 | 10950 | 0.0004 | - |
0.4790 | 11000 | 0.0147 | - |
0.4812 | 11050 | 0.0095 | - |
0.4834 | 11100 | 0.0069 | - |
0.4856 | 11150 | 0.0027 | - |
0.4877 | 11200 | 0.0151 | - |
0.4899 | 11250 | 0.0076 | - |
0.4921 | 11300 | 0.0016 | - |
0.4943 | 11350 | 0.1457 | - |
0.4965 | 11400 | 0.1454 | - |
0.4986 | 11450 | 0.0013 | - |
0.5008 | 11500 | 0.0027 | - |
0.5030 | 11550 | 0.0583 | - |
0.5052 | 11600 | 0.0029 | - |
0.5073 | 11650 | 0.0139 | - |
0.5095 | 11700 | 0.0004 | - |
0.5117 | 11750 | 0.0098 | - |
0.5139 | 11800 | 0.0009 | - |
0.5160 | 11850 | 0.0003 | - |
0.5182 | 11900 | 0.0009 | - |
0.5204 | 11950 | 0.0088 | - |
0.5226 | 12000 | 0.0006 | - |
0.5248 | 12050 | 0.0014 | - |
0.5269 | 12100 | 0.0008 | - |
0.5291 | 12150 | 0.0008 | - |
0.5313 | 12200 | 0.0008 | - |
0.5335 | 12250 | 0.0005 | - |
0.5356 | 12300 | 0.0028 | - |
0.5378 | 12350 | 0.0011 | - |
0.5400 | 12400 | 0.0136 | - |
0.5422 | 12450 | 0.0318 | - |
0.5444 | 12500 | 0.0037 | - |
0.5465 | 12550 | 0.0029 | - |
0.5487 | 12600 | 0.0073 | - |
0.5509 | 12650 | 0.0099 | - |
0.5531 | 12700 | 0.015 | - |
0.5552 | 12750 | 0.0047 | - |
0.5574 | 12800 | 0.0891 | - |
0.5596 | 12850 | 0.0007 | - |
0.5618 | 12900 | 0.0784 | - |
0.5640 | 12950 | 0.0636 | - |
0.5661 | 13000 | 0.0029 | - |
0.5683 | 13050 | 0.0048 | - |
0.5705 | 13100 | 0.0698 | - |
0.5727 | 13150 | 0.0002 | - |
0.5748 | 13200 | 0.0734 | - |
0.5770 | 13250 | 0.0004 | - |
0.5792 | 13300 | 0.0135 | - |
0.5814 | 13350 | 0.0034 | - |
0.5835 | 13400 | 0.0018 | - |
0.5857 | 13450 | 0.0175 | - |
0.5879 | 13500 | 0.0003 | - |
0.5901 | 13550 | 0.0002 | - |
0.5923 | 13600 | 0.0032 | - |
0.5944 | 13650 | 0.0007 | - |
0.5966 | 13700 | 0.0021 | - |
0.5988 | 13750 | 0.0019 | - |
0.6010 | 13800 | 0.0006 | - |
0.6031 | 13850 | 0.0014 | - |
0.6053 | 13900 | 0.0011 | - |
0.6075 | 13950 | 0.2383 | - |
0.6097 | 14000 | 0.0009 | - |
0.6119 | 14050 | 0.0863 | - |
0.6140 | 14100 | 0.0005 | - |
0.6162 | 14150 | 0.0017 | - |
0.6184 | 14200 | 0.0003 | - |
0.6206 | 14250 | 0.0025 | - |
0.6227 | 14300 | 0.0008 | - |
0.6249 | 14350 | 0.0005 | - |
0.6271 | 14400 | 0.0006 | - |
0.6293 | 14450 | 0.0517 | - |
0.6315 | 14500 | 0.0005 | - |
0.6336 | 14550 | 0.0075 | - |
0.6358 | 14600 | 0.0004 | - |
0.6380 | 14650 | 0.0003 | - |
0.6402 | 14700 | 0.0003 | - |
0.6423 | 14750 | 0.0045 | - |
0.6445 | 14800 | 0.0005 | - |
0.6467 | 14850 | 0.0002 | - |
0.6489 | 14900 | 0.0125 | - |
0.6510 | 14950 | 0.0015 | - |
0.6532 | 15000 | 0.0017 | - |
0.6554 | 15050 | 0.0011 | - |
0.6576 | 15100 | 0.0207 | - |
0.6598 | 15150 | 0.0002 | - |
0.6619 | 15200 | 0.0252 | - |
0.6641 | 15250 | 0.0006 | - |
0.6663 | 15300 | 0.0015 | - |
0.6685 | 15350 | 0.0018 | - |
0.6706 | 15400 | 0.0386 | - |
0.6728 | 15450 | 0.0011 | - |
0.6750 | 15500 | 0.0003 | - |
0.6772 | 15550 | 0.0007 | - |
0.6794 | 15600 | 0.0028 | - |
0.6815 | 15650 | 0.0056 | - |
0.6837 | 15700 | 0.0005 | - |
0.6859 | 15750 | 0.0002 | - |
0.6881 | 15800 | 0.0305 | - |
0.6902 | 15850 | 0.0005 | - |
0.6924 | 15900 | 0.0018 | - |
0.6946 | 15950 | 0.0011 | - |
0.6968 | 16000 | 0.0006 | - |
0.6990 | 16050 | 0.0072 | - |
0.7011 | 16100 | 0.0224 | - |
0.7033 | 16150 | 0.0011 | - |
0.7055 | 16200 | 0.0005 | - |
0.7077 | 16250 | 0.0007 | - |
0.7098 | 16300 | 0.0005 | - |
0.7120 | 16350 | 0.0028 | - |
0.7142 | 16400 | 0.0017 | - |
0.7164 | 16450 | 0.2294 | - |
0.7185 | 16500 | 0.0253 | - |
0.7207 | 16550 | 0.0122 | - |
0.7229 | 16600 | 0.0001 | - |
0.7251 | 16650 | 0.0327 | - |
0.7273 | 16700 | 0.0042 | - |
0.7294 | 16750 | 0.0008 | - |
0.7316 | 16800 | 0.0004 | - |
0.7338 | 16850 | 0.0003 | - |
0.7360 | 16900 | 0.0005 | - |
0.7381 | 16950 | 0.0003 | - |
0.7403 | 17000 | 0.0021 | - |
0.7425 | 17050 | 0.2041 | - |
0.7447 | 17100 | 0.0002 | - |
0.7469 | 17150 | 0.0006 | - |
0.7490 | 17200 | 0.0002 | - |
0.7512 | 17250 | 0.0008 | - |
0.7534 | 17300 | 0.068 | - |
0.7556 | 17350 | 0.0016 | - |
0.7577 | 17400 | 0.0006 | - |
0.7599 | 17450 | 0.0005 | - |
0.7621 | 17500 | 0.0011 | - |
0.7643 | 17550 | 0.2192 | - |
0.7665 | 17600 | 0.0006 | - |
0.7686 | 17650 | 0.0003 | - |
0.7708 | 17700 | 0.0017 | - |
0.7730 | 17750 | 0.0033 | - |
0.7752 | 17800 | 0.0001 | - |
0.7773 | 17850 | 0.0011 | - |
0.7795 | 17900 | 0.0302 | - |
0.7817 | 17950 | 0.0004 | - |
0.7839 | 18000 | 0.2921 | - |
0.7860 | 18050 | 0.0001 | - |
0.7882 | 18100 | 0.006 | - |
0.7904 | 18150 | 0.0164 | - |
0.7926 | 18200 | 0.0003 | - |
0.7948 | 18250 | 0.0021 | - |
0.7969 | 18300 | 0.0094 | - |
0.7991 | 18350 | 0.002 | - |
0.8013 | 18400 | 0.0405 | - |
0.8035 | 18450 | 0.001 | - |
0.8056 | 18500 | 0.2594 | - |
0.8078 | 18550 | 0.0075 | - |
0.8100 | 18600 | 0.0003 | - |
0.8122 | 18650 | 0.0009 | - |
0.8144 | 18700 | 0.0018 | - |
0.8165 | 18750 | 0.0007 | - |
0.8187 | 18800 | 0.0006 | - |
0.8209 | 18850 | 0.0009 | - |
0.8231 | 18900 | 0.0003 | - |
0.8252 | 18950 | 0.0006 | - |
0.8274 | 19000 | 0.0002 | - |
0.8296 | 19050 | 0.0004 | - |
0.8318 | 19100 | 0.0018 | - |
0.8340 | 19150 | 0.0007 | - |
0.8361 | 19200 | 0.0005 | - |
0.8383 | 19250 | 0.0206 | - |
0.8405 | 19300 | 0.0005 | - |
0.8427 | 19350 | 0.1918 | - |
0.8448 | 19400 | 0.0093 | - |
0.8470 | 19450 | 0.0032 | - |
0.8492 | 19500 | 0.0004 | - |
0.8514 | 19550 | 0.1727 | - |
0.8535 | 19600 | 0.2034 | - |
0.8557 | 19650 | 0.0007 | - |
0.8579 | 19700 | 0.0004 | - |
0.8601 | 19750 | 0.0001 | - |
0.8623 | 19800 | 0.0024 | - |
0.8644 | 19850 | 0.0122 | - |
0.8666 | 19900 | 0.0003 | - |
0.8688 | 19950 | 0.0093 | - |
0.8710 | 20000 | 0.0003 | - |
0.8731 | 20050 | 0.0007 | - |
0.8753 | 20100 | 0.0044 | - |
0.8775 | 20150 | 0.0006 | - |
0.8797 | 20200 | 0.0002 | - |
0.8819 | 20250 | 0.0003 | - |
0.8840 | 20300 | 0.0024 | - |
0.8862 | 20350 | 0.0051 | - |
0.8884 | 20400 | 0.0767 | - |
0.8906 | 20450 | 0.0004 | - |
0.8927 | 20500 | 0.0002 | - |
0.8949 | 20550 | 0.0007 | - |
0.8971 | 20600 | 0.0012 | - |
0.8993 | 20650 | 0.0004 | - |
0.9015 | 20700 | 0.0003 | - |
0.9036 | 20750 | 0.0002 | - |
0.9058 | 20800 | 0.0005 | - |
0.9080 | 20850 | 0.0007 | - |
0.9102 | 20900 | 0.0006 | - |
0.9123 | 20950 | 0.2469 | - |
0.9145 | 21000 | 0.0002 | - |
0.9167 | 21050 | 0.0009 | - |
0.9189 | 21100 | 0.002 | - |
0.9210 | 21150 | 0.0027 | - |
0.9232 | 21200 | 0.0007 | - |
0.9254 | 21250 | 0.0008 | - |
0.9276 | 21300 | 0.0265 | - |
0.9298 | 21350 | 0.0019 | - |
0.9319 | 21400 | 0.0003 | - |
0.9341 | 21450 | 0.0064 | - |
0.9363 | 21500 | 0.0003 | - |
0.9385 | 21550 | 0.0015 | - |
0.9406 | 21600 | 0.0002 | - |
0.9428 | 21650 | 0.0015 | - |
0.9450 | 21700 | 0.1497 | - |
0.9472 | 21750 | 0.1422 | - |
0.9494 | 21800 | 0.0001 | - |
0.9515 | 21850 | 0.0007 | - |
0.9537 | 21900 | 0.0053 | - |
0.9559 | 21950 | 0.0002 | - |
0.9581 | 22000 | 0.0003 | - |
0.9602 | 22050 | 0.1234 | - |
0.9624 | 22100 | 0.2087 | - |
0.9646 | 22150 | 0.0005 | - |
0.9668 | 22200 | 0.0001 | - |
0.9690 | 22250 | 0.0003 | - |
0.9711 | 22300 | 0.0004 | - |
0.9733 | 22350 | 0.0014 | - |
0.9755 | 22400 | 0.0021 | - |
0.9777 | 22450 | 0.0105 | - |
0.9798 | 22500 | 0.0009 | - |
0.9820 | 22550 | 0.0003 | - |
0.9842 | 22600 | 0.0006 | - |
0.9864 | 22650 | 0.0007 | - |
0.9885 | 22700 | 0.0021 | - |
0.9907 | 22750 | 0.003 | - |
0.9929 | 22800 | 0.0099 | - |
0.9951 | 22850 | 0.001 | - |
0.9973 | 22900 | 0.0521 | - |
0.9994 | 22950 | 0.0003 | - |
Framework Versions
- Python: 3.10.13
- SetFit: 1.0.1
- Sentence Transformers: 2.2.2
- Transformers: 4.36.2
- PyTorch: 2.0.1
- Datasets: 2.16.1
- Tokenizers: 0.15.0
Citation
BibTeX
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}