SetFit with Vishal24/bert-1ds-domain

This is a SetFit model trained on the Vishal24/BCG_classifier dataset that can be used for Text Classification. This SetFit model uses Vishal24/bert-1ds-domain as the Sentence Transformer embedding model. A SetFitHead instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
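As a rough illustration of this two-phase procedure, the sketch below fine-tunes the base embedding model and trains a SetFitHead with SetFit's Trainer. It is not the exact training script: the split and column names ("train", "text", "label") are assumptions about the Vishal24/BCG_classifier schema.

from datasets import load_dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Assumed split and column names; adjust to the actual dataset schema.
train_ds = load_dataset("Vishal24/BCG_classifier", split="train")

# Base Sentence Transformer with a differentiable SetFitHead (2 classes).
model = SetFitModel.from_pretrained(
    "Vishal24/bert-1ds-domain",
    use_differentiable_head=True,
    head_params={"out_features": 2},
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=(16, 2), num_epochs=(3, 3)),
    train_dataset=train_ds,
)
trainer.train()  # 1) contrastive fine-tuning of the body, 2) training of the head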

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: Vishal24/bert-1ds-domain
  • Classification head: a SetFitHead instance
  • Number of Classes: 2
  • Training Dataset: Vishal24/BCG_classifier

Model Sources

  • Repository: SetFit on GitHub (https://github.com/huggingface/setfit)
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Label  Examples
0      'mois', 'time skincare soap', 'paraben free'
1      'tomato ketchup 1kg flipkart', 'sunsilk keratin yogurt shampoo lusciously thick long', 'wow aloevera soap'

Evaluation

Metrics

Label  F1
all    0.9233
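As a sanity check, the reported F1 could be recomputed along these lines; this is a sketch, assuming a held-out "test" split with "text" and "label" columns and weighted averaging (neither is documented here).

from datasets import load_dataset
from setfit import SetFitModel
from sklearn.metrics import f1_score

model = SetFitModel.from_pretrained("Vishal24/BCG-classifier")

# Assumed split and column names; adjust to the actual dataset schema.
eval_ds = load_dataset("Vishal24/BCG_classifier", split="test")
preds = [int(p) for p in model.predict(eval_ds["text"])]

print(f1_score(eval_ds["label"], preds, average="weighted"))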

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference:

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("Vishal24/BCG-classifier")
# Run inference
preds = model("hazelnut")
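The model also accepts batch inputs and can return class probabilities; a short sketch (the example strings are illustrative):

# Batch prediction over several product titles
preds = model.predict(["tomato ketchup 1kg flipkart", "paraben free"])

# Per-class probabilities from the SetFitHead
probs = model.predict_proba(["hazelnut"])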

Training Details

Training Set Metrics

Training set  Min  Median  Max
Word count    1    3.4474  19

Label  Training Sample Count
0      2252
1      1262
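These statistics can be recomputed from the dataset; a minimal sketch, assuming a "train" split with "text" and "label" columns (the actual schema of Vishal24/BCG_classifier may differ):

from collections import Counter

import numpy as np
from datasets import load_dataset

# Assumed split and column names; adjust to the actual dataset schema.
ds = load_dataset("Vishal24/BCG_classifier", split="train")

word_counts = [len(text.split()) for text in ds["text"]]
print(min(word_counts), np.median(word_counts), max(word_counts))  # min / median / max word count
print(Counter(ds["label"]))                                        # samples per label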

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (3, 3)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 20
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
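For reference, these values map onto SetFit's TrainingArguments roughly as follows (a sketch, not the original training script; distance_metric is left at its default, cosine_distance, and margin only affects triplet-style losses):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 2),            # (embedding phase, classifier phase)
    num_epochs=(3, 3),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)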

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.2765 -
0.0057 50 0.2529 -
0.0114 100 0.252 -
0.0171 150 0.2657 -
0.0228 200 0.2735 -
0.0285 250 0.236 -
0.0341 300 0.2366 -
0.0398 350 0.2316 -
0.0455 400 0.185 -
0.0512 450 0.1396 -
0.0569 500 0.2137 -
0.0626 550 0.093 -
0.0683 600 0.1219 -
0.0740 650 0.0974 -
0.0797 700 0.2257 -
0.0854 750 0.0951 -
0.0911 800 0.0994 -
0.0968 850 0.0752 -
0.1024 900 0.0848 -
0.1081 950 0.015 -
0.1138 1000 0.0541 -
0.1195 1050 0.0357 -
0.1252 1100 0.0314 -
0.1309 1150 0.0557 -
0.1366 1200 0.0027 -
0.1423 1250 0.0387 -
0.1480 1300 0.0026 -
0.1537 1350 0.044 -
0.1594 1400 0.0499 -
0.1651 1450 0.001 -
0.1707 1500 0.0007 -
0.1764 1550 0.0008 -
0.1821 1600 0.0009 -
0.1878 1650 0.053 -
0.1935 1700 0.1111 -
0.1992 1750 0.0018 -
0.2049 1800 0.0009 -
0.2106 1850 0.0008 -
0.2163 1900 0.0011 -
0.2220 1950 0.0042 -
0.2277 2000 0.0005 -
0.2334 2050 0.0023 -
0.2390 2100 0.0003 -
0.2447 2150 0.0004 -
0.2504 2200 0.055 -
0.2561 2250 0.0584 -
0.2618 2300 0.06 -
0.2675 2350 0.0004 -
0.2732 2400 0.0022 -
0.2789 2450 0.0005 -
0.2846 2500 0.0014 -
0.2903 2550 0.0008 -
0.2960 2600 0.0004 -
0.3017 2650 0.0118 -
0.3073 2700 0.0892 -
0.3130 2750 0.0004 -
0.3187 2800 0.0061 -
0.3244 2850 0.0601 -
0.3301 2900 0.0003 -
0.3358 2950 0.0007 -
0.3415 3000 0.0006 -
0.3472 3050 0.0002 -
0.3529 3100 0.0002 -
0.3586 3150 0.0005 -
0.3643 3200 0.0003 -
0.3699 3250 0.0002 -
0.3756 3300 0.0008 -
0.3813 3350 0.0002 -
0.3870 3400 0.0513 -
0.3927 3450 0.0003 -
0.3984 3500 0.0002 -
0.4041 3550 0.0006 -
0.4098 3600 0.0005 -
0.4155 3650 0.0003 -
0.4212 3700 0.0002 -
0.4269 3750 0.0002 -
0.4326 3800 0.0005 -
0.4382 3850 0.0001 -
0.4439 3900 0.0002 -
0.4496 3950 0.0001 -
0.4553 4000 0.0003 -
0.4610 4050 0.0001 -
0.4667 4100 0.0595 -
0.4724 4150 0.0002 -
0.4781 4200 0.0001 -
0.4838 4250 0.0002 -
0.4895 4300 0.0001 -
0.4952 4350 0.0002 -
0.5009 4400 0.0001 -
0.5065 4450 0.0001 -
0.5122 4500 0.0002 -
0.5179 4550 0.0001 -
0.5236 4600 0.0014 -
0.5293 4650 0.0001 -
0.5350 4700 0.0001 -
0.5407 4750 0.0002 -
0.5464 4800 0.0001 -
0.5521 4850 0.0419 -
0.5578 4900 0.0001 -
0.5635 4950 0.0001 -
0.5692 5000 0.0001 -
0.5748 5050 0.0001 -
0.5805 5100 0.0001 -
0.5862 5150 0.0001 -
0.5919 5200 0.0001 -
0.5976 5250 0.0001 -
0.6033 5300 0.0001 -
0.6090 5350 0.0001 -
0.6147 5400 0.0 -
0.6204 5450 0.0 -
0.6261 5500 0.0001 -
0.6318 5550 0.0 -
0.6375 5600 0.0001 -
0.6431 5650 0.0001 -
0.6488 5700 0.0006 -
0.6545 5750 0.0001 -
0.6602 5800 0.0001 -
0.6659 5850 0.0001 -
0.6716 5900 0.0001 -
0.6773 5950 0.0001 -
0.6830 6000 0.0002 -
0.6887 6050 0.0002 -
0.6944 6100 0.0001 -
0.7001 6150 0.0001 -
0.7057 6200 0.0001 -
0.7114 6250 0.0 -
0.7171 6300 0.0001 -
0.7228 6350 0.0001 -
0.7285 6400 0.0001 -
0.7342 6450 0.0001 -
0.7399 6500 0.0002 -
0.7456 6550 0.0001 -
0.7513 6600 0.0001 -
0.7570 6650 0.0 -
0.7627 6700 0.0001 -
0.7684 6750 0.0001 -
0.7740 6800 0.0001 -
0.7797 6850 0.0003 -
0.7854 6900 0.0515 -
0.7911 6950 0.0001 -
0.7968 7000 0.0003 -
0.8025 7050 0.0001 -
0.8082 7100 0.0001 -
0.8139 7150 0.0001 -
0.8196 7200 0.0 -
0.8253 7250 0.0001 -
0.8310 7300 0.0 -
0.8367 7350 0.0001 -
0.8423 7400 0.0001 -
0.8480 7450 0.0001 -
0.8537 7500 0.0001 -
0.8594 7550 0.0 -
0.8651 7600 0.0 -
0.8708 7650 0.0 -
0.8765 7700 0.0 -
0.8822 7750 0.0014 -
0.8879 7800 0.0001 -
0.8936 7850 0.0001 -
0.8993 7900 0.0 -
0.9050 7950 0.0001 -
0.9106 8000 0.0002 -
0.9163 8050 0.0001 -
0.9220 8100 0.0 -
0.9277 8150 0.0 -
0.9334 8200 0.0001 -
0.9391 8250 0.0 -
0.9448 8300 0.0001 -
0.9505 8350 0.0004 -
0.9562 8400 0.0001 -
0.9619 8450 0.0 -
0.9676 8500 0.001 -
0.9732 8550 0.0001 -
0.9789 8600 0.0001 -
0.9846 8650 0.0 -
0.9903 8700 0.0 -
0.9960 8750 0.0001 -
1.0017 8800 0.0002 -
1.0074 8850 0.0 -
1.0131 8900 0.0 -
1.0188 8950 0.0 -
1.0245 9000 0.0001 -
1.0302 9050 0.0 -
1.0359 9100 0.0 -
1.0415 9150 0.0 -
1.0472 9200 0.0 -
1.0529 9250 0.0 -
1.0586 9300 0.0 -
1.0643 9350 0.0 -
1.0700 9400 0.0001 -
1.0757 9450 0.0 -
1.0814 9500 0.0 -
1.0871 9550 0.0 -
1.0928 9600 0.0 -
1.0985 9650 0.0 -
1.1042 9700 0.0001 -
1.1098 9750 0.0002 -
1.1155 9800 0.0097 -
1.1212 9850 0.0 -
1.1269 9900 0.0 -
1.1326 9950 0.0001 -
1.1383 10000 0.0 -
1.1440 10050 0.0 -
1.1497 10100 0.0001 -
1.1554 10150 0.0004 -
1.1611 10200 0.0 -
1.1668 10250 0.0 -
1.1725 10300 0.0 -
1.1781 10350 0.0 -
1.1838 10400 0.0001 -
1.1895 10450 0.0 -
1.1952 10500 0.0 -
1.2009 10550 0.0 -
1.2066 10600 0.0 -
1.2123 10650 0.0 -
1.2180 10700 0.0001 -
1.2237 10750 0.0 -
1.2294 10800 0.0 -
1.2351 10850 0.0001 -
1.2408 10900 0.0305 -
1.2464 10950 0.0617 -
1.2521 11000 0.0 -
1.2578 11050 0.0 -
1.2635 11100 0.0 -
1.2692 11150 0.0 -
1.2749 11200 0.0 -
1.2806 11250 0.0 -
1.2863 11300 0.0 -
1.2920 11350 0.0 -
1.2977 11400 0.0 -
1.3034 11450 0.0 -
1.3090 11500 0.0 -
1.3147 11550 0.0 -
1.3204 11600 0.0 -
1.3261 11650 0.0 -
1.3318 11700 0.0 -
1.3375 11750 0.0 -
1.3432 11800 0.0 -
1.3489 11850 0.0 -
1.3546 11900 0.0 -
1.3603 11950 0.0 -
1.3660 12000 0.0 -
1.3717 12050 0.0 -
1.3773 12100 0.0 -
1.3830 12150 0.0 -
1.3887 12200 0.0 -
1.3944 12250 0.0 -
1.4001 12300 0.0 -
1.4058 12350 0.0 -
1.4115 12400 0.0 -
1.4172 12450 0.0 -
1.4229 12500 0.0 -
1.4286 12550 0.0 -
1.4343 12600 0.0 -
1.4400 12650 0.0 -
1.4456 12700 0.0 -
1.4513 12750 0.0 -
1.4570 12800 0.0 -
1.4627 12850 0.0 -
1.4684 12900 0.0 -
1.4741 12950 0.0 -
1.4798 13000 0.0 -
1.4855 13050 0.0 -
1.4912 13100 0.0 -
1.4969 13150 0.0001 -
1.5026 13200 0.0 -
1.5083 13250 0.0 -
1.5139 13300 0.0 -
1.5196 13350 0.0 -
1.5253 13400 0.0 -
1.5310 13450 0.0 -
1.5367 13500 0.0001 -
1.5424 13550 0.0 -
1.5481 13600 0.0 -
1.5538 13650 0.0 -
1.5595 13700 0.0001 -
1.5652 13750 0.0001 -
1.5709 13800 0.0 -
1.5766 13850 0.0001 -
1.5822 13900 0.0 -
1.5879 13950 0.0 -
1.5936 14000 0.0 -
1.5993 14050 0.0 -
1.6050 14100 0.0 -
1.6107 14150 0.0 -
1.6164 14200 0.0 -
1.6221 14250 0.0 -
1.6278 14300 0.0 -
1.6335 14350 0.0 -
1.6392 14400 0.0 -
1.6448 14450 0.0 -
1.6505 14500 0.0 -
1.6562 14550 0.0 -
1.6619 14600 0.0 -
1.6676 14650 0.0 -
1.6733 14700 0.0 -
1.6790 14750 0.0 -
1.6847 14800 0.0 -
1.6904 14850 0.0 -
1.6961 14900 0.0 -
1.7018 14950 0.0 -
1.7075 15000 0.0 -
1.7131 15050 0.0 -
1.7188 15100 0.0 -
1.7245 15150 0.0001 -
1.7302 15200 0.0 -
1.7359 15250 0.0 -
1.7416 15300 0.0002 -
1.7473 15350 0.0 -
1.7530 15400 0.0 -
1.7587 15450 0.0 -
1.7644 15500 0.0 -
1.7701 15550 0.0 -
1.7758 15600 0.0 -
1.7814 15650 0.0 -
1.7871 15700 0.0 -
1.7928 15750 0.0 -
1.7985 15800 0.0 -
1.8042 15850 0.0 -
1.8099 15900 0.0 -
1.8156 15950 0.0 -
1.8213 16000 0.0 -
1.8270 16050 0.0 -
1.8327 16100 0.0 -
1.8384 16150 0.0001 -
1.8441 16200 0.0 -
1.8497 16250 0.0 -
1.8554 16300 0.0 -
1.8611 16350 0.0 -
1.8668 16400 0.0 -
1.8725 16450 0.0 -
1.8782 16500 0.0 -
1.8839 16550 0.0 -
1.8896 16600 0.0 -
1.8953 16650 0.0 -
1.9010 16700 0.0 -
1.9067 16750 0.0 -
1.9124 16800 0.0 -
1.9180 16850 0.0 -
1.9237 16900 0.0 -
1.9294 16950 0.0 -
1.9351 17000 0.0 -
1.9408 17050 0.0 -
1.9465 17100 0.0 -
1.9522 17150 0.0 -
1.9579 17200 0.0 -
1.9636 17250 0.0 -
1.9693 17300 0.0 -
1.9750 17350 0.0 -
1.9806 17400 0.0 -
1.9863 17450 0.0 -
1.9920 17500 0.0 -
1.9977 17550 0.0 -
2.0034 17600 0.0 -
2.0091 17650 0.0 -
2.0148 17700 0.0 -
2.0205 17750 0.0 -
2.0262 17800 0.0 -
2.0319 17850 0.0523 -
2.0376 17900 0.0 -
2.0433 17950 0.0 -
2.0489 18000 0.0 -
2.0546 18050 0.0 -
2.0603 18100 0.0 -
2.0660 18150 0.0 -
2.0717 18200 0.0 -
2.0774 18250 0.0 -
2.0831 18300 0.0 -
2.0888 18350 0.0 -
2.0945 18400 0.0 -
2.1002 18450 0.0 -
2.1059 18500 0.0 -
2.1116 18550 0.0 -
2.1172 18600 0.0 -
2.1229 18650 0.0 -
2.1286 18700 0.0 -
2.1343 18750 0.0 -
2.1400 18800 0.0 -
2.1457 18850 0.0 -
2.1514 18900 0.0 -
2.1571 18950 0.0 -
2.1628 19000 0.0 -
2.1685 19050 0.0 -
2.1742 19100 0.0 -
2.1799 19150 0.0 -
2.1855 19200 0.0 -
2.1912 19250 0.0 -
2.1969 19300 0.0 -
2.2026 19350 0.0 -
2.2083 19400 0.0 -
2.2140 19450 0.0 -
2.2197 19500 0.0 -
2.2254 19550 0.0 -
2.2311 19600 0.0 -
2.2368 19650 0.0 -
2.2425 19700 0.0 -
2.2482 19750 0.0 -
2.2538 19800 0.0 -
2.2595 19850 0.0 -
2.2652 19900 0.0 -
2.2709 19950 0.0 -
2.2766 20000 0.0 -
2.2823 20050 0.0 -
2.2880 20100 0.0 -
2.2937 20150 0.0 -
2.2994 20200 0.0 -
2.3051 20250 0.0 -
2.3108 20300 0.0 -
2.3164 20350 0.0 -
2.3221 20400 0.0 -
2.3278 20450 0.0 -
2.3335 20500 0.0 -
2.3392 20550 0.0 -
2.3449 20600 0.0 -
2.3506 20650 0.0 -
2.3563 20700 0.0 -
2.3620 20750 0.0 -
2.3677 20800 0.0 -
2.3734 20850 0.0 -
2.3791 20900 0.0 -
2.3847 20950 0.0 -
2.3904 21000 0.0 -
2.3961 21050 0.0 -
2.4018 21100 0.0 -
2.4075 21150 0.0 -
2.4132 21200 0.0 -
2.4189 21250 0.0 -
2.4246 21300 0.0 -
2.4303 21350 0.0 -
2.4360 21400 0.0 -
2.4417 21450 0.0 -
2.4474 21500 0.0 -
2.4530 21550 0.0 -
2.4587 21600 0.0 -
2.4644 21650 0.0 -
2.4701 21700 0.0 -
2.4758 21750 0.0 -
2.4815 21800 0.0 -
2.4872 21850 0.0 -
2.4929 21900 0.0 -
2.4986 21950 0.0 -
2.5043 22000 0.0 -
2.5100 22050 0.0 -
2.5157 22100 0.0 -
2.5213 22150 0.0 -
2.5270 22200 0.0 -
2.5327 22250 0.0 -
2.5384 22300 0.0 -
2.5441 22350 0.0 -
2.5498 22400 0.0 -
2.5555 22450 0.0 -
2.5612 22500 0.0 -
2.5669 22550 0.0 -
2.5726 22600 0.0 -
2.5783 22650 0.0 -
2.5839 22700 0.0 -
2.5896 22750 0.0 -
2.5953 22800 0.0 -
2.6010 22850 0.0 -
2.6067 22900 0.0 -
2.6124 22950 0.0 -
2.6181 23000 0.0 -
2.6238 23050 0.0 -
2.6295 23100 0.0 -
2.6352 23150 0.0 -
2.6409 23200 0.0 -
2.6466 23250 0.0 -
2.6522 23300 0.0 -
2.6579 23350 0.0 -
2.6636 23400 0.0 -
2.6693 23450 0.0 -
2.6750 23500 0.0 -
2.6807 23550 0.0 -
2.6864 23600 0.0 -
2.6921 23650 0.0 -
2.6978 23700 0.0 -
2.7035 23750 0.0 -
2.7092 23800 0.0 -
2.7149 23850 0.0 -
2.7205 23900 0.0 -
2.7262 23950 0.0 -
2.7319 24000 0.0 -
2.7376 24050 0.0 -
2.7433 24100 0.0 -
2.7490 24150 0.0 -
2.7547 24200 0.0 -
2.7604 24250 0.0 -
2.7661 24300 0.0 -
2.7718 24350 0.0 -
2.7775 24400 0.0 -
2.7832 24450 0.0 -
2.7888 24500 0.0 -
2.7945 24550 0.0 -
2.8002 24600 0.0 -
2.8059 24650 0.0 -
2.8116 24700 0.0 -
2.8173 24750 0.0 -
2.8230 24800 0.0 -
2.8287 24850 0.0 -
2.8344 24900 0.0 -
2.8401 24950 0.0 -
2.8458 25000 0.0 -
2.8515 25050 0.0 -
2.8571 25100 0.0 -
2.8628 25150 0.0 -
2.8685 25200 0.0 -
2.8742 25250 0.0 -
2.8799 25300 0.0 -
2.8856 25350 0.0 -
2.8913 25400 0.0 -
2.8970 25450 0.0 -
2.9027 25500 0.0 -
2.9084 25550 0.0 -
2.9141 25600 0.0 -
2.9197 25650 0.0 -
2.9254 25700 0.0 -
2.9311 25750 0.0 -
2.9368 25800 0.0 -
2.9425 25850 0.0 -
2.9482 25900 0.0 -
2.9539 25950 0.0 -
2.9596 26000 0.0 -
2.9653 26050 0.0 -
2.9710 26100 0.0 -
2.9767 26150 0.0 -
2.9824 26200 0.0 -
2.9880 26250 0.0 -
2.9937 26300 0.0 -
2.9994 26350 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 3.3.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.0+cu118
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1
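
To approximately reproduce this environment, the framework versions above can be pinned at install time (the PyTorch 2.1.0+cu118 build typically comes from the CUDA-specific PyTorch index for your platform):

pip install setfit==1.0.3 sentence-transformers==3.3.1 transformers==4.41.2 datasets==2.20.0 tokenizers==0.19.1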

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}