SetFit with klue/roberta-base

This is a SetFit model for text classification. It uses klue/roberta-base as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
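
The two steps above can be reproduced with the setfit trainer API. The snippet below is a minimal sketch rather than the exact script used for this model: the toy dataset, its column names, and the reduced epoch count are assumptions for brevity.

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Start from the same Sentence Transformer body used by this model.
model = SetFitModel.from_pretrained("klue/roberta-base")

# Tiny illustrative few-shot dataset (hypothetical product titles and label ids).
train_dataset = Dataset.from_dict({
    "text": ["예시 상품명 A", "예시 상품명 B"],
    "label": [0, 1],
})

args = TrainingArguments(batch_size=64, num_epochs=1)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)

# Step 1: contrastive fine-tuning of the embedding body.
# Step 2: fitting the LogisticRegression head on the fine-tuned embeddings.
trainer.train()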

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: klue/roberta-base
  • Classification head: a LogisticRegression instance
  • Maximum Sequence Length: 512 tokens
  • Number of Classes: 7
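
The body and head above can be inspected directly after loading; this is a minimal sketch, assuming the setfit 1.x attribute names model_body and model_head.

from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_item_top_bt12")
print(model.model_body)                       # SentenceTransformer built on klue/roberta-base
print(model.model_head)                       # scikit-learn LogisticRegression head
print(model.model_body.get_max_seq_length())  # expected: 512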

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label 6
  • '이희 마블 에센스 헤어 팩트 다크브라운 본품 LotteOn > 뷰티 > 헤어스타일링 > 염색약 LotteOn > 뷰티 > 헤어스타일링 > 염색약'
  • '더마클라센 스타일앤 볼륨짱짱 흑채 스프레이 블랙 120ml x5 MinSellAmount (#M)바디/헤어>헤어스타일링>염색약 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 염색약'
  • '이희 마블 에센스 헤어 팩트 블랙 본품 (#M)홈>화장품/미용>헤어케어>탈모케어 Naverstore > 화장품/미용 > 헤어케어 > 탈모케어'
Label 2
  • '웰라 크레아틴 플러스 쉐이프 N 펌 에멀전/건강/파마약 (#M)화장품/미용>헤어스타일링>파마약>웨이브 AD > traverse > Naverstore > 화장품/미용 > 헤어케어 > 파마약 > 웨이브'
  • '아모스 루미네이터 익스트림/하드/노멀/소프트/택 MinSellAmount (#M)바디/헤어>헤어스타일링>탈색제 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 탈색제'
  • '아모스 실키블루밍 펌 1제2제 SET 파마약 MinSellAmount (#M)바디/헤어>헤어스타일링>파마약 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 파마약'
Label 5
  • '300ml펌프형 아르드포 헤어젤 (#M)SSG.COM/헤어/바디/헤어스타일링/헤어왁스/젤 ssg > 뷰티 > 헤어/바디 > 헤어스타일링 > 헤어왁스/젤'
  • '아르드포 헤어케어 헤어젤 180ml (#M)SSG.COM/헤어/바디/헤어기기/소품/기타헤어기기 ssg > 뷰티 > 헤어/바디 > 헤어기기/소품 > 기타헤어기기'
  • '(NC)LG 아르드포 헤어젤 튜브 180ml (#M)SSG.COM/헤어/바디/헤어기기/소품/기타헤어기기 ssg > 뷰티 > 헤어/바디 > 헤어기기/소품 > 기타헤어기기'
Label 0
  • '엘라스틴 살롱드컬러 새치염색약 100g x3개 +샴푸 증정 03)밝은갈색 ssg > 뷰티 > 헤어/바디 > 헤어스타일링 > 염색약;ssg > 뷰티 > 미용기기/소품 > 바디관리기기;ssg > 뷰티 > 헤어/바디 > 헤어케어 > 샴푸;ssg > 뷰티 > 헤어/바디 > 헤어케어;ssg > 뷰티 > 헤어/바디 > 헤어스타일링 ssg > 뷰티 > 헤어/바디 > 헤어스타일링'
  • '댕기머리 포르테 프레스티지 4종옵션 /한방칼라크림 새치머리 염색약 4호 (자연갈색) (#M)11st>헤어케어>염색약>새치용염색약 11st > 뷰티 > 헤어케어 > 염색약 > 새치용염색약'
  • '리엔 흑모비책 골드 염색약 1입 x3개 자연갈색 (#M)바디/헤어>헤어스타일링>염색약 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 염색약'
Label 4
  • '아르드포 헤어스프레이 280ml (#M)SSG.COM/헤어/바디/헤어기기/소품/기타헤어기기 ssg > 뷰티 > 헤어/바디 > 헤어기기/소품 > 기타헤어기기'
  • '꽃을든남자 헤어케어시스템 헤어 스프레이(달콤한과일향) 300ml x 5개 MinSellAmount (#M)바디/헤어>헤어스타일링>헤어스프레이 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 헤어스프레이'
  • '헤드스파7 블루밍매직 헤어스타일러 50ml MinSellAmount (#M)바디/헤어>헤어케어>헤어트리트먼트 Gmarket > 뷰티 > 바디/헤어 > 헤어케어 > 헤어트리트먼트'
Label 1
  • '미쟝센 컬링에센스2X 숏스타일 150ml x2 LotteOn > 뷰티 > 헤어/바디 > 헤어케어 > 트리트먼트/헤어팩 LotteOn > 뷰티 > 헤어/바디 > 헤어케어 > 트리트먼트/헤어팩'
  • '미쟝센 컬링에센스2X 숏스타일 230ml 미쟝센 컬링에센스2X 숏스타일 230ml 홈>헤어케어>스타일링/에센스X>헤어에센스X;홈>헤어케어>스타일링/에센스>헤어에센스;(#M)홈>헤어케어>에센스>에센스 OLIVEYOUNG > 헤어케어 > 에센스 > 에센스'
  • '4개)미쟝센스테이지컬렉션 컬링에센스2X 탄력웨이브150ml 선택없음 Coupang > 뷰티 > 헤어 > 헤어스타일링 > 컬크림;(#M)쿠팡 홈>뷰티>헤어>헤어스타일링>컬크림 Coupang > 뷰티 > 헤어 > 헤어스타일링 > 컬크림'
Label 3
  • '128 브러쉬 단품없음 LotteOn > 뷰티 > 뷰티기기 > 액세서리/소모품 LotteOn > 뷰티 > 뷰티기기 > 액세서리/소모품'
  • '리엔 (엘라스틴) 살롱드 컬러 팡팡 헤어쿠션 (짙은갈색) x 3개 짙은갈색 (#M)바디/헤어>헤어스타일링>염색약 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 염색약'
  • '[이지피지] 해피펀치 헤어 커버스틱 3.5g (옵션) 옵션:1호 라이트 헤어 쿠팡 홈>뷰티>뷰티소품>피부관리기>롤러/마사지기;쿠팡 홈>선물스토어>생일선물>여성선물>이미용가전>롤링미용기기;쿠팡 홈>선물스토어>생일>이미용가전>셀프스킨케어>롤링미용기기;(#M)쿠팡 홈>뷰티>헤어>염색/파마>헤어메이크업>헤어섀도/마스카라 Coupang > 뷰티 > 헤어 > 염색/파마 > 헤어메이크업 > 헤어섀도/마스카라'

Evaluation

Metrics

Label | Accuracy
all   | 0.9465
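
An accuracy figure like the one above can be recomputed on a held-out split. The snippet below is a hedged sketch: the evaluation texts and label ids are placeholders, not the actual evaluation data.

from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_item_top_bt12")

# Placeholder evaluation data; substitute the real held-out texts and label ids.
eval_texts = ["예시 상품명 A", "예시 상품명 B"]
eval_labels = [4, 1]

preds = model.predict(eval_texts)
accuracy = sum(int(p) == y for p, y in zip(preds, eval_labels)) / len(eval_labels)
print(f"accuracy: {accuracy:.4f}")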

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference:

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_item_top_bt12")
# Run inference
preds = model("아모스 스타일 익스프레션 몰딩 글레이즈300ml MinSellAmount (#M)바디/헤어>헤어스타일링>헤어글레이즈 Gmarket > 뷰티 > 바디/헤어 > 헤어스타일링 > 헤어글레이즈")
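
If class probabilities over the 7 labels are needed instead of a single label id, setfit also exposes predict_proba. Continuing from the snippet above (the product title is an arbitrary example):

# Probability distribution over the 7 classes for each input text.
probs = model.predict_proba([
    "미쟝센 컬링에센스2X 숏스타일 150ml",
])
print(probs.shape)  # expected: (1, 7)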

Training Details

Training Set Metrics

Training set | Min | Median  | Max
Word count   | 11  | 22.4371 | 93

Label | Training Sample Count
0     | 50
1     | 50
2     | 50
3     | 50
4     | 50
5     | 50
6     | 50

Training Hyperparameters

  • batch_size: (64, 64)
  • num_epochs: (30, 30)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 100
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • l2_weight: 0.01
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
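
These values map directly onto setfit's TrainingArguments. The sketch below shows that mapping for setfit 1.1.0 and is illustrative rather than the exact training script; distance_metric, margin, and eval_max_steps above correspond to the library defaults and are omitted.

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(64, 64),                # (embedding phase, classifier phase)
    num_epochs=(30, 30),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=100,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    load_best_model_at_end=False,
)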

Training Results

Epoch Step Training Loss Validation Loss
0.0018 1 0.5286 -
0.0914 50 0.4469 -
0.1828 100 0.4235 -
0.2742 150 0.361 -
0.3656 200 0.2736 -
0.4570 250 0.1705 -
0.5484 300 0.0988 -
0.6399 350 0.0709 -
0.7313 400 0.0516 -
0.8227 450 0.0467 -
0.9141 500 0.0477 -
1.0055 550 0.0442 -
1.0969 600 0.0241 -
1.1883 650 0.0238 -
1.2797 700 0.0213 -
1.3711 750 0.0248 -
1.4625 800 0.0202 -
1.5539 850 0.0209 -
1.6453 900 0.0206 -
1.7367 950 0.0203 -
1.8282 1000 0.0229 -
1.9196 1050 0.011 -
2.0110 1100 0.0003 -
2.1024 1150 0.0002 -
2.1938 1200 0.0002 -
2.2852 1250 0.0001 -
2.3766 1300 0.0003 -
2.4680 1350 0.0001 -
2.5594 1400 0.0001 -
2.6508 1450 0.0 -
2.7422 1500 0.0 -
2.8336 1550 0.0 -
2.9250 1600 0.0 -
3.0165 1650 0.0 -
3.1079 1700 0.0 -
3.1993 1750 0.0 -
3.2907 1800 0.0 -
3.3821 1850 0.0 -
3.4735 1900 0.0 -
3.5649 1950 0.0004 -
3.6563 2000 0.0003 -
3.7477 2050 0.0004 -
3.8391 2100 0.001 -
3.9305 2150 0.0005 -
4.0219 2200 0.0 -
4.1133 2250 0.0 -
4.2048 2300 0.0 -
4.2962 2350 0.0 -
4.3876 2400 0.0 -
4.4790 2450 0.0 -
4.5704 2500 0.0 -
4.6618 2550 0.0 -
4.7532 2600 0.0 -
4.8446 2650 0.0 -
4.9360 2700 0.0003 -
5.0274 2750 0.0 -
5.1188 2800 0.0 -
5.2102 2850 0.0 -
5.3016 2900 0.0 -
5.3931 2950 0.0 -
5.4845 3000 0.0 -
5.5759 3050 0.0 -
5.6673 3100 0.0 -
5.7587 3150 0.0 -
5.8501 3200 0.0 -
5.9415 3250 0.0 -
6.0329 3300 0.0 -
6.1243 3350 0.0001 -
6.2157 3400 0.0009 -
6.3071 3450 0.0008 -
6.3985 3500 0.0007 -
6.4899 3550 0.0001 -
6.5814 3600 0.0 -
6.6728 3650 0.0 -
6.7642 3700 0.0 -
6.8556 3750 0.0 -
6.9470 3800 0.0 -
7.0384 3850 0.0 -
7.1298 3900 0.0 -
7.2212 3950 0.0 -
7.3126 4000 0.0 -
7.4040 4050 0.0 -
7.4954 4100 0.0 -
7.5868 4150 0.0 -
7.6782 4200 0.0 -
7.7697 4250 0.0002 -
7.8611 4300 0.0 -
7.9525 4350 0.0 -
8.0439 4400 0.0 -
8.1353 4450 0.0 -
8.2267 4500 0.0 -
8.3181 4550 0.0 -
8.4095 4600 0.0 -
8.5009 4650 0.0 -
8.5923 4700 0.0 -
8.6837 4750 0.0 -
8.7751 4800 0.0 -
8.8665 4850 0.0 -
8.9580 4900 0.0 -
9.0494 4950 0.0 -
9.1408 5000 0.0 -
9.2322 5050 0.0 -
9.3236 5100 0.0 -
9.4150 5150 0.0 -
9.5064 5200 0.0003 -
9.5978 5250 0.0013 -
9.6892 5300 0.0 -
9.7806 5350 0.0 -
9.8720 5400 0.0 -
9.9634 5450 0.0 -
10.0548 5500 0.0 -
10.1463 5550 0.0 -
10.2377 5600 0.0 -
10.3291 5650 0.0 -
10.4205 5700 0.0 -
10.5119 5750 0.0 -
10.6033 5800 0.0 -
10.6947 5850 0.0 -
10.7861 5900 0.0 -
10.8775 5950 0.0 -
10.9689 6000 0.0 -
11.0603 6050 0.0 -
11.1517 6100 0.0 -
11.2431 6150 0.0 -
11.3346 6200 0.0 -
11.4260 6250 0.0 -
11.5174 6300 0.0 -
11.6088 6350 0.0 -
11.7002 6400 0.0 -
11.7916 6450 0.0 -
11.8830 6500 0.0 -
11.9744 6550 0.0 -
12.0658 6600 0.0 -
12.1572 6650 0.0 -
12.2486 6700 0.0 -
12.3400 6750 0.0 -
12.4314 6800 0.0 -
12.5229 6850 0.0 -
12.6143 6900 0.0 -
12.7057 6950 0.0 -
12.7971 7000 0.0 -
12.8885 7050 0.0 -
12.9799 7100 0.0 -
13.0713 7150 0.0 -
13.1627 7200 0.0 -
13.2541 7250 0.0 -
13.3455 7300 0.0 -
13.4369 7350 0.0 -
13.5283 7400 0.0 -
13.6197 7450 0.0 -
13.7112 7500 0.0 -
13.8026 7550 0.0 -
13.8940 7600 0.0 -
13.9854 7650 0.0 -
14.0768 7700 0.0 -
14.1682 7750 0.0 -
14.2596 7800 0.0 -
14.3510 7850 0.0 -
14.4424 7900 0.0 -
14.5338 7950 0.0 -
14.6252 8000 0.0 -
14.7166 8050 0.0 -
14.8080 8100 0.0 -
14.8995 8150 0.0 -
14.9909 8200 0.0 -
15.0823 8250 0.0 -
15.1737 8300 0.0 -
15.2651 8350 0.0 -
15.3565 8400 0.0 -
15.4479 8450 0.0 -
15.5393 8500 0.0 -
15.6307 8550 0.0 -
15.7221 8600 0.0 -
15.8135 8650 0.0 -
15.9049 8700 0.0 -
15.9963 8750 0.0 -
16.0878 8800 0.0 -
16.1792 8850 0.0 -
16.2706 8900 0.0 -
16.3620 8950 0.0 -
16.4534 9000 0.0 -
16.5448 9050 0.0 -
16.6362 9100 0.0 -
16.7276 9150 0.0 -
16.8190 9200 0.0 -
16.9104 9250 0.0 -
17.0018 9300 0.0 -
17.0932 9350 0.0 -
17.1846 9400 0.0 -
17.2761 9450 0.0 -
17.3675 9500 0.0 -
17.4589 9550 0.0 -
17.5503 9600 0.0 -
17.6417 9650 0.0 -
17.7331 9700 0.0 -
17.8245 9750 0.0 -
17.9159 9800 0.0 -
18.0073 9850 0.0 -
18.0987 9900 0.0 -
18.1901 9950 0.0 -
18.2815 10000 0.0 -
18.3729 10050 0.0 -
18.4644 10100 0.0 -
18.5558 10150 0.0 -
18.6472 10200 0.0 -
18.7386 10250 0.0 -
18.8300 10300 0.0 -
18.9214 10350 0.0 -
19.0128 10400 0.0 -
19.1042 10450 0.0 -
19.1956 10500 0.0 -
19.2870 10550 0.0 -
19.3784 10600 0.0 -
19.4698 10650 0.0 -
19.5612 10700 0.0 -
19.6527 10750 0.0 -
19.7441 10800 0.0 -
19.8355 10850 0.0 -
19.9269 10900 0.0 -
20.0183 10950 0.0 -
20.1097 11000 0.0 -
20.2011 11050 0.0 -
20.2925 11100 0.0 -
20.3839 11150 0.0 -
20.4753 11200 0.0 -
20.5667 11250 0.0 -
20.6581 11300 0.0 -
20.7495 11350 0.0 -
20.8410 11400 0.0 -
20.9324 11450 0.0 -
21.0238 11500 0.0 -
21.1152 11550 0.0 -
21.2066 11600 0.0 -
21.2980 11650 0.0 -
21.3894 11700 0.0 -
21.4808 11750 0.0 -
21.5722 11800 0.0 -
21.6636 11850 0.0 -
21.7550 11900 0.0 -
21.8464 11950 0.0 -
21.9378 12000 0.0 -
22.0293 12050 0.0 -
22.1207 12100 0.0 -
22.2121 12150 0.0 -
22.3035 12200 0.0 -
22.3949 12250 0.0 -
22.4863 12300 0.0 -
22.5777 12350 0.0 -
22.6691 12400 0.0 -
22.7605 12450 0.0 -
22.8519 12500 0.0 -
22.9433 12550 0.0 -
23.0347 12600 0.0 -
23.1261 12650 0.0 -
23.2176 12700 0.0 -
23.3090 12750 0.0 -
23.4004 12800 0.0 -
23.4918 12850 0.0 -
23.5832 12900 0.0 -
23.6746 12950 0.0 -
23.7660 13000 0.0 -
23.8574 13050 0.0 -
23.9488 13100 0.0 -
24.0402 13150 0.0 -
24.1316 13200 0.0 -
24.2230 13250 0.0 -
24.3144 13300 0.0 -
24.4059 13350 0.0 -
24.4973 13400 0.0 -
24.5887 13450 0.0 -
24.6801 13500 0.0 -
24.7715 13550 0.0 -
24.8629 13600 0.0 -
24.9543 13650 0.0 -
25.0457 13700 0.0 -
25.1371 13750 0.0 -
25.2285 13800 0.0 -
25.3199 13850 0.0 -
25.4113 13900 0.0 -
25.5027 13950 0.0 -
25.5941 14000 0.0 -
25.6856 14050 0.0 -
25.7770 14100 0.0 -
25.8684 14150 0.0 -
25.9598 14200 0.0 -
26.0512 14250 0.0 -
26.1426 14300 0.0 -
26.2340 14350 0.0 -
26.3254 14400 0.0 -
26.4168 14450 0.0 -
26.5082 14500 0.0 -
26.5996 14550 0.0 -
26.6910 14600 0.0 -
26.7824 14650 0.0 -
26.8739 14700 0.0 -
26.9653 14750 0.0 -
27.0567 14800 0.0 -
27.1481 14850 0.0 -
27.2395 14900 0.0 -
27.3309 14950 0.0 -
27.4223 15000 0.0 -
27.5137 15050 0.0 -
27.6051 15100 0.0 -
27.6965 15150 0.0 -
27.7879 15200 0.0 -
27.8793 15250 0.0 -
27.9707 15300 0.0 -
28.0622 15350 0.0 -
28.1536 15400 0.0 -
28.2450 15450 0.0 -
28.3364 15500 0.0 -
28.4278 15550 0.0 -
28.5192 15600 0.0 -
28.6106 15650 0.0 -
28.7020 15700 0.0 -
28.7934 15750 0.0 -
28.8848 15800 0.0 -
28.9762 15850 0.0 -
29.0676 15900 0.0 -
29.1590 15950 0.0 -
29.2505 16000 0.0 -
29.3419 16050 0.0 -
29.4333 16100 0.0 -
29.5247 16150 0.0 -
29.6161 16200 0.0 -
29.7075 16250 0.0 -
29.7989 16300 0.0 -
29.8903 16350 0.0 -
29.9817 16400 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.1.0
  • Sentence Transformers: 3.3.1
  • Transformers: 4.44.2
  • PyTorch: 2.2.0a0+81ea7a4
  • Datasets: 3.2.0
  • Tokenizers: 0.19.1
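
To approximate this environment, the listed package versions can be pinned with pip; this is a suggestion, not an official requirements file. PyTorch 2.2.0a0+81ea7a4 is a container build, so a nearby stable 2.2.x release is the closest pip-installable match.

pip install "setfit==1.1.0" "sentence-transformers==3.3.1" "transformers==4.44.2" "datasets==3.2.0" "tokenizers==0.19.1"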

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}