---
base_model: sentence-transformers/all-mpnet-base-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 'Sure! Support it 100 percent. Good opportunity to watch a president follow the law and accept consequences rather that whine and complain like a toddler. '
- text: 'Steve During Prime Minister Ardern''s leadership, the first eighteen months of the pandemic resulted in virtually no cases of Covid or Covid deaths and New Zealand has suffered less than twenty-five hundred deaths from Covid to date. After the deadliest shooting in New Zealand''s history, in her role as the youngest leader ever elected in the country, she mourned with a grief-stricken nation and responded to the crisis by changing the gun laws in seven days. It makes me want to weep thinking of the compassionate and intelligent leadership New Zealand has enjoyed under Prime Minister Ardern. It''s a magnificent place and she is a credit to her country. '
- text: 'I am very happy for her. I think she has made absolutely the right decision. I have been very critical of some of the policies she endorsed although I understood the reasoning behind them. She was a shining beacon in the earlier years but at some point she lost her firm grip on principle and became captive to doctrinaire theories that did not always serve the country despite the best of intentions. Ardern is a very great soul and I don''t doubt that there is an even more brilliant future still ahead of her, one that will allow her to lead on the international stage without compromising her personal principles. Meantime she deserves time to regroup, heal, and spend precious time with her family. Personally I hope Chris Hipkins steps into her shoes although he also has a young family and would have to make similar sacrifices. He has shown himself to be very able and decent, and like Ardern is a master communicator. '
- text: 'I spoke with an elderly gentlemen with a British accent today in the local library here in New Zealand who said he had never voted for Ardern because she had been living in an unmarried relationship and to compound this issue had insulted the Queen by appearing before her while pregnant. A point that keeps being overlooked is that Ardern leaves office not only with record low unemployment but having set in train a major social housing program and removed restrictions that prevented housing intensification. These in time will hopefully reduce both house prices and rents, thus alleviating child poverty. Ardern also dramatically raised the insulation standards for new houses. which will mean that they are warmer and healthierArdern totally replaced the bureaucratic Resource Management Act which had been blamed for nearly 20 years by business and right wing commentators for preventing development. Legislation was also passed that will fund the clean-up of the country’s woeful drinking, stormwater and sewerage systems. Compared with her predecessors John Key and Bill English, Ardern at least tried to deal with many of the country''s long standing issues. While still the popular preferred prime minister leaving now removes herself as a lightning rod for the haters while allowing her successor to drop any upcoming planned legislation that is considered to be controversial. At the same time the successor has 9 months to develop their relationship with voters. '
- text: 'Jeff In some states, felons are not allowed to vote after they''ve completed their sentences. See Florida. Florida wants felons to pay fines after they''ve been released, only in most cases, the government can''t tell the formerly imprisoned how much is owed. '
inference: true
model-index:
- name: SetFit with sentence-transformers/all-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 1.0
      name: Accuracy
---

# SetFit with sentence-transformers/all-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
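Both steps are handled end to end by the `setfit` `Trainer`. The original training data for this model is not published in this card, so the sketch below is only illustrative: the example texts are placeholders, while the base model, the yes/no label set, and the batch size, epoch count, and pair-sampling iterations mirror the values reported under Training Hyperparameters further down.

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot data; the real training set for this model is not included here.
train_dataset = Dataset.from_dict({
    "text": [
        "Comment that matches the target criterion.",
        "Another matching comment.",
        "Comment that does not match the criterion.",
        "Another non-matching comment.",
    ],
    "label": ["yes", "yes", "no", "no"],
})

# Step 1 body: the Sentence Transformer that will be fine-tuned on contrastive pairs.
model = SetFitModel.from_pretrained("sentence-transformers/all-mpnet-base-v2")

args = TrainingArguments(
    batch_size=16,       # reported as (16, 16) in Training Hyperparameters below
    num_epochs=1,        # reported as (1, 1) below
    num_iterations=120,  # contrastive pairs generated per training sample
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()  # contrastive fine-tuning, then fitting the LogisticRegression head

print(model.predict(["An unseen comment."]))
```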
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 384 tokens
- **Number of Classes:** 2 classes
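These properties can be checked directly on a loaded model. The snippet below is a small sanity check: the attribute names (`model_body`, `model_head`, `labels`) come from the `setfit` API, and `model.labels` may be `None` if the label names were not stored with the model.

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("davidadamczyk/setfit-model-5")

print(type(model.model_body).__name__)  # SentenceTransformer body (all-mpnet-base-v2)
print(model.model_body.max_seq_length)  # expected: 384
print(type(model.model_head).__name__)  # expected: LogisticRegression
print(model.labels)                     # the class labels, e.g. ['no', 'yes'] (may be None)
```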
### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| yes   |          |
| no    |          |

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 1.0      |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("davidadamczyk/setfit-model-5")
# Run inference
preds = model("Sure! Support it 100 percent. Good opportunity to watch a president follow the law and accept consequences rather that whine and complain like a toddler. ")
```

## Training Details

### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count   | 16  | 90.75  | 249 |

| Label | Training Sample Count |
|:------|:----------------------|
| no    | 18                    |
| yes   | 22                    |

### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 120
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False

### Training Results
| Epoch  | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0017 | 1    | 0.3081        | -               |
| 0.0833 | 50   | 0.1044        | -               |
| 0.1667 | 100  | 0.001         | -               |
| 0.25   | 150  | 0.0003        | -               |
| 0.3333 | 200  | 0.0002        | -               |
| 0.4167 | 250  | 0.0002        | -               |
| 0.5    | 300  | 0.0001        | -               |
| 0.5833 | 350  | 0.0001        | -               |
| 0.6667 | 400  | 0.0001        | -               |
| 0.75   | 450  | 0.0001        | -               |
| 0.8333 | 500  | 0.0001        | -               |
| 0.9167 | 550  | 0.0001        | -               |
| 1.0    | 600  | 0.0001        | -               |

### Framework Versions
- Python: 3.10.13
- SetFit: 1.1.0
- Sentence Transformers: 3.0.1
- Transformers: 4.45.2
- PyTorch: 2.4.0+cu124
- Datasets: 2.21.0
- Tokenizers: 0.20.0

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```