base_model: sentence-transformers/all-mpnet-base-v2
library_name: setfit
metrics:
  - accuracy
pipeline_tag: text-classification
tags:
  - setfit
  - sentence-transformers
  - text-classification
  - generated_from_setfit_trainer
widget:
  - text: >
      As soon as I saw my human pull out my favorite red leash, my tail started
      wagging and I started barking enthusiastically. I had been waiting all day
      to go to my favorite place in the whole world, outside. When my human
      clipped my leash to my collar, I felt my heart sing with joy. Finally! As
      soon as I stepped out the door, I felt the cool autumn breeze wash over me
      like a wave. The squirrels scattered inside of the trees and the birds
      whistled in harmonies that floated through the autumn breeze. The trees
      were a deep, rich emerald color that drifted me off into a new universe.
      That is what I loved about the outside, it was always so peaceful and
      serene. We walked a few blocks admiring nature's beauty until we suddenly
      halted to a stop. I looked around but found nothing that looked out of the
      ordinary. My human opened the car door and placed me in the back seat. My
      heart started to beat so fast I thought it would burst out of my chest and
      my mind was racing. Where are we going? Millions of dreadful thoughts
      popped into my brain. By the time we arrived, my fur was soaked with
      sweat. As soon as I walked out the door, I stopped in my tracks. In front
      of me was a place I can only describe as paradise. Behind the white gate,
      there were clusters of dogs and rubber balls crowding the green grass.
      What more could a dog ever dream of? My heart sang with joy as I stepped
      through the gate. I knew then that this would be the best day of my life.
  - text: >
      Rock bottom interest rates and easy money, maybe. But many of these truly
      tech companies like Microsoft, Apple, Facebook and so on have huge cash
      reserves.  I live in Gatesville, Seattle, and I will offer another
      explanation or at least a contributing factor. A senior software engineer
      at Microsoft makes anywhere from a new hire at $250K per year with gold
      plated benefits up to $500K per year for someone with a few years under
      their belt.  Microsoft hires numerous "independent contractors" at half or
      less than what they pay full time employees also with substantially lesser
      benefits who work from home. Look for them to increase their base of
      independent contractors as long as the government lets them get away with
      it.
  - text: >
      “Amid this dynamic environment, we delivered record results in fiscal year
      2022: We reported $198 billion in revenue and $83 billion in operating
      income. And the Microsoft Cloud surpassed $100 billion in annualized
      revenue for the first time.”- From Microsoft’s 2022 Annual Report
      Shareholder’s Letter
  - text: >
      Paresh Y Murudkar Hypothesis: Google wants it leaked.  OpenAI has by being
      public acquired huge amount of attention.  Although Google will likely
      achieve partity with OpenAI shortly, their immediate danger is to become
      the default definition of the technology.  Microsoft found out years ago
      that even though Bing had reached technical parity with Google, the public
      had been convinced to search for something was to "Google It.'Thus, Google
      has to ghet out there with its own stuff, before the "GPT It" because the
      next generation term for search.
  - text: >
      Mor -- You sound like someone who has never experienced real hardship.
      Your idea that homelessness is a "lifestyle", as if it were freely chosen,
      suggests you have never been there. Try to imagine this: Your employer has
      a big layoff, and with two week's severance, you lose your job. For a
      while, you get by on unemployment and your spouse's part-time income. But
      then unemployment runs out because your industry has tanked in your state.
      You search fruitlessly for a job, and begin to get really depressed. Your
      spouse is diagnosed with cancer, and to pay for their treatment, you sell
      your modest home and move in with your brother-in-law and his family,
      living in their basement, sharing their one bathroom. Your teenage child
      who has been uprooted to a new town and school starts taking drugs and
      acting out, getting arrested, coming home really late, making a lot of
      noise, being very depressed and angry at everyone. The brother-in-law says
      his sister with cancer can stay but your teen cannot. You two move into
      another relative's basement, but that doesn't last long. Your teen
      disappears, leaves a note "I can't stand it anymore. Sorry, love you,
      gotta go." You run out of your last cash sending it to help your wife. The
      relative can't afford to feed you. You end up on the street. Open your
      mind.
inference: true
model-index:
  - name: SetFit with sentence-transformers/all-mpnet-base-v2
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: Unknown
          type: unknown
          split: test
        metrics:
          - type: accuracy
            value: 1
            name: Accuracy

SetFit with sentence-transformers/all-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. This SetFit model uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
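The contrastive step above works by pairing up the labeled sentences: pairs that share a label become positives, pairs with different labels become negatives, and the Sentence Transformer body is fine-tuned so that positives embed closer together than negatives. The following is a minimal sketch of that pair-generation idea only (it is not the library's internal code, and the example texts are placeholders):

```python
from itertools import combinations

# Sketch of contrastive pair generation: sentences with the same label form
# positive pairs (target similarity 1.0), different labels form negative
# pairs (target similarity 0.0). SetFit feeds such pairs to a loss like
# CosineSimilarityLoss when fine-tuning the embedding body.
def make_pairs(examples):
    pairs = []
    for (text_a, label_a), (text_b, label_b) in combinations(examples, 2):
        target = 1.0 if label_a == label_b else 0.0
        pairs.append((text_a, text_b, target))
    return pairs

examples = [
    ("gas stoves are a clean burning fuel", "no"),
    ("microsoft invests billions in openai", "yes"),
    ("chatgpt will change search", "yes"),
]
make_pairs(examples)  # 3 pairs; only the last ("yes"/"yes") is positive
```

With `num_iterations: 20` (see Training Hyperparameters below), the library repeats this sampling to produce many such pairs per labeled example rather than enumerating all combinations once.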

Model Details

Model Description

Model Sources

Model Labels

Label Examples
yes
  • 'MS: Invests $10B into ChatGPT and then immediately lays off 10,000 workers to pay for it.\n'
  • 'Skepticism aside, it's way too late to stop or even realistically control A.I. The genie is literally out of the bottle, with more sophisticated iterations of A.I. to come. There's too much financial momentum behind it. OpenAI, the research lab behind the viral ChatGPT chatbot, is in talks to sell existing shares in a tender offer that would value the company at around $29 billion, making it one of the most valuable U.S. startups on paper. Microsoft Corp. has also been in advanced talks to increase its investment in OpenAI. In 2019, Microsoft invested $1 billion in OpenAI and became its preferred partner for commercializing new technologies for services like the search engine Bing and the design app Microsoft Design. Other backers include Tesla CEO Elon Musk, LinkedIn co-founder Reid Hoffman. There are over 100 AI companies developing various Machine learning tasks, new features coming daily. ChatGPT is a genuine productivity boost and a technological wonder. It can write code in Python, TypeScript, and many other languages at my command. It does have bugs in the code, but they are fixable. The possibilities are endless. I can't imagine what version 2.0 or 3.0 would look like. For better and/or worse, this is the future. It is incredible, even at this early stage. This technology is mind-blowing and will unquestionably change the world. As Victor Hugo said, " A force more powerful than all of the armies in the world is an idea whose time has come." Indeed it has.\n'
  • 'Microsoft Bets Big on the Creator of ChatGPT in Race to Dominate A.I. As a new chatbot wows the world with its conversational talents, a resurgent tech giant is poised to reap the benefits while doubling down on a relationship with the start-up OpenAI. When a chatbot called ChatGPT hit the internet late last year, executives at a number of Silicon Valley companies worried they were suddenly dealing with new artificial intelligence technology that could disrupt their businesses. As a new chatbot wows the world with its conversational talents, a resurgent tech giant is poised to reap the benefits while doubling down on a relationship with the start-up OpenAI.\n'
no
  • "The tragedy of this war, any war, is overwhelming. A city of 100,000 reduced to ruble and the smell of corpses. One can easily imagine all the families who went about their lives prior to the invasion. Schools ringing with children sounds. Shops and eateries filled with patrons, exchanging smiles, saying hello, friends getting together. Homes secure, places of family warmth, humor, love. All gone. Gone in this lifetime. Gone in the blink of a mad man's perverted notion of his needs. We have our mad men and women too - in our congress. We just saw their shameful show. Just the appetizer for a lousy meal to come. In response to the brave Ukrainians who resist, who fight and die, will the mad ones in the new congress stand for freedom or turn away?Will they do as the French did 250 years ago when they came to our aid against a king or will they allow King Putin to have his way?Americans have freedom in their blood. Make that blood boil if this congress forgets that and turns its back on the fight against a king.\n"
  • 'The dangers of gas stoves are found in only a few studies funded by anti-fossil fuel groups. Anyone who distrusts studies by Exxon, big pharma, big tobacco, should be skeptical of these as well."The science" (tm) does not support these studies that proport to say that gas stoves are a specific problem. NO(x) forms at 2800 F under high pressure, and typically from Nitrogen in the fuel, not the air, where it is relatively stable, being bound to another Nitrogen as N2. Natural gas does not contain Nitrogen, and cooktops do not operate at high pressure. Likewise, natural gas, burning in excess air (open flame) does not produce significant CO. It is indeed a clean burning fuel.Cooking does release particulates and gasses, smoke and smells, but that does not depend on how the food is heated. Cooking bacon smells the same on electric or gas or charcoal or wood (may actually smell better on wood and charcoal) or dung (well maybe not dung).\n'
  • 'When my electricity goes down due to winter storms, I still have hot water for showers, a place to cook food and heat all via my gas water heater, gas fireplace and gas cooktop. Easy to ignite with a match. We can briefly open windows to air out fumes. I’ll never willingly go all electric.\n'

Evaluation

Metrics

Label  Accuracy
all    1.0

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("davidadamczyk/my-awesome-setfit-model")
# Run inference
preds = model("“Amid this dynamic environment, we delivered record results in fiscal year 2022: We reported $198 billion in revenue and $83 billion in operating income. And the Microsoft Cloud surpassed $100 billion in annualized revenue for the first time.”- From Microsoft’s 2022 Annual Report Shareholder’s Letter")
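The call above returns the predicted label directly. If you instead work with per-class probabilities (as `SetFitModel.predict_proba` would produce), mapping them back to the card's "yes"/"no" labels is a simple argmax. A small sketch, where the probability values are made up for illustration:

```python
# Hypothetical post-processing sketch: convert rows of per-class
# probabilities into label strings via argmax. The label order and the
# probability values below are assumptions for illustration only.
def to_labels(probs, labels):
    return [labels[max(range(len(row)), key=row.__getitem__)] for row in probs]

labels = ["no", "yes"]
probs = [[0.08, 0.92], [0.97, 0.03]]
to_labels(probs, labels)  # ['yes', 'no']
```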

Training Details

Training Set Metrics

Training set  Min  Median   Max
Word count    13   132.875  296

Label  Training Sample Count
no     18
yes    22
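The word-count row above is straightforward to reproduce: count whitespace-separated tokens per training text, then take the min, median, and max. A sketch with placeholder texts (the real training set is not included in this card):

```python
from statistics import median

# Compute the Training Set Metrics row: per-text whitespace word counts,
# summarized as (min, median, max). Texts here are placeholders.
def word_count_stats(texts):
    counts = [len(t.split()) for t in texts]
    return min(counts), median(counts), max(counts)

word_count_stats(["a b c", "one two three four", "just two"])  # (2, 3, 4)
```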

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 20
  • body_learning_rate: (2e-05, 2e-05)
  • head_learning_rate: 2e-05
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • l2_weight: 0.01
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False

Training Results

Epoch  Step  Training Loss  Validation Loss
0.01   1     0.3469         -
0.5    50    0.0603         -
1.0    100   0.0011         -

Framework Versions

  • Python: 3.10.13
  • SetFit: 1.1.0
  • Sentence Transformers: 3.0.1
  • Transformers: 4.45.2
  • PyTorch: 2.4.0+cu124
  • Datasets: 2.21.0
  • Tokenizers: 0.20.0

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}