---
library_name: setfit
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
base_model: sentence-transformers/paraphrase-MiniLM-L6-v2
metrics:
- accuracy
widget:
- text: Serious Games Einführung in die Thematik Serious Games Grundlagen Anwendungsgebiete
und Trends Die Einzelthemen umfassen unter anderem Einführung in Serious Games
Game Development Game Design Game Technology Tools und Engines Personalisierung
und Adaption Interactive Digital Storytelling Authoring und Content Generation
Multiplayer Games Game Interfaces und Sensor Technology Effects Affects und User
Experience Mobile Games Serious Games Anwendungsbereiche und Beispiele Die Übungen
enthalten Theorie und Praxisanteile Dabei wird die Verwendung einer Game Engine
gelehrt.
- text: Aerobotics Seminar Einführung in die Aufgabenstellung die vorhandene Infrastruktur
und den zu durchlaufenden Entwicklungsprozess Entwurf und Implementierung von
Algorithmen zur Flugregelung in Gruppenarbeit Diskussion des Fortschritts in regelmäßigen
Flugdemonstration Abschließende Präsentation und Dokumentation
- text: "Seminar Intraoperative Imaging and Machine Learning For many applications\
\ techniques like deep learning allow for considerably faster algorithm development\
\ and allow to automate tasks that were performed manually in the past In medical\
\ imaging a large variety of tasks that interfere with clinical workflows has\
\ the potential for automation However at the same time new challenges arise like\
\ data privacy regulations and ethics concerns In this seminar we want to develop\
\ an application that allows for the automation of an based intraoperative planning\
\ or measurement procedure from a holistic perspective To this end we will invite\
\ a surgeon to explain the medical background and visit the operating room to\
\ understand the surgeons' needs while performing the task Having understood\
\ the underlying medical problem we will look into topics of data privacy code\
\ of ethics prototype development and UI design for surgeons Furthermore we will\
\ touch regulatory requirements necessary for releasing software to clinics At\
\ the end of the seminar the students will have developed and documented a prototypical\
\ application for the indented intraoperative use case Students will be able to\
\ visit an operation room following the rules of such an environment perform their\
\ own literature research on a given subject independently research this subject\
\ according to data privacy and ethical standard present and introduce the subject\
\ to their student peers give a scientific talk in English according to international\
\ conference standards describe their results in a scientific report"
- text: Plattformen und Systeme für eLearning Platforms and Systems for eLearning
Mit dieser Vorlesung wird eine Übersicht über technische Systeme und Plattformen
im Bereich des eLearning gegeben insbesondere über Learning Management Systeme
LMS Prüfungssysteme bis hin zu Campus Management Systemen Neben der Struktur und
dem Einsatz werden auch Austauschformate sowie Individuallösungen für digitale
Lernszenarien vorgestellt Neben den reinen funktionalen Softwareanforderung und
deren Realisierungen werden insbesondere auch die Anforderungen aus Sicht der
Lehrenden und Studierenden behandelt Die Benutzungsoberflächen der verwendeten
Systeme müssen dafür eine gute User Experience aufweisen welche durch Methoden
der messbar werden Diese werden mit dem Fokus auf didaktische Szenarien behandelt
Grundsätzlich müssen im Lehr Lernkontext personenbezogene Daten benutzt werden
damit ggf diverse Analysen durchgeführt werden können Diese bilden die Grundlage
für die Learning Analytics Die Anforderungen des Datenschutzes sind zu berücksichtigen
Neben einer theoretischen Übersicht werden anhand aktueller Systeme verschiedene
didaktische Szenarien umgesetzt und nach technischen Kriterien analysiert Innerhalb
der Übung werden dafür einzelne Beispiele mit einem aktuellen System vorgestellt
und auf Herausforderungen eingegangen Diese werden mit aktuellen Forschungsergebnissen
verglichen und kritisch diskutiert In den Übungen sind Hausübungen oder Kleinprojekte
in Teams zu bearbeiten und in den Übungsgruppen zu präsentieren und die Lösungen
zu verteidigen.
- text: Seminar Internet Technology Das Seminar behandelt aktuelle Themen der Systems
industrielle Kommunikation konfigurierbare Netze Clouds Sicherheit und Privatsphäre
sowie Modellierung Evaluierung und Verifikation von Kommunikationssystemen und
protokollen.
pipeline_tag: text-classification
inference: false
---
# SetFit with sentence-transformers/paraphrase-MiniLM-L6-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/paraphrase-MiniLM-L6-v2) as the Sentence Transformer embedding model. A OneVsRestClassifier instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
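The following is a minimal, hedged sketch of how a multi-label model of this kind can be trained with the SetFit `Trainer`. The toy dataset, its two-class multi-hot labels, and the training arguments shown here are illustrative assumptions, not the actual training data or script behind this checkpoint.
```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical toy dataset: each example carries a multi-hot label vector,
# matching the multi-label (one-vs-rest) setup described in this card.
train_dataset = Dataset.from_dict({
    "text": [
        "Seminar Internet Technology Das Seminar behandelt aktuelle Themen ...",
        "Serious Games Einführung in die Thematik ...",
    ],
    "label": [[1, 0], [0, 1]],
})

# Start from the same Sentence Transformer body and request a
# OneVsRestClassifier head via the multi-target strategy.
model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-MiniLM-L6-v2",
    multi_target_strategy="one-vs-rest",
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=16, num_epochs=1),
    train_dataset=train_dataset,
)
trainer.train()
```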
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/paraphrase-MiniLM-L6-v2)
- **Classification head:** a OneVsRestClassifier instance
- **Maximum Sequence Length:** 128 tokens
<!-- - **Number of Classes:** Unknown -->
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("Chernoffface/fs-setfit-multilable-model")
# Run inference
preds = model("Seminar Internet Technology Das Seminar behandelt aktuelle Themen der Systems industrielle Kommunikation konfigurierbare Netze Clouds Sicherheit und Privatsphäre sowie Modellierung Evaluierung und Verifikation von Kommunikationssystemen und protokollen.")
```
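Since the classification head is a OneVsRestClassifier, `model(...)` returns a multi-hot vector with one entry per label. The label names are not recorded in this card, so the following is only a hedged sketch of inspecting per-label scores with `predict_proba`:
```python
# `preds` from the snippet above is a multi-hot vector (one indicator per class).
# `predict_proba` exposes the underlying per-class scores; label names are not
# stored in this model card, so only class indices can be shown here.
probs = model.predict_proba([
    "Seminar Internet Technology Das Seminar behandelt aktuelle Themen der Systems "
    "industrielle Kommunikation konfigurierbare Netze Clouds Sicherheit und Privatsphäre "
    "sowie Modellierung Evaluierung und Verifikation von Kommunikationssystemen und protokollen."
])
print(preds)
print(probs)
```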
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:---------|:----|
| Word count | 3 | 131.6738 | 514 |
### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
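For reference, the hyperparameters above map onto `setfit.TrainingArguments` roughly as follows. This is a hedged reconstruction, not the exact training script: the dataset wiring is omitted, and `distance_metric` is left at its cosine-distance default, which matches the value listed above.
```python
from setfit import TrainingArguments
from sentence_transformers.losses import CosineSimilarityLoss

# Assumed reconstruction of the listed hyperparameters; pass as `args=` to the
# setfit Trainer shown earlier in this card.
args = TrainingArguments(
    batch_size=(16, 16),
    num_epochs=(1, 1),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-5, 2e-5),
    head_learning_rate=2e-5,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)
```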
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0014 | 1 | 0.3334 | - |
| 0.0716 | 50 | 0.2411 | - |
| 0.1433 | 100 | 0.2124 | - |
| 0.2149 | 150 | 0.186 | - |
| 0.2865 | 200 | 0.1806 | - |
| 0.3582 | 250 | 0.1759 | - |
| 0.4298 | 300 | 0.1705 | - |
| 0.5014 | 350 | 0.1542 | - |
| 0.5731 | 400 | 0.1559 | - |
| 0.6447 | 450 | 0.1524 | - |
| 0.7163 | 500 | 0.1438 | - |
| 0.7880 | 550 | 0.1507 | - |
| 0.8596 | 600 | 0.14 | - |
| 0.9312 | 650 | 0.1466 | - |
| 0.0006 | 1 | 0.1157 | - |
| 0.0287 | 50 | 0.1266 | - |
| 0.0573 | 100 | 0.1325 | - |
| 0.0860 | 150 | 0.1237 | - |
| 0.1147 | 200 | 0.12 | - |
| 0.1433 | 250 | 0.1189 | - |
| 0.1720 | 300 | 0.1094 | - |
| 0.2007 | 350 | 0.1028 | - |
| 0.2294 | 400 | 0.0993 | - |
| 0.2580 | 450 | 0.1003 | - |
| 0.2867 | 500 | 0.0898 | - |
| 0.3154 | 550 | 0.0875 | - |
| 0.3440 | 600 | 0.0847 | - |
| 0.3727 | 650 | 0.0879 | - |
| 0.4014 | 700 | 0.0801 | - |
| 0.4300 | 750 | 0.0754 | - |
| 0.4587 | 800 | 0.0791 | - |
| 0.4874 | 850 | 0.0715 | - |
| 0.5161 | 900 | 0.0781 | - |
| 0.5447 | 950 | 0.0765 | - |
| 0.5734 | 1000 | 0.0718 | - |
| 0.6021 | 1050 | 0.0786 | - |
| 0.6307 | 1100 | 0.073 | - |
| 0.6594 | 1150 | 0.0705 | - |
| 0.6881 | 1200 | 0.072 | - |
| 0.7167 | 1250 | 0.0673 | - |
| 0.7454 | 1300 | 0.066 | - |
| 0.7741 | 1350 | 0.0671 | - |
| 0.8028 | 1400 | 0.0631 | - |
| 0.8314 | 1450 | 0.0673 | - |
| 0.8601 | 1500 | 0.0638 | - |
| 0.8888 | 1550 | 0.0674 | - |
| 0.9174 | 1600 | 0.0613 | - |
| 0.9461 | 1650 | 0.063 | - |
| 0.9748 | 1700 | 0.0682 | - |
| 0.0014 | 1 | 0.0497 | - |
| 0.0716 | 50 | 0.0584 | - |
| 0.1433 | 100 | 0.0663 | - |
| 0.2149 | 150 | 0.0682 | - |
| 0.2865 | 200 | 0.0616 | - |
| 0.3582 | 250 | 0.0657 | - |
| 0.4298 | 300 | 0.0593 | - |
| 0.5014 | 350 | 0.0593 | - |
| 0.5731 | 400 | 0.0565 | - |
| 0.6447 | 450 | 0.0595 | - |
| 0.7163 | 500 | 0.0589 | - |
| 0.7880 | 550 | 0.0649 | - |
| 0.8596 | 600 | 0.0554 | - |
| 0.9312 | 650 | 0.0601 | - |
### Framework Versions
- Python: 3.12.3
- SetFit: 1.1.0
- Sentence Transformers: 3.0.0
- Transformers: 4.43.1
- PyTorch: 2.3.1+cu121
- Datasets: 2.20.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->