---
library_name: peft
datasets:
- glue
---
# Model Card for Model ID
This model is a PEFT (LoRA) fine-tuned version of `roberta-large` on the MRPC task of the GLUE benchmark.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This model is fine-tuned on the MRPC (paraphrase detection) task of the GLUE benchmark, in which the model compares two sentences and predicts a label indicating whether they are semantically equivalent. A dataset example is shown below.
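An illustrative sketch of what an MRPC record looks like (the field names match the GLUE `mrpc` config in the `datasets` library; the sentence text itself is invented for illustration):

```python
# Illustrative MRPC-style record. Field names follow the GLUE "mrpc" config
# of the `datasets` library; the sentences are made up for this example.
example = {
    "sentence1": "The company reported strong quarterly earnings on Monday.",
    "sentence2": "On Monday, the firm announced robust results for the quarter.",
    "label": 1,  # 1 = equivalent (paraphrase), 0 = not equivalent
    "idx": 0,
}

# The task: given sentence1 and sentence2, predict `label`.
print(example["label"])
```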

On the evaluation set, the model achieves an accuracy of 86.6% and an F1 score of 90%.
Similar fine-tuning and evaluation can be done on the other tasks of the GLUE benchmark by loading the corresponding config files or defining an appropriate LoRA config, for example:
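A minimal sketch of such a setup with the `peft` library. The hyperparameter values below are illustrative defaults, not necessarily those used to train this checkpoint:

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSequenceClassification

# Illustrative LoRA hyperparameters -- adjust per GLUE task.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,  # sequence classification (e.g. MRPC)
    r=8,                         # low-rank dimension
    lora_alpha=16,               # scaling factor
    lora_dropout=0.1,
)

# Wrap the base model so only the LoRA adapter weights are trained.
base_model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-large", num_labels=2
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```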
- **Developed by:** PEFT library example
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** roberta-large