---
library_name: transformers
tags: []
---
# Labor Saving Stated Aim Classifier
This is a roberta-base model fine-tuned to classify whether the explicit stated aims extracted from a British [historical patent](https://huggingface.co/datasets/matthewleechen/300YearsOfBritishPatents) include a labor-saving objective.

Labels were manually generated and then checked against Gemini 2.0 Flash using the attached [prompt](https://huggingface.co/matthewleechen/labor-saving_stated_aim_classifier/blob/main/labor-saving_prompt.txt).
Hyperparameters:
- learning rate = 6e-5
- batch size = 128
Test set results:
```text
{'eval_loss': 0.28036537766456604,
'eval_accuracy': 0.9,
'eval_precision': 0.9,
'eval_recall': 0.9,
'eval_f1': 0.9,
'eval_runtime': 0.4135,
'eval_samples_per_second': 241.832,
'eval_steps_per_second': 2.418}
```
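
The classifier can be loaded through the standard `transformers` text-classification pipeline. A minimal sketch follows; the example input string and the exact label names returned (e.g. `LABEL_0`/`LABEL_1` unless a custom `id2label` mapping was saved with the model) are assumptions, not taken from the model card.

```python
from transformers import pipeline

# Load the fine-tuned roberta-base classifier from the Hub.
classifier = pipeline(
    "text-classification",
    model="matthewleechen/labor-saving_stated_aim_classifier",
)

# Hypothetical example of a stated aim extracted from a patent.
aims = "to reduce the manual labour required in spinning cotton"
result = classifier(aims)
print(result)  # a list of {'label': ..., 'score': ...} dicts
```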