|
--- |
|
library_name: transformers |
|
language: |
|
- en |
|
metrics: |
|
- f1 |
|
- roc_auc |
|
- accuracy |
|
base_model: |
|
- distilbert/distilbert-base-uncased |
|
pipeline_tag: text-classification |
|
--- |
|
|
|
# Dacon Malicious URL Classification AI Competition
|
|
|
[Competition page](https://dacon.io/competitions/official/236451/overview/description)
|
|
|
|
|
|
|
|
Base model: `distilbert/distilbert-base-uncased`
|
|
|
Tuned hyperparameters:
|
|
|
```text
batch_size    : 64
epochs        : 5
learning_rate : 2e-5
weight_decay  : 0.01
```
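These hyperparameters map directly onto Hugging Face `TrainingArguments`. A minimal sketch, assuming a recent `transformers` version; the output directory and the 200-step evaluation/logging interval (inferred from the results table below) are assumptions, not part of the original card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./distilbert-malicious-url",  # hypothetical path
    per_device_train_batch_size=64,           # batch_size from the card
    num_train_epochs=5,                       # epochs from the card
    learning_rate=2e-5,
    weight_decay=0.01,
    eval_strategy="steps",                    # evaluate every eval_steps
    eval_steps=200,                           # assumed from the 200-step table
    logging_steps=200,
)
```

This object would then be passed to a `Trainer` together with the DistilBERT model and the tokenized URL dataset.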
|
|
|
|
|
# Model Training Results |
|
|
|
| Step | Training Loss | Validation Loss | Accuracy | F1 Score | |
|
|------|--------------|----------------|----------|----------| |
|
| 200 | 0.024300 | 0.453651 | 0.929750 | 0.928860 | |
|
| 400 | 0.016700 | 0.543543 | 0.917750 | 0.916868 | |
|
| 600 | 0.007600 | 0.604909 | 0.921250 | 0.921414 | |
|
| 800 | 0.008500 | 0.588405 | 0.922750 | 0.922533 | |
|
| 1000 | 0.007300 | 0.618596 | 0.925500 | 0.925138 | |
|
| 1200 | 0.006300 | 0.628956 | 0.923250 | 0.922644 | |
|
|
|
|
|
---- |
|
|
|
As of 2025-02-17:
|
|
|
|
|
|
|
|
- roc_auc_score = 0.94
|
- `Dacon` leaderboard rank: 24th
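The three reported metrics (accuracy, F1, ROC-AUC) can be computed from predictions in pure Python; a minimal sketch with toy labels and scores that are illustrative only, not competition data:

```python
def binary_metrics(y_true, y_pred):
    """Accuracy and F1 for binary labels (0 = benign, 1 = malicious)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1

def roc_auc(y_true, scores):
    """ROC-AUC: probability a random positive outscores a random negative."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example (illustrative only)
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
scores = [0.9, 0.2, 0.4, 0.8, 0.1, 0.7, 0.6, 0.3]
acc, f1 = binary_metrics(y_true, y_pred)
auc = roc_auc(y_true, scores)
```

In practice `sklearn.metrics` (`accuracy_score`, `f1_score`, `roc_auc_score`) gives the same values; the hand-rolled versions above just make the definitions explicit.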