Update README.md
README.md
CHANGED
@@ -7,16 +7,10 @@ language:
 pipeline_tag: text-classification
 ---
 
-
-
-Included classes;
-
-- Kızdırma/Hakaret (insult)
-- Cinsiyetçilik (sexism)
-- Irkçılık (racism)
-
-3388 tweets were used to train the model; the resulting training metrics are listed below.
 
 | | INSULT | OTHER | PROFANITY | RACIST | SEXIST |
 | ------ | ------ | ------ | ------ | ------ | ------ |
@@ -29,6 +23,15 @@ Included classes;
 - Precision: 0.9570284225256961
 - Accuracy: 0.956
 
 ## Dependency
 pip install torch torchvision torchaudio
 pip install tf-keras
 pipeline_tag: text-classification
 ---
 
+# About the model
+This model is designed for text classification, specifically for identifying offensive content in Turkish text. The model classifies text into five categories: INSULT, OTHER, PROFANITY, RACIST, and SEXIST.
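The five-way classification above can be sketched as follows, assuming the model head emits one raw score per class and the highest softmax probability wins. The scores below are made up for illustration, not real model outputs.

```python
import math

# The five classes listed in this README, in an assumed fixed order.
LABELS = ["INSULT", "OTHER", "PROFANITY", "RACIST", "SEXIST"]

def softmax(logits):
    """Convert raw classifier scores into probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(logits):
    """Return (label, probability) for the highest-scoring class."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

# Made-up logits for illustration only.
label, prob = predict([0.1, 3.2, 0.4, -1.0, 0.2])
print(label)  # OTHER
```

In practice the model would be loaded through a text-classification pipeline rather than scored by hand; this only shows how the five scores map to one label.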
+
+## Model Metrics
 
 | | INSULT | OTHER | PROFANITY | RACIST | SEXIST |
 | ------ | ------ | ------ | ------ | ------ | ------ |
 - Precision: 0.9570284225256961
 - Accuracy: 0.956
 
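The README does not say how the Precision figure was averaged. As a hedged illustration, the toy labels below (invented, not the real evaluation set) show how accuracy and macro-averaged precision would be computed over the five classes.

```python
# Toy data for illustration only; the real evaluation split is not given.
LABELS = ["INSULT", "OTHER", "PROFANITY", "RACIST", "SEXIST"]

y_true = ["INSULT", "OTHER", "OTHER", "RACIST", "SEXIST", "PROFANITY"]
y_pred = ["INSULT", "OTHER", "INSULT", "RACIST", "SEXIST", "PROFANITY"]

# Accuracy: fraction of predictions that match the true label.
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision(label):
    """Precision for one class: correct predictions of `label` / all predictions of `label`."""
    predicted = [t for t, p in zip(y_true, y_pred) if p == label]
    if not predicted:
        return 0.0
    return sum(t == label for t in predicted) / len(predicted)

# Macro averaging (assumed): unweighted mean of per-class precision.
macro_precision = sum(precision(l) for l in LABELS) / len(LABELS)

print(accuracy)         # 5/6 correct
print(macro_precision)  # (0.5 + 1 + 1 + 1 + 1) / 5
```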
+## Training Information
+- Device: macOS 14.5 23F79 arm64 | GPU: Apple M2 Max | Memory: 5840MiB / 32768MiB
+- Training completed in 0:22:54 (hh:mm:ss)
+- Optimizer: AdamW
+- learning_rate: 2e-5
+- eps: 1e-8
+- epochs: 10
+- Batch size: 64
+
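Assuming all 3,388 tweets mentioned in the earlier revision were used for training and the final partial batch is kept, the hyperparameters above imply the following step counts (a back-of-the-envelope sketch, not stated in the README):

```python
import math

# Figures taken from this README (tweet count from the previous revision).
num_examples = 3388   # tweets used for training
batch_size = 64
epochs = 10

# Assumes the last partial batch is not dropped (drop_last=False).
steps_per_epoch = math.ceil(num_examples / batch_size)
total_steps = steps_per_epoch * epochs

print(steps_per_epoch)  # 53
print(total_steps)      # 530
```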
 ## Dependency
 pip install torch torchvision torchaudio
 pip install tf-keras
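A quick post-install sanity check, assuming the installs above succeeded. It only verifies that torch imports and basic tensor math runs:

```python
import torch

# Minimal check that the installed torch build imports and computes.
x = torch.tensor([1.0, 2.0, 3.0])
print(torch.__version__)
print(x.sum().item())  # 6.0
```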