# Reddit Explicit Text Classifier
[![CI](https://github.com/YZhu0225/reddit_text_classification/actions/workflows/main.yml/badge.svg)](https://github.com/YZhu0225/reddit_text_classification/actions/workflows/main.yml) [![Sync to Hugging Face Hub](https://github.com/YZhu0225/reddit_text_classification/actions/workflows/sync_to_hugging_face_hub.yml/badge.svg)](https://github.com/YZhu0225/reddit_text_classification/actions/workflows/sync_to_hugging_face_hub.yml)
In this project, we created a text classifier as a Hugging Face Spaces app with a Gradio interface that flags not-safe-for-work (NSFW) content, i.e., text that is inappropriate or unprofessional. We used a pre-trained DistilBERT transformer model, fine-tuned on Reddit posts, to predict two classes: NSFW and safe for work (SFW).
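The two-class prediction step can be sketched as follows. This is a minimal illustration, not the project's actual inference code; the label order (index 0 = SFW, index 1 = NSFW) is an assumption and in practice comes from the fine-tuned model's `id2label` config.

```python
import math

# Assumed label mapping for the fine-tuned DistilBERT classification head.
# The real order depends on the model's id2label configuration.
ID2LABEL = {0: "SFW", 1: "NSFW"}

def softmax(logits):
    """Convert raw classifier logits into probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Return the higher-probability class and its probability."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return ID2LABEL[idx], probs[idx]
```

In the deployed app, the logits come from the fine-tuned DistilBERT model and the resulting label is shown through the Gradio interface.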
### Get Reddit data