|
--- |
|
license: bigscience-openrail-m |
|
language: |
|
- en |
|
metrics: |
|
- accuracy |
|
library_name: transformers |
|
pipeline_tag: text-classification |
|
tags: |
|
- BERT |
|
- DistilBERT |
|
- text-classification
|
--- |
|
# Elon Musk Detector (EMD) |
|
|
|
<!-- Provide a quick summary of what the model is/does. --> |
|
|
|
Elon Musk Detector uses DistilBERT to classify whether a tweet (or post) was made by Elon Musk or not.
|
|
|
## Model Details |
|
|
|
This model fine-tunes DistilBERT with the `transformers` library to classify text, trained on two datasets.
|
|
|
### Model Description |
|
|
|
<!-- Provide a longer summary of what this model is. --> |
|
|
|
|
|
|
|
- **Developed by:** Kokohachi |
|
- **Model type:** DistilBERT-based text classifier
|
- **Language(s) (NLP):** English |
|
- **License:** BigScience OpenRAIL-M |
|
- **Finetuned from model:** DistilBERT |
|
|
|
## Uses |
|
|
|
This model should be used for academic purposes only, not for commercial purposes.
|
### Direct Use |
|
|
|
`git clone https://huggingface.co/kix-intl/elon-musk-detector` |
|
|
|
`from transformers import DistilBertTokenizer, DistilBertForSequenceClassification`
|
|
|
#### Load the tokenizer |
|
`loaded_tokenizer = DistilBertTokenizer.from_pretrained("kix-intl/elon-musk-detector")`
|
|
|
#### Load the model |
|
`loaded_model = DistilBertForSequenceClassification.from_pretrained("kix-intl/elon-musk-detector")`
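Once loaded, a tweet can be classified by tokenizing it, running the model, and applying a softmax over the output logits. The sketch below shows only the post-processing step, using hypothetical logits in place of a real forward pass; the label order (not-Musk vs. Musk) is an assumption, as the card does not document the label mapping.

```python
import torch

# Hypothetical logits for one tweet, as would be returned by
# model(**tokenizer(text, return_tensors="pt")).logits
# (shape [1, 2]; real values depend on the fine-tuned weights).
logits = torch.tensor([[0.3, 2.1]])

# Convert logits to class probabilities and pick the most likely class.
probs = torch.softmax(logits, dim=-1)
predicted_class = int(probs.argmax(dim=-1))

# Assumed label mapping (not stated in this card): 0 = not Musk, 1 = Musk.
label = "Elon Musk" if predicted_class == 1 else "Not Elon Musk"
```

With a real input, replace the hypothetical `logits` tensor with the output of the loaded model on the tokenized tweet.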
|
|
|
|
|
## Training Details |
|
|
|
### Training Data |
|
#### Elon Musk Tweets |
|
https://www.kaggle.com/datasets/gpreda/elon-musk-tweets |
|
#### Twitter Tweets Sentiment Dataset |
|
https://www.kaggle.com/datasets/yasserh/twitter-tweets-sentiment-dataset |
|
|
|
### Training Procedure |
|
https://huggingface.co/kix-intl/elon-musk-detector/blob/main/EMD.ipynb |
|
|
|
## Model Card Contact |
|
|
|
[email protected] |