Meshwa committed
Commit c430890
1 Parent(s): 81a33c1

Update README.md

Files changed (1): README.md +29 -0
README.md CHANGED
@@ -91,3 +91,32 @@ If you have any questions or need further assistance, please feel free to reach
  | train | 0.000917 | 0.999567 | 0.999280 | 0.999474 | 0.999089 |
  | val | 0.001835 | 0.999478 | 0.999172 | 0.999349 | 0.998999 |
  | test | 0.002225 | 0.999369 | 0.999051 | 0.999203 | 0.998901 |
+
+
+ ## BibTeX entry and citation info for the original `distilbert-base-uncased` model
+ ```
+ @article{Sanh2019DistilBERTAD,
+   title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
+   author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
+   journal={ArXiv},
+   year={2019},
+   volume={abs/1910.01108}
+ }
+ ```
+
+ ## Citation
+
+ If you use the fine-tuned DistilBERT model provided in this repository, please cite it as follows:
+
+ ```
+ @article{Meshwa2023,
+   title={Fine-Tuned DistilBERT for Text Classification},
+   author={Meshwa},
+   journal={Hugging Face Model Hub},
+   year={2023},
+   url={https://huggingface.co/Meshwa/Distill-Bert-Automation-Command-Classification}
+ }
+ ```
+
+ Meshwa. (2023). Fine-Tuned DistilBERT for Text Classification. Hugging Face Model Hub. URL: [https://huggingface.co/Meshwa/Distill-Bert-Automation-Command-Classification](https://huggingface.co/Meshwa/Distill-Bert-Automation-Command-Classification)
+
+ If you find this model useful in your work, an acknowledgment is appreciated.
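+
+ ## Usage
+
+ A minimal sketch of loading this checkpoint with the `transformers` text-classification pipeline; the example command string is illustrative only, and the output labels depend on the fine-tuned classification head:
+
+ ```python
+ from transformers import pipeline
+
+ # Load the fine-tuned DistilBERT checkpoint from the Hugging Face Hub.
+ classifier = pipeline(
+     "text-classification",
+     model="Meshwa/Distill-Bert-Automation-Command-Classification",
+ )
+
+ # Classify an example automation command (illustrative input).
+ print(classifier("turn off the living room lights"))
+ ```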