AdamCodd committed on
Commit
6825d86
1 Parent(s): e058338

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -33,7 +33,7 @@ This model is a fine-tuned version of [bert-tiny](prajjwal1/bert-tiny) on [amazo
 ## Model description
 
 TinyBERT is 7.5 times smaller and 9.4 times faster on inference compared to its teacher BERT model (while DistilBERT is 40% smaller and 1.6 times faster than BERT).
-Compared to the [distilbert model](https://huggingface.co/AdamCodd/distilbert-base-uncased-finetuned-sentiment-amazon), which was trained on 10% of the dataset, this model was trained on the full dataset (3.6M samples).
+This model was trained using the entire dataset (3.6M samples), in contrast to the [distilbert model](https://huggingface.co/AdamCodd/distilbert-base-uncased-finetuned-sentiment-amazon), which was trained on only 10% of the dataset.
 
 ## Intended uses & limitations
 While this model may not be as accurate as the distilbert model, its performance should be enough for most use cases.