AdamCodd committed
Commit: 0362806
Parent: f272817

Create README.md

Files changed (1): README.md (+54 -0)
---
datasets:
- amazon_polarity
---
# tinybert-sentiment-amazon

This model is a fine-tuned version of [bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) on the [amazon_polarity dataset](https://huggingface.co/datasets/amazon_polarity).

## Model description

TinyBERT is 7.5 times smaller and 9.4 times faster at inference than its teacher BERT model (by comparison, DistilBERT is 40% smaller and 1.6 times faster than BERT).
Compared to the [distilbert model](https://huggingface.co/AdamCodd/distilbert-base-uncased-finetuned-sentiment-amazon), which was trained on 10% of the dataset, this model was trained on the full dataset (3.6M samples).

## Intended uses & limitations

While this model may not be as accurate as the distilbert model, its performance should be sufficient for most use cases.
16
+
17
+ ```python
18
+ from transformers import pipeline
19
+
20
+ # Create the pipeline
21
+ sentiment_classifier = pipeline('text-classification', model='AdamCodd/tinybert-sentiment-amazon')
22
+
23
+ # Now you can use the pipeline to classify emotions
24
+ result = sentiment_classifier("This product doesn't fit me at all.")
25
+ print(result)
26
+ #[{'label': 'negative', 'score': 0.9969743490219116}]
27
+ ```
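
The same pipeline also accepts a list of texts, which is handy for scoring several reviews in one call. Below is a minimal sketch of batch usage; the example reviews are invented for illustration:

```python
from transformers import pipeline

sentiment_classifier = pipeline('text-classification', model='AdamCodd/tinybert-sentiment-amazon')

# Passing a list returns one prediction dict per input (made-up example reviews)
reviews = [
    "Arrived quickly and works exactly as described.",
    "Stopped working after two days, very disappointed.",
]
for review, prediction in zip(reviews, sentiment_classifier(reviews)):
    print(f"{prediction['label']} ({prediction['score']:.3f}): {review}")
```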

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto optimizer code follows the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 1270
- optimizer: AdamW with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 150
- num_epochs: 1
- weight_decay: 0.01
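
This card doesn't include the training script itself; the following is only a sketch of how the hyperparameters above would typically be wired up with `torch.optim.AdamW` and `get_linear_schedule_with_warmup` from Transformers. The base model, the step count, and everything not listed above are assumptions, not the author's code.

```python
import torch
from transformers import AutoModelForSequenceClassification, get_linear_schedule_with_warmup

torch.manual_seed(1270)  # seed from the hyperparameter list

# Assumed starting point: bert-tiny with a binary classification head
model = AutoModelForSequenceClassification.from_pretrained("prajjwal1/bert-tiny", num_labels=2)

# AdamW with the listed learning rate, betas, epsilon, and weight decay
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=3e-5,
    betas=(0.9, 0.999),
    eps=1e-8,
    weight_decay=0.01,
)

# Linear decay with 150 warmup steps; the total step count is estimated as
# 3.6M samples / batch size 32 over 1 epoch
num_training_steps = 3_600_000 // 32
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=150,
    num_training_steps=num_training_steps,
)
```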

### Framework versions

- Transformers 4.35.0
- PyTorch Lightning 2.1.0
- Tokenizers 0.14.1

If you want to support me, you can [do so here](https://ko-fi.com/adamcodd).