hannahisrael03 committed on
Commit 68c4de7 · verified · 1 Parent(s): e035c15

ham-vit-skin-lesion-model-2

Files changed (4)
  1. README.md +19 -14
  2. model.safetensors +2 -2
  3. preprocessor_config.json +23 -0
  4. training_args.bin +2 -2
README.md CHANGED
@@ -1,8 +1,11 @@
 ---
+library_name: transformers
 license: apache-2.0
-base_model: t5-small
+base_model: google/vit-base-patch16-224-in21k
 tags:
 - generated_from_trainer
+metrics:
+- accuracy
 model-index:
 - name: results
   results: []
@@ -13,9 +16,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # results
 
-This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
+This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.8240
+- Loss: 1.4268
+- Accuracy: 0.6397
 
 ## Model description
 
@@ -36,24 +40,25 @@ More information needed
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
 - train_batch_size: 16
-- eval_batch_size: 16
+- eval_batch_size: 8
 - seed: 42
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
+- lr_scheduler_warmup_steps: 500
 - num_epochs: 3
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:----:|:---------------:|
-| No log        | 1.0   | 313  | 2.8942          |
-| 3.2737        | 2.0   | 626  | 2.8346          |
-| 3.2737        | 3.0   | 939  | 2.8240          |
+| Training Loss | Epoch | Step | Validation Loss | Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|
+| 1.6205        | 1.0   | 3095 | 1.6163          | 0.6932   |
+| 1.449         | 2.0   | 6190 | 1.4643          | 0.7016   |
+| 1.4321        | 3.0   | 9285 | 1.4268          | 0.6397   |
 
 
 ### Framework versions
 
-- Transformers 4.39.3
-- Pytorch 2.2.1+cu121
-- Datasets 2.18.0
-- Tokenizers 0.15.2
+- Transformers 4.48.3
+- Pytorch 2.5.1+cu124
+- Datasets 3.3.2
+- Tokenizers 0.21.0
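The updated card lists the fine-tuning hyperparameters, but the training script itself is not part of this commit. As a rough illustration only, those settings could be expressed with `transformers.TrainingArguments` as in the sketch below; the output directory, evaluation strategy, and metric wiring are assumptions, and only the values explicitly named in the card come from the commit.

```python
import numpy as np
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed in the new README;
# only the explicitly named values are taken from the model card.
training_args = TrainingArguments(
    output_dir="results",            # assumed, matching the model-index name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",             # OptimizerNames.ADAMW_TORCH
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=3,
    eval_strategy="epoch",           # assumed; the card reports per-epoch validation metrics
)

def compute_metrics(eval_pred):
    """Accuracy metric as reported in the card's results table (assumed wiring)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": float((preds == labels).mean())}
```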
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:973376800baa251512eb3f757e91080d535c7949a9a8d37e10f1e7466e4789f3
-size 437964800
+oid sha256:67c874403bc804b1c15380534c60053528f3982e74cad8083e0c2bba1317784e
+size 343247504
preprocessor_config.json ADDED
@@ -0,0 +1,23 @@
+{
+  "do_convert_rgb": null,
+  "do_normalize": true,
+  "do_rescale": true,
+  "do_resize": true,
+  "image_mean": [
+    0.5,
+    0.5,
+    0.5
+  ],
+  "image_processor_type": "ViTImageProcessor",
+  "image_std": [
+    0.5,
+    0.5,
+    0.5
+  ],
+  "resample": 2,
+  "rescale_factor": 0.00392156862745098,
+  "size": {
+    "height": 224,
+    "width": 224
+  }
+}
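The added preprocessor_config.json pins the ViT preprocessing: resize to 224x224 with bilinear resampling (`resample: 2`), rescale pixel values by 1/255, and normalize each channel with mean and std 0.5. A minimal inference sketch using those values is shown below; the repository id is inferred from the commit message and may differ, and in practice `ViTImageProcessor.from_pretrained` would read this same JSON instead of constructing the processor by hand.

```python
import torch
from PIL import Image
from transformers import ViTForImageClassification, ViTImageProcessor

# Processor built from the values in the committed preprocessor_config.json:
# resize to 224x224 (bilinear), rescale by 1/255, normalize to mean/std 0.5.
processor = ViTImageProcessor(
    do_resize=True,
    size={"height": 224, "width": 224},
    resample=2,                      # PIL bilinear
    do_rescale=True,
    rescale_factor=1 / 255,
    do_normalize=True,
    image_mean=[0.5, 0.5, 0.5],
    image_std=[0.5, 0.5, 0.5],
)

# Hypothetical repo id based on the commit message; adjust to the actual model path.
model = ViTForImageClassification.from_pretrained("hannahisrael03/ham-vit-skin-lesion-model-2")

image = Image.open("lesion.jpg").convert("RGB")           # placeholder input image
inputs = processor(images=image, return_tensors="pt")     # pixel_values: (1, 3, 224, 224)

with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted])
```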
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f08670c00c9cd5415cfc6856d405d3bd2cb5293490459a4042d7ef3ee2db9dac
-size 4984
+oid sha256:ba6792a1db163359ee36bdbc912d940cda64986d4fe80e9baf3a7136bbaa93e1
+size 5304