---
base_model: gpt2
library_name: Distily
license: mit
tags:
- generated_from_trainer
model-index:
- name: distily_bench_gpt2_activation_loss_b
  results: []
---

# distily_bench_gpt2_activation_loss_b

This student model was distilled from the teacher model [gpt2](https://huggingface.co/gpt2); the training dataset is unspecified. The [Distily](https://github.com/lapp0/distily) library was used for the distillation.

It achieves the following results on the evaluation set (the `*wikippl` metrics are perplexities on English, French, and Chinese Wikipedia text, respectively):

- eval_enwikippl: 248.6255
- eval_frwikippl: 1465.2275
- eval_zhwikippl: 910.6450
- eval_loss: 1.4609
- eval_runtime: 17.1765 s
- eval_samples_per_second: 58.219
- eval_steps_per_second: 7.277

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (the distillation objective is illustrated in the sketch at the end of this card):

- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=2.0, loss_fn=mse_sum, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
- train_embeddings: True
- learning_rate: 4e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 1.0

### Resource Usage

Peak GPU memory: 8.0903 GB

### Eval-Phase Metrics

| step | epoch | enwikippl | frwikippl | loss | runtime (s) | samples_per_second | steps_per_second | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
| 0 | 0 | 55922.5859 | 57739.5352 | 7.7238 | 17.0878 | 58.521 | 7.315 | 57324.7461 |
| 1000 | 0.0808 | 920.8879 | 4922.0625 | 2.3268 | 17.0862 | 58.527 | 7.316 | 22360.4277 |
| 2000 | 0.1616 | 620.1771 | 3480.3069 | 2.0488 | 17.0703 | 58.581 | 7.323 | 9003.6973 |
| 3000 | 0.2424 | 497.6692 | 3095.0298 | 1.9112 | 17.0714 | 58.578 | 7.322 | 2670.2615 |
| 4000 | 0.3232 | 420.1666 | 2924.0510 | 1.7926 | 17.0681 | 58.589 | 7.324 | 1505.7640 |
| 5000 | 0.4040 | 363.9979 | 2463.9880 | 1.6927 | 17.0999 | 58.48 | 7.31 | 1190.3823 |
| 6000 | 0.4848 | 321.6444 | 2008.3508 | 1.6180 | 17.0952 | 58.496 | 7.312 | 2308.5518 |
| 7000 | 0.5657 | 288.7571 | 1772.2247 | 1.5521 | 17.1061 | 58.459 | 7.307 | 943.5735 |
| 8000 | 0.6465 | 268.0555 | 1636.7375 | 1.5025 | 17.0661 | 58.596 | 7.324 | 1002.2805 |
| 9000 | 0.7273 | 248.6255 | 1465.2275 | 1.4609 | 17.1765 | 58.219 | 7.277 | 910.6450 |
| 10000 | 0.8081 | 230.5145 | 1351.8748 | 1.4215 | 17.0631 | 58.606 | 7.326 | 754.1554 |
| 11000 | 0.8889 | 218.0646 | 1356.4580 | 1.3820 | 17.0844 | 58.533 | 7.317 | 892.8242 |
| 12000 | 0.9697 | 200.7094 | 1234.1702 | 1.3464 | 17.0571 | 58.627 | 7.328 | 822.1012 |
| 12375 | 1.0 | 195.8138 | 1216.7174 | 1.3332 | 17.1185 | 58.416 | 7.302 | 906.7622 |

### Framework versions

- Distily 0.2.0
- Transformers 4.44.0
- PyTorch 2.3.0
- Datasets 2.21.0
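
## Distillation objective (sketch)

The `distillation_objective` listed under the training hyperparameters combines a KL-divergence loss on the student and teacher logits (weight 1) with an MSE loss summed over the hidden states (weight 2.0); the attention-loss component has weight 0 and is effectively disabled. The following is a minimal PyTorch sketch of that weighting under stated assumptions, not Distily's actual implementation: the function name is hypothetical, and the exact reduction used by Distily's `mse_sum` may differ.

```python
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out):
    """Illustrative sketch of 1 * KL(logits) + 2.0 * MSE(hidden states).

    Not Distily's actual code; see https://github.com/lapp0/distily for
    the real DistillationObjective.
    """
    # KL divergence between the student's and teacher's next-token distributions.
    student_logp = F.log_softmax(student_out.logits, dim=-1)
    teacher_p = F.softmax(teacher_out.logits, dim=-1)
    logits_loss = F.kl_div(student_logp, teacher_p, reduction="batchmean")

    # MSE summed over all hidden-state layers; both forward passes must be
    # run with output_hidden_states=True. No layer mapper or projector is
    # applied, matching layer_mapper=None and projector=None above.
    hs_loss = sum(
        F.mse_loss(s, t)
        for s, t in zip(student_out.hidden_states, teacher_out.hidden_states)
    )

    # The attn_loss_component has weight 0, so no attention term appears here.
    return 1.0 * logits_loss + 2.0 * hs_loss
```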
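
## How to use

The student is a GPT-2-style causal language model, so it can be loaded with the standard Transformers API. The repository id below is a placeholder; substitute the Hub path this card is published under.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; replace with the actual Hub path of this model.
repo_id = "distily_bench_gpt2_activation_loss_b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Distillation compresses a teacher model", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```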