---
base_model: google/pegasus-large
tags:
- generated_from_trainer
datasets:
- arrow
model-index:
- name: RoBERTa_Pegasus_dependent_V1
  results: []
---

# RoBERTa_Pegasus_dependent_V1

This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on the arrow dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1922

## Model description

More information needed

## Intended uses & limitations

More information needed. A minimal inference sketch is provided at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `Seq2SeqTrainingArguments` follows the framework versions below):
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40

### Training results

| Training Loss | Epoch   | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| 5.0795        | 0.9927  | 68   | 3.9061          |
| 3.6946        | 2.0     | 137  | 3.2272          |
| 3.1058        | 2.9927  | 205  | 2.6658          |
| 2.6695        | 4.0     | 274  | 2.4699          |
| 2.511         | 4.9927  | 342  | 2.3796          |
| 2.3613        | 6.0     | 411  | 2.3305          |
| 2.3187        | 6.9927  | 479  | 2.3024          |
| 2.2335        | 8.0     | 548  | 2.2800          |
| 2.2252        | 8.9927  | 616  | 2.2639          |
| 2.1548        | 10.0    | 685  | 2.2515          |
| 2.1536        | 10.9927 | 753  | 2.2412          |
| 2.0982        | 12.0    | 822  | 2.2338          |
| 2.0988        | 12.9927 | 890  | 2.2248          |
| 2.0441        | 14.0    | 959  | 2.2204          |
| 2.0552        | 14.9927 | 1027 | 2.2164          |
| 2.0084        | 16.0    | 1096 | 2.2111          |
| 2.0178        | 16.9927 | 1164 | 2.2076          |
| 1.9719        | 18.0    | 1233 | 2.2051          |
| 1.9901        | 18.9927 | 1301 | 2.2043          |
| 1.9464        | 20.0    | 1370 | 2.2009          |
| 1.9631        | 20.9927 | 1438 | 2.2000          |
| 1.9233        | 22.0    | 1507 | 2.1981          |
| 1.9394        | 22.9927 | 1575 | 2.1961          |
| 1.9012        | 24.0    | 1644 | 2.1947          |
| 1.9213        | 24.9927 | 1712 | 2.1943          |
| 1.8843        | 26.0    | 1781 | 2.1942          |
| 1.9025        | 26.9927 | 1849 | 2.1942          |
| 1.8715        | 28.0    | 1918 | 2.1950          |
| 1.8906        | 28.9927 | 1986 | 2.1936          |
| 1.8559        | 30.0    | 2055 | 2.1939          |
| 1.8774        | 30.9927 | 2123 | 2.1934          |
| 1.8468        | 32.0    | 2192 | 2.1913          |
| 1.8685        | 32.9927 | 2260 | 2.1916          |
| 1.834         | 34.0    | 2329 | 2.1924          |
| 1.8582        | 34.9927 | 2397 | 2.1929          |
| 1.8333        | 36.0    | 2466 | 2.1925          |
| 1.859         | 36.9927 | 2534 | 2.1922          |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.17.1
- Tokenizers 0.19.1
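### Training arguments sketch

The `generated_from_trainer` tag indicates this model was trained with the Hugging Face `Trainer`. The original training script is not included in this card, so the following is only a sketch of how the hyperparameters listed above would map onto `Seq2SeqTrainingArguments`. `output_dir` is a placeholder, the per-epoch evaluation strategy is inferred from the per-epoch validation losses in the results table, and the 4-GPU distributed setup comes from the launcher (e.g. `torchrun --nproc_per_node 4`), not from an argument.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters listed above; output_dir is an assumed placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="RoBERTa_Pegasus_dependent_V1",
    learning_rate=5e-5,
    per_device_train_batch_size=2,  # x 4 GPUs x 2 accumulation steps = 16 effective
    per_device_eval_batch_size=1,   # x 4 GPUs = 4 effective
    gradient_accumulation_steps=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=40,
    adam_beta1=0.9,                 # Adam with betas=(0.9, 0.999) and epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",    # assumed: validation loss was logged once per epoch
)
```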
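## How to use

Pegasus is a sequence-to-sequence model typically used for abstractive summarization; the intended task for this fine-tune is not documented. Below is a minimal inference sketch, assuming the checkpoint is available under the repo id shown (substitute the actual Hub id or a local checkpoint directory). The generation arguments are illustrative, not settings taken from this card.

```python
from transformers import AutoTokenizer, PegasusForConditionalGeneration

checkpoint = "RoBERTa_Pegasus_dependent_V1"  # assumed path; replace as needed

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = PegasusForConditionalGeneration.from_pretrained(checkpoint)

text = "Your input document goes here."
inputs = tokenizer(text, truncation=True, return_tensors="pt")

# Beam-search generation; these arguments are illustrative only.
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```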