flan-t5-base-text_summarization_data

This model is a fine-tuned version of google/flan-t5-base on a text summarization dataset from Kaggle (see Training and evaluation data below). It achieves the following results on the evaluation set:

  • Loss: 1.7386
  • ROUGE-1: 43.6615
  • ROUGE-2: 20.349
  • ROUGE-L: 40.1032
  • ROUGE-Lsum: 40.1589
  • Gen Len (mean generated length, in tokens): 14.6434

Model description

This is a text summarization model.

For more information on how it was created, see the project notebook: https://github.com/DunnBC22/NLP_Projects/blob/main/Text%20Summarization/Text-Summarized%20Data%20-%20Comparison/Flan-T5%20-%20Text%20Summarization%20-%201%20Epoch.ipynb
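
To try the model, here is a minimal inference sketch using the transformers pipeline API; the checkpoint ID comes from this repository, and the input article is made up for illustration.

```python
# Minimal inference sketch (assumes the transformers library is installed).
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
summarizer = pipeline(
    "summarization",
    model="DunnBC22/flan-t5-base-text_summarization_data",
)

# Hypothetical input text, for illustration only.
article = (
    "The city council voted on Tuesday to approve a new transit plan that "
    "expands bus service to outlying neighborhoods and adds two light-rail "
    "lines over the next decade."
)

# The evaluation Gen Len averages ~15 tokens, so short summaries are expected.
print(summarizer(article, max_length=40, min_length=5)[0]["summary_text"])
```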

Intended uses & limitations

This model is a portfolio project intended to demonstrate my ability to fine-tune a pretrained language model for text summarization.

Training and evaluation data

Dataset Source: https://www.kaggle.com/datasets/cuitengfeui/textsummarization-data

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
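
For reference, here is a hedged sketch of how these settings map onto Seq2SeqTrainingArguments in the Transformers 4.26 API; output_dir is a placeholder, and the data pipeline and Seq2SeqTrainer wiring are omitted (see the linked notebook for the full procedure).

```python
# Sketch only: maps the hyperparameters listed above onto training arguments.
# Assumes transformers ~4.26; output_dir is a placeholder, not from the notebook.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-text_summarization_data",  # placeholder
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    predict_with_generate=True,  # required to compute ROUGE during evaluation
)
```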

Training results

Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len
2.0287        | 1.0   | 1197 | 1.7386          | 43.6615 | 20.349  | 40.1032 | 40.1589    | 14.6434
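
ROUGE scores like those above are typically computed with the Hugging Face evaluate library; below is a minimal sketch with made-up predictions and references (it assumes the evaluate and rouge_score packages are installed).

```python
# Sketch of a ROUGE computation like the one behind the table above.
# Assumes `pip install evaluate rouge_score`; the texts are made up.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["the council approved the new transit plan"]
references = ["the city council approved a new transit plan on tuesday"]

# Recent versions of evaluate return floats in [0, 1]; scale by 100
# to match the ROUGE-1/ROUGE-2/ROUGE-L/ROUGE-Lsum values reported above.
scores = rouge.compute(predictions=predictions, references=references)
print({name: round(value * 100, 4) for name, value in scores.items()})
```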

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.12.1
  • Datasets 2.9.0
  • Tokenizers 0.12.1