Fine-tuned T5-base model on the Simple English Wikipedia dataset

This model was fine-tuned for article generation on around 25,000 articles from Simple English Wikipedia.

How to use

Every prompt must begin with the prefix "writeWiki: ".
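As a minimal illustration, building a prompt with the required prefix can be wrapped in a small helper (the function name here is hypothetical, not part of the model's API):

```python
def make_prompt(topic: str) -> str:
    # Prepend the required "writeWiki: " prefix (note the trailing space)
    return "writeWiki: " + topic

print(make_prompt("Microcontroller"))  # writeWiki: Microcontroller
```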

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

>>> from transformers import pipeline
>>> generator = pipeline('text2text-generation', model='Suchinthana/T5-Base-Wikigen')
>>> generator("writeWiki: Microcontroller", do_sample=True, max_length=250)
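For more control over decoding, the tokenizer and model can also be loaded directly instead of through the pipeline. This is a sketch using the standard transformers seq2seq API; the sampling parameters shown (`top_k`) are illustrative choices, not settings documented for this model:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Suchinthana/T5-Base-Wikigen")
model = AutoModelForSeq2SeqLM.from_pretrained("Suchinthana/T5-Base-Wikigen")

inputs = tokenizer("writeWiki: Microcontroller", return_tensors="pt")
# do_sample=True means the generated sequence differs between runs
outputs = model.generate(**inputs, do_sample=True, max_length=250, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```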

Dataset used to train Suchinthana/T5-Base-Wikigen: Simple English Wikipedia