Malaysian Seq2Seq Collection
Trained on 17B tokens, 81GB of cleaned texts, able to understand standard Malay, local Malay, local Mandarin, Manglish, and local Tamil.
t5-small-bahasa-cased
T5 small pretrained on both standard and local language data for Malay.
The model was pretrained on multiple tasks. Below is the list of task prefixes it was trained on. The data preparation steps can be reproduced at https://github.com/huseinzol05/malaya/tree/master/pretrained-model/t5/prepare
- `soalan: {string}`, trained using Natural QA.
- `ringkasan: {string}`, for abstractive summarization.
- `tajuk: {string}`, for abstractive title generation.
- `parafrasa: {string}`, for abstractive paraphrase.
- `terjemah Inggeris ke Melayu: {string}`, for EN-MS translation.
- `terjemah Melayu ke Inggeris: {string}`, for MS-EN translation.
- `grafik pengetahuan: {string}`, for MS text to EN Knowledge Graph triples format.
- `ayat1: {string1} ayat2: {string2}`, for semantic similarity.
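These prefixes are plain strings prepended to the input before tokenization, which is how T5 selects the task at inference time. A minimal sketch of a prompt builder for the prefixes listed above (the helper names and task keys here are our own illustration, not part of Malaya):

```python
# Hypothetical helper (not from the Malaya library) that formats raw text
# with the task prefixes the model was trained on.
TASK_PREFIXES = {
    "qa": "soalan: {string}",
    "summarization": "ringkasan: {string}",
    "title": "tajuk: {string}",
    "paraphrase": "parafrasa: {string}",
    "en_ms": "terjemah Inggeris ke Melayu: {string}",
    "ms_en": "terjemah Melayu ke Inggeris: {string}",
    "knowledge_graph": "grafik pengetahuan: {string}",
}

def build_prompt(task: str, text: str) -> str:
    """Prepend the task prefix so the model knows which task to run."""
    return TASK_PREFIXES[task].format(string=text)

def build_similarity_prompt(sentence1: str, sentence2: str) -> str:
    """Semantic similarity takes a sentence pair with ayat1/ayat2 markers."""
    return f"ayat1: {sentence1} ayat2: {sentence2}"

print(build_prompt("en_ms", "I love Malaysia."))
# terjemah Inggeris ke Melayu: I love Malaysia.
```

The formatted string would then be tokenized and passed to the checkpoint's `generate` method, e.g. via the `transformers` library.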