This RoBERTa model is trained from scratch with a masked language modelling (MLM) objective on a collection of medical reports.
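
A minimal usage sketch with the `transformers` library is shown below. The model id `your-username/roberta-medical-reports` is a placeholder, not the actual checkpoint name; substitute the published Hub id or a local path.

```python
from transformers import pipeline

# Placeholder model id -- replace with the real checkpoint on the Hub or a local directory.
MODEL_ID = "your-username/roberta-medical-reports"

# Since the model was trained with an MLM objective, it can fill in masked tokens directly.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# RoBERTa tokenizers use "<mask>" as the mask token.
predictions = fill_mask("The patient was diagnosed with <mask> pneumonia.")

for p in predictions:
    print(f"{p['token_str']}\t{p['score']:.4f}")
```

For downstream tasks (e.g. classification of reports), the same checkpoint can be loaded with `AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=...)` and fine-tuned; only the MLM head is discarded.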