# DisorRoBERTa
DisorRoBERTa is a double-domain adaptation of a RoBERTa language model (a variation of DisorBERT). It is first adapted to social media language, and then to the mental health domain. In both steps, it incorporates a lexical resource to guide the masking process of the language model, helping it pay more attention to words related to mental disorders.
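The exact lexicon-guided masking procedure is described in the DisorBERT work; as a rough, word-level sketch only (the mini-lexicon, the masking probabilities, and the `lexicon_guided_mask` helper below are illustrative assumptions, not the released implementation), the idea is to mask lexicon terms more aggressively than ordinary words:

```python
import random

MASK = "<mask>"  # RoBERTa-style mask token

# Illustrative mini-lexicon; the real resource covers many more disorder-related terms.
LEXICON = {"anxiety", "depression", "insomnia", "panic"}

def lexicon_guided_mask(text, p_lexicon=0.5, p_other=0.15):
    """Mask lexicon words with a higher probability than other words."""
    out = []
    for word in text.split():
        p = p_lexicon if word.lower().strip(".,!?") in LEXICON else p_other
        out.append(MASK if random.random() < p else word)
    return " ".join(out)

print(lexicon_guided_mask("I could not sleep last night because of my anxiety"))
```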
We follow the standard procedure for fine-tuning a masked language model from Hugging Face's NLP Course 🤗.
For training the model, we used a batch size of 256, the Adam optimizer with a learning rate of 1e-5, and cross-entropy as the loss function. We trained the model for three epochs on an NVIDIA Tesla V100 SXM2 GPU (32 GB).
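Under those settings, the fine-tuning loop looks roughly like the standard `Trainer` setup from the NLP Course. This is only a sketch: `tokenized_dataset` is a placeholder for the training corpus, the collator shown is the standard random-masking one (DisorRoBERTa additionally guides masking with the lexicon), and the Trainer's default AdamW optimizer stands in for the reported Adam choice:

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")

# Standard MLM collator; the lexicon-guided masking would replace this step.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="disor-roberta",
    per_device_train_batch_size=256,  # batch size reported above
    learning_rate=1e-5,               # learning rate reported above
    num_train_epochs=3,               # three epochs
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=tokenized_dataset,  # placeholder: your tokenized corpus
)
trainer.train()
```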
## Usage
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="citiusLTL/DisorRoBERTa")
```
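The pipeline can then be queried with a masked sentence (RoBERTa-style models use `<mask>` as the mask token; the example sentence is ours, not from the model card):

```python
# Each prediction is a dict with "score", "token", "token_str", and "sequence" keys.
predictions = pipe("I could not sleep last night because of my <mask>.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```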
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("citiusLTL/DisorRoBERTa")
model = AutoModelForMaskedLM.from_pretrained("citiusLTL/DisorRoBERTa")
```
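With the model and tokenizer loaded directly, the same mask filling can be done by hand; this sketch assumes PyTorch weights are available:

```python
import torch

text = "I could not sleep last night because of my <mask>."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the <mask> position and decode the five highest-scoring tokens.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = logits[0, mask_pos].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top5.tolist()))
```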