DisorBERT is a BERT-base model trained via double domain adaptation of a language model. First, we adapted the model to social media language; then, we adapted it to the mental health domain. In both steps, we incorporated a lexical resource to guide the masking process of the language model and thereby help it pay more attention to words related to mental disorders.
We trained our model for three epochs with a batch size of 128 and a learning rate of 2e-5 on an NVIDIA Tesla V100 SXM2 32 GB GPU.
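The lexicon-guided masking described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the toy lexicon and the masking probabilities (`base_p`, `boost_p`) are assumptions chosen for demonstration.

```python
import random

def guided_mask(tokens, lexicon, base_p=0.15, boost_p=0.5, seed=0):
    """Select token positions to mask, favoring lexicon words.

    Tokens found in the mental-health lexicon are masked with the
    boosted probability boost_p; all other tokens use the standard
    BERT masking rate base_p. (Both rates are illustrative values.)
    """
    rng = random.Random(seed)
    masked = []
    for i, tok in enumerate(tokens):
        p = boost_p if tok.lower() in lexicon else base_p
        if rng.random() < p:
            masked.append(i)
    return masked

# Toy example: lexicon terms are masked far more often than other words.
lexicon = {"anxiety", "depressed", "insomnia"}  # toy lexicon (assumption)
tokens = "i feel depressed and my insomnia is worse".split()
positions = guided_mask(tokens, lexicon)
```

In a real pretraining pipeline the selected positions would then be replaced with the `[MASK]` token (or a random/kept token, per the usual BERT recipe) before computing the masked-language-modeling loss.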
Paper
For more details, refer to the paper "DisorBERT: A Double Domain Adaptation Model for Detecting Signs of Mental Disorders in Social Media".
@inproceedings{aragon-etal-2023-disorbert,
    title = "{D}isor{BERT}: A Double Domain Adaptation Model for Detecting Signs of Mental Disorders in Social Media",
    author = "Aragon, Mario and
      Lopez Monroy, Adrian Pastor and
      Gonzalez, Luis and
      Losada, David E. and
      Montes, Manuel",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.acl-long.853",
    doi = "10.18653/v1/2023.acl-long.853",
    pages = "15305--15318",
}