Model Card for EnvironmentalBERT-base
Model Description
This is EnvironmentalBERT-base, a language model trained to better understand environmental texts in the ESG domain, introduced in the paper cited below.
Starting from the DistilRoBERTa model, EnvironmentalBERT-base is further pre-trained on a text corpus comprising environment-related annual reports, sustainability reports, and corporate and general news.
More details can be found in the paper:
@article{Schimanski23ESGBERT,
title={{Bridging the Gap in ESG Measurement: Using NLP to Quantify Environmental, Social, and Governance Communication}},
author={Tobias Schimanski and Andrin Reding and Nico Reding and Julia Bingler and Mathias Kraus and Markus Leippold},
year={2023},
journal={Available on SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4622514},
}
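Since EnvironmentalBERT-base is a masked language model, it can be loaded with the Hugging Face transformers library. The sketch below is a minimal usage example; the repository id ESGBERT/EnvironmentalBERT-base and the example sentence are assumptions, so adjust them to the actual model path and your own text.

```python
# Minimal sketch: load EnvironmentalBERT-base as a fill-mask pipeline.
# The repository id "ESGBERT/EnvironmentalBERT-base" is an assumption;
# replace it with the actual model path if it differs.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_name = "ESGBERT/EnvironmentalBERT-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# DistilRoBERTa-based models use "<mask>" as the mask token.
for prediction in fill_mask("The company reduced its carbon <mask> by 20% last year."):
    print(prediction["token_str"], round(prediction["score"], 3))
```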