
Model Card for distilroberta-base-climate-d-s

Model Description

This is the ClimateBERT language model based on the DIV-SELECT and SIM-SELECT sample selection strategies.

Note: We generally recommend choosing the distilroberta-base-climate-f language model over this language model (unless you have good reasons not to).

Using the DistilRoBERTa model as a starting point, the ClimateBERT language model is additionally pre-trained on a text corpus comprising climate-related research paper abstracts, corporate and general news, and reports from companies. The underlying methodology can be found in our language model research paper.
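
For orientation, here is a minimal usage sketch with the Hugging Face transformers library. The climatebert/ prefix in the model identifier is an assumption based on this card's title; adjust it if the checkpoint is hosted elsewhere.

```python
from transformers import pipeline

# Model identifier assumed from the card title; the "climatebert/" org
# prefix is an assumption -- adjust if the checkpoint lives elsewhere.
model_name = "climatebert/distilroberta-base-climate-d-s"

# The model is pre-trained with a masked language modeling objective,
# so a fill-mask query is a natural smoke test. (Like DistilRoBERTa,
# it uses the <mask> token.)
fill_mask = pipeline("fill-mask", model=model_name)
print(fill_mask("Global warming is driven by <mask> gas emissions."))
```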

Climate performance model card for distilroberta-base-climate-d-s

1. Is the resulting model publicly available? Yes
2. How much time does the training of the final model take? 48 hours
3. How much time did all experiments take (incl. hyperparameter search)? 350 hours
4. What was the power of GPU and CPU? 0.7 kW
5. At which geo location were the computations performed? Germany
6. What was the energy mix at the geo location? 470 gCO2eq/kWh
7. How much CO2eq was emitted to train the final model? 15.79 kg
8. How much CO2eq was emitted for all experiments? 115.15 kg
9. What is the average CO2eq emission for the inference of one sample? 0.62 mg
10. Which positive environmental impact can be expected from this work? This work can be categorized as a building block tool following Jin et al. (2021). It supports the training of NLP models in the field of climate change and can thereby have a positive environmental impact in the future.
11. Comments: Block pruning could decrease CO2eq emissions. (The arithmetic behind questions 7 and 8 is sketched below.)
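
The emission figures in questions 7 and 8 follow directly from the stated runtimes, power draw, and energy mix; here is a minimal sketch of that arithmetic, using only numbers from this card:

```python
# All inputs are taken from the climate performance model card above.
POWER_KW = 0.7        # combined GPU and CPU power draw (question 4)
GRID_G_PER_KWH = 470  # grid carbon intensity in Germany (question 6)

def co2eq_kg(hours: float) -> float:
    """CO2eq in kilograms for a runtime at the stated power and grid mix."""
    return hours * POWER_KW * GRID_G_PER_KWH / 1000  # grams -> kilograms

print(co2eq_kg(48))   # final model: 15.792 ~ 15.79 kg (question 7)
print(co2eq_kg(350))  # all experiments: 115.15 kg (question 8)
```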

Citation Information

@inproceedings{wkbl2022climatebert,
    title={{ClimateBERT: A Pretrained Language Model for Climate-Related Text}},
    author={Webersinke, Nicolas and Kraus, Mathias and Bingler, Julia and Leippold, Markus},
    booktitle={Proceedings of AAAI 2022 Fall Symposium: The Role of AI in Responding to Climate Challenges},
    year={2022},
    doi={10.48550/arXiv.2212.13631},
}