Compact models that mimic the vector space of crisistransformers/CT-M1-Complete. These models can be fine-tuned on downstream tasks in the same way as BERT.
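As a rough illustration of that fine-tuning workflow (a minimal sketch: the checkpoint name `crisistransformers/tiny` is taken from the model list below, and the two-label classification head is an assumption, not a prescribed setup), the compact models load like any standard Hugging Face encoder:

```python
# Minimal fine-tuning sketch: load a mini CrisisTransformers checkpoint the same
# way a BERT checkpoint would be loaded for sequence classification.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "crisistransformers/tiny"  # illustrative choice; the other mini models load the same way
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)  # assumed binary task

inputs = tokenizer("Urgent: we need water and blankets at the shelter.", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # (1, num_labels); fine-tune with Trainer or a custom training loop
```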

CrisisTransformers (university)
AI & ML interests: Natural Language Processing, Social Computing, Crisis Informatics
CrisisTransformers | State-of-the-art contextual and semantically meaningful sentence embeddings for crisis-related social media texts
CrisisTransformers is a family of pre-trained language models and sentence encoders introduced in the following papers:
- Pre-trained models and sentence encoders: CrisisTransformers: Pre-trained language models and sentence encoders for crisis-related social media texts
- Multi-lingual sentence encoders: Semantically Enriched Cross-Lingual Sentence Embeddings for Crisis-related Social Media Texts
- Mini models: "Actionable Help" in Crises: A Novel Dataset and Resource-Efficient Models for Identifying Request and Offer Social Media Posts
The models were trained on a massive corpus of over 15 billion word tokens from tweets associated with more than 30 crisis events, including disease outbreaks, natural disasters, and conflicts.
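As a quick sketch of getting contextual representations out of one of the pre-trained checkpoints listed below (`crisistransformers/CT-M2-Complete` is used here only as an example; the dedicated sentence encoders from the papers above are the intended route for sentence-level embeddings), mean pooling over the last hidden states looks like this:

```python
# Sketch: mean-pooled contextual embeddings from a pre-trained CrisisTransformers
# checkpoint. The checkpoint name is one of the models listed below.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "crisistransformers/CT-M2-Complete"  # example from the model list
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = ["Flooding has cut off the main road.", "Volunteers are offering free rides."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state        # (batch, seq_len, hidden_size)

mask = batch["attention_mask"].unsqueeze(-1)          # ignore padding tokens when averaging
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)                               # (2, hidden_size)
```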
Collections: 3
Models: 14

- crisistransformers/tiny
- crisistransformers/small
- crisistransformers/medium
- crisistransformers/CT-M1-BestLoss (Fill-Mask)
- crisistransformers/CT-M3-Complete (Fill-Mask)
- crisistransformers/CT-M3-BestLoss (Fill-Mask)
- crisistransformers/CT-M3-OneLook (Fill-Mask)
- crisistransformers/CT-M2-Complete (Fill-Mask)
- crisistransformers/CT-M2-BestLoss (Fill-Mask)
- crisistransformers/CT-M2-OneLook (Fill-Mask)
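Since the CT-M* checkpoints above are exposed under the Fill-Mask task, a quick sanity check is to query one through the standard transformers pipeline (a sketch; the checkpoint name comes from the list above, and the mask token is read from the tokenizer rather than assumed):

```python
# Sketch: querying a CrisisTransformers masked language model via the fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="crisistransformers/CT-M1-BestLoss")
prompt = f"The earthquake destroyed the {fill_mask.tokenizer.mask_token} in the city."

for prediction in fill_mask(prompt, top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```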
Datasets: none public yet