This BERT model was pretrained on the news corpus (https://huggingface.co/datasets/yyu/wiki_corpus). It is used in the paper *ReGen: Zero-Shot Text Classification via Training Data Generation with Progressive Dense Retrieval*.

See the GitHub repository (https://github.com/yueyu1030/ReGen) and the paper (https://arxiv.org/abs/2305.10703) for details.
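
As a minimal usage sketch, the checkpoint can be loaded with the Transformers library. The repository ID below is a placeholder, since the card does not state it, and taking the `[CLS]` embedding is one common way to use a BERT encoder for dense retrieval, not necessarily the exact procedure from the paper:

```python
from transformers import AutoTokenizer, AutoModel

# Placeholder repo ID -- replace with this model's actual Hugging Face ID.
model_id = "yyu/REPLACE_WITH_THIS_MODEL_ID"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and take the [CLS] token embedding as a dense representation.
inputs = tokenizer("A short news headline to embed.", return_tensors="pt")
outputs = model(**inputs)
cls_embedding = outputs.last_hidden_state[:, 0]  # shape: (1, hidden_size)
```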
