Tulu 3 Models Collection All models released with Tulu 3 -- state-of-the-art open post-training recipes. • 7 items • Updated 28 days ago • 29
Awesome SFT datasets Collection A curated list of interesting datasets to fine-tune language models with. • 43 items • Updated Apr 12 • 123
Phi-3 Collection Phi-3 family of small language and multi-modal models. Language models are available in short- and long-context lengths. • 26 items • Updated Nov 14 • 536
The Big Benchmarks Collection Gathering benchmark spaces on the Hub (beyond the Open LLM Leaderboard) • 13 items • Updated Nov 18 • 176
Gemma-APS Release Collection Gemma models for text-to-propositions segmentation. The models are distilled from a fine-tuned Gemini Pro model applied to multi-domain synthetic data. • 3 items • Updated 12 days ago • 19
Zeroshot Classifiers Collection These are my current best zero-shot classifiers. Some of my older models are downloaded more often, but the models in this collection are newer and better. • 11 items • Updated Apr 3 • 115
Article Training and Finetuning Embedding Models with Sentence Transformers v3 • May 28 • 167
Llama 3.1 GPTQ, AWQ, and BNB Quants Collection Optimised quants for high-throughput deployments! Compatible with Transformers, TGI and vLLM 🤗 • 9 items • Updated Sep 26 • 56
Qwen2 Collection Qwen2 language models, including pretrained and instruction-tuned models in 5 sizes: 0.5B, 1.5B, 7B, 57B-A14B, and 72B. • 39 items • Updated 28 days ago • 351
EVIDENT PlatVR [datasets] Collection This work is supported by the Ministry of Industry, Trade and Tourism, Spain (AEI-010500-2023-280). • 3 items • Updated Apr 17 • 1
EVIDENT PlatVR [models] Collection This work is supported by the Ministry of Industry, Trade and Tourism, Spain (AEI-010500-2023-280). • 3 items • Updated Apr 17 • 1
MT5 release Collection The MT5 release follows the T5 family but is pretrained on multilingual data. The updated UMT5 models are pretrained on an updated corpus. • 10 items • Updated 12 days ago • 16
Flan-T5 release Collection The Flan-T5 release covers 4 checkpoints of different sizes. It also includes upgraded versions trained using Universal sampling • 7 items • Updated 12 days ago • 21
T5 release Collection The original T5 transformer release was done in two steps: the original T5 checkpoints and the improved T5 v1.1 checkpoints • 9 items • Updated 12 days ago • 11
ELECTRA release Collection This collection groups the ELECTRA models released by the Google team. • 6 items • Updated 12 days ago • 8
ALBERT release Collection The ALBERT release was done in two steps, over 4 checkpoints of different sizes each time. The first version is noted as "v1", the second as "v2". • 8 items • Updated 12 days ago • 5