- SemiEvol: Semi-supervised Fine-tuning for LLM Adaptation
  Paper • 2410.14745 • Published • 45
- MiniPLM: Knowledge Distillation for Pre-Training Language Models
  Paper • 2410.17215 • Published • 12
- Why Does the Effective Context Length of LLMs Fall Short?
  Paper • 2410.18745 • Published • 15
- LOGO -- Long cOntext aliGnment via efficient preference Optimization
  Paper • 2410.18533 • Published • 40
Collections
Collections including paper arxiv:2410.17215
- Your Mixture-of-Experts LLM Is Secretly an Embedding Model For Free
  Paper • 2410.10814 • Published • 48
- MiniPLM: Knowledge Distillation for Pre-Training Language Models
  Paper • 2410.17215 • Published • 12
- CompassJudger-1: All-in-one Judge Model Helps Model Evaluation and Evolution
  Paper • 2410.16256 • Published • 55
- CCI3.0-HQ: a large-scale Chinese dataset of high quality designed for pre-training large language models
  Paper • 2410.18505 • Published • 8
- Aligning Teacher with Student Preferences for Tailored Training Data Generation
  Paper • 2406.19227 • Published • 24
- Pre-training Distillation for Large Language Models: A Design Space Exploration
  Paper • 2410.16215 • Published • 15
- Baichuan Alignment Technical Report
  Paper • 2410.14940 • Published • 44
- MiniPLM: Knowledge Distillation for Pre-Training Language Models
  Paper • 2410.17215 • Published • 12
- PDFTriage: Question Answering over Long, Structured Documents
  Paper • 2309.08872 • Published • 53
- Adapting Large Language Models via Reading Comprehension
  Paper • 2309.09530 • Published • 75
- Table-GPT: Table-tuned GPT for Diverse Table Tasks
  Paper • 2310.09263 • Published • 39
- Context-Aware Meta-Learning
  Paper • 2310.10971 • Published • 16
- Self-Rewarding Language Models
  Paper • 2401.10020 • Published • 143
- Orion-14B: Open-source Multilingual Large Language Models
  Paper • 2401.12246 • Published • 10
- MambaByte: Token-free Selective State Space Model
  Paper • 2401.13660 • Published • 49
- MM-LLMs: Recent Advances in MultiModal Large Language Models
  Paper • 2401.13601 • Published • 44