HybridNorm: Towards Stable and Efficient Transformer Training via Hybrid Normalization • arXiv:2503.04598 • Published 8 days ago
SimpleVQA: Multimodal Factuality Evaluation for Multimodal Large Language Models • arXiv:2502.13059 • Published 24 days ago
Scale-Distribution Decoupling: Enabling Stable and Effective Training of Large Language Models • arXiv:2502.15499 • Published 21 days ago
Over-Tokenized Transformer: Vocabulary is Generally Worth Scaling • arXiv:2501.16975 • Published Jan 28
Polynomial Composition Activations: Unleashing the Dynamics of Large Language Models • arXiv:2411.03884 • Published Nov 6, 2024
OpenCoder: The Open Cookbook for Top-Tier Code Large Language Models • arXiv:2411.04905 • Published Nov 7, 2024
M2rc-Eval: Massively Multilingual Repository-level Code Completion Evaluation • arXiv:2410.21157 • Published Oct 28, 2024
HanoiT: Enhancing Context-aware Translation via Selective Context • arXiv:2301.06825 • Published Jan 17, 2023
LongIns: A Challenging Long-context Instruction-based Exam for LLMs • arXiv:2406.17588 • Published Jun 25, 2024
UniCoder: Scaling Code Large Language Model via Universal Code • arXiv:2406.16441 • Published Jun 24, 2024
GanLM: Encoder-Decoder Pre-training with an Auxiliary Discriminator • arXiv:2212.10218 • Published Dec 20, 2022
Synthesizing Text-to-SQL Data from Weak and Strong LLMs • arXiv:2408.03256 • Published Aug 6, 2024
TableBench: A Comprehensive and Complex Benchmark for Table Question Answering • arXiv:2408.09174 • Published Aug 17, 2024
FuzzCoder: Byte-level Fuzzing Test via Large Language Model • arXiv:2409.01944 • Published Sep 3, 2024
Towards a Unified View of Preference Learning for Large Language Models: A Survey • arXiv:2409.02795 • Published Sep 4, 2024
OmniBench: Towards The Future of Universal Omni-Language Models • arXiv:2409.15272 • Published Sep 23, 2024
HelloBench: Evaluating Long Text Generation Capabilities of Large Language Models • arXiv:2409.16191 • Published Sep 24, 2024