Why Does the Effective Context Length of LLMs Fall Short? Paper • arXiv:2410.18745 • Published Oct 2024
Language Models can Self-Lengthen to Generate Long Texts Paper • arXiv:2410.23933 • Published Oct 2024
ShadowKV: KV Cache in Shadows for High-Throughput Long-Context LLM Inference Paper • arXiv:2410.21465 • Published Oct 2024