LLM-Drop

Model weights for the paper "What Matters in Transformers? Not All Attention is Needed" (https://arxiv.org/abs/2406.15786).

s1ghhh/Llama-2-13b-Drop8Block • 13B • Updated Sep 8, 2024
s1ghhh/Llama-2-13b-Drop4Block • 13B • Updated Sep 8, 2024
s1ghhh/Llama-2-13b-Drop4Attn • 13B • Updated Sep 8, 2024
s1ghhh/Llama-2-13b-Drop8Attn • 13B • Updated Sep 8, 2024
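A minimal sketch of loading one of these checkpoints with the Hugging Face transformers library. It assumes the repos ship standard Llama-format weights loadable through AutoModelForCausalLM; whether any LLM-Drop-specific code is required is an assumption to verify on the model cards.

```python
# Minimal sketch: load a dropped-attention Llama-2-13B checkpoint.
# Assumes standard Llama-format weights; check the model card if
# loading fails or custom code is needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "s1ghhh/Llama-2-13b-Drop8Attn"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 13B model in fp16 needs roughly 26 GB
    device_map="auto",          # shard across available devices (needs accelerate)
)

prompt = "Not all attention is needed because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```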