# GreenBit LLMs

These are GreenBitAI's pretrained low-bit LLMs, offering extreme compression while retaining strong performance.

Please refer to our GitHub page for the code to run the model and for more information.

## Zero-shot evaluation

| Repository (Phi Family) | Avg Acc. | OpenBQ | ARC-E | Winogr. | HellaS. | ARC-C | PIQA | BoolQ | RACE | ANLI-R1 | ANLI-R2 | ANLI-R3 | WiC |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Phi-3-mini-128k-instruct-layer-mix-bpw-2.2 | 0.510 | 0.270 | 0.706 | 0.648 | 0.479 | 0.411 | 0.736 | 0.783 | 0.381 | 0.393 | 0.380 | 0.399 | 0.536 |
| Phi-3-mini-128k-instruct-layer-mix-bpw-2.5 | 0.514 | 0.290 | 0.719 | 0.656 | 0.488 | 0.401 | 0.750 | 0.778 | 0.401 | 0.392 | 0.410 | 0.407 | 0.493 |
| Phi-3-mini-128k-instruct-layer-mix-bpw-3.0 | 0.548 | 0.318 | 0.761 | 0.663 | 0.519 | 0.453 | 0.777 | 0.798 | 0.393 | 0.473 | 0.404 | 0.442 | 0.579 |
| Phi-3-mini-128k-instruct-layer-mix-bpw-4.0 | 0.582 | 0.346 | 0.779 | 0.708 | 0.582 | 0.495 | 0.787 | 0.840 | 0.412 | 0.529 | 0.459 | 0.448 | 0.606 |
| Phi-3-mini-128k-instruct | 0.586 | 0.342 | 0.785 | 0.731 | 0.596 | 0.512 | 0.782 | 0.851 | 0.401 | 0.547 | 0.464 | 0.432 | 0.594 |
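The "Avg Acc." column appears to be the unweighted mean of the twelve per-task accuracies. A quick sanity check against the zero-shot bpw-2.2 row (a sketch reproducing the table arithmetic, not the official evaluation script):

```python
# Zero-shot task accuracies for Phi-3-mini-128k-instruct-layer-mix-bpw-2.2,
# copied from the table above (OpenBQ through WiC).
scores = [0.270, 0.706, 0.648, 0.479, 0.411, 0.736,
          0.783, 0.381, 0.393, 0.380, 0.399, 0.536]

# "Avg Acc." is the unweighted mean over the twelve tasks.
avg_acc = sum(scores) / len(scores)
print(round(avg_acc, 3))  # 0.51 — matches the 0.510 reported in the table
```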

## 5-shot evaluation

| Repository (Phi Family) | Avg Acc. | OpenBQ | ARC-E | Winogr. | HellaS. | ARC-C | PIQA | BoolQ | RACE | ANLI-R1 | ANLI-R2 | ANLI-R3 | WiC |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Phi-3-mini-128k-instruct-layer-mix-bpw-2.2 | 0.534 | 0.302 | 0.738 | 0.659 | 0.487/0.636 | 0.438 | 0.744 | 0.793 | 0.408 | 0.421 | 0.404 | 0.439 | 0.583 |
| Phi-3-mini-128k-instruct-layer-mix-bpw-2.5 | 0.543 | 0.310 | 0.771 | 0.671 | 0.501/0.657 | 0.441 | 0.763 | 0.799 | 0.405 | 0.453 | 0.427 | 0.443 | 0.534 |
| Phi-3-mini-128k-instruct-layer-mix-bpw-3.0 | 0.563 | 0.346 | 0.796 | 0.687 | 0.528/0.694 | 0.500 | 0.782 | 0.809 | 0.410 | 0.473 | 0.394 | 0.474 | 0.565 |
| Phi-3-mini-128k-instruct-layer-mix-bpw-4.0 | 0.602 | 0.374 | 0.817 | 0.725 | 0.598/0.768 | 0.542 | 0.766 | 0.864 | 0.428 | 0.523 | 0.456 | 0.497 | 0.658 |
| Phi-3-mini-128k-instruct | 0.608 | 0.408 | 0.825 | 0.725 | 0.608/0.781 | 0.534 | 0.768 | 0.866 | – | 0.538 | 0.483 | 0.515 | 0.627 |
Safetensors · Model size: 619M params · Tensor types: I32, FP16, I16
