---
license: llama3
library_name: transformers
tags:
- llama3
---
# Badger ι Llama 3 8B Instruct

Badger is a recursive, maximally pairwise disjoint, normalized Fourier interpolation of the following models:
```python
# Badger Iota
models = [
    'L3-TheSpice-8b-v0.8.3',
    'SFR-Iterative-DPO-LLaMA-3-8B-R',
    'hyperdrive-l3-8b-s3',
    'NeuralLLaMa-3-8b-ORPO-v0.3',
    'llama-3-cat-8b-instruct-pytorch',
    'meta-llama-3-8b-instruct-hf-ortho-baukit-5fail-3000total-bf16',
    'Llama-3-Instruct-8B-SimPO',
    'opus-v1.2-llama-3-8b-instruct-run3.5-epoch2.5',
    'badger-zeta',
    'badger-eta',
    'Llama-3-8B-Instruct-Gradient-1048k',
    'Mahou-1.0-llama3-8B',
    'badger-l3-instruct-32k',
    'LLaMAntino-3-ANITA-8B-Inst-DPO-ITA',
]
```
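The card doesn't spell out the merge procedure, so purely as an illustrative sketch: a "normalized Fourier interpolation" of two weight tensors, applied recursively in pairs, might look like the code below. The function names, the 50/50 mixing, the norm-matching step, and the adjacent-pair pairing are all assumptions for illustration, not the card's actual method (a genuinely "maximally disjoint" pairing would need a similarity criterion not given here).

```python
import numpy as np

def fourier_interpolate(a: np.ndarray, b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Blend two same-shaped weight tensors in the Fourier domain,
    then rescale so the result keeps a comparable norm (assumed scheme)."""
    fa = np.fft.rfft(a.ravel())
    fb = np.fft.rfft(b.ravel())
    mixed = (1.0 - t) * fa + t * fb
    out = np.fft.irfft(mixed, n=a.size).reshape(a.shape)
    # Normalize: match the interpolated norm of the two inputs.
    target = (1.0 - t) * np.linalg.norm(a) + t * np.linalg.norm(b)
    return out * (target / (np.linalg.norm(out) + 1e-12))

def recursive_pairwise_merge(tensors: list) -> np.ndarray:
    """Recursively merge a list of tensors pairwise until one remains.
    Pairing here is simply adjacent pairs, for illustration only."""
    if len(tensors) == 1:
        return tensors[0]
    merged = [fourier_interpolate(tensors[i], tensors[i + 1])
              for i in range(0, len(tensors) - 1, 2)]
    if len(tensors) % 2:  # odd count: carry the last tensor forward
        merged.append(tensors[-1])
    return recursive_pairwise_merge(merged)
```

In a real merge this would run per-parameter across every tensor in the models' state dicts, with the recursion halving the model list each round.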
## Hyperdrive

Hyperdrive is an in-progress (and still incomplete) Llama 3 base model trained on my hyperdrive dataset.
## Results

I'm really liking this one so far in my tests, and it performs well using the Llama 3 instruct template.