---
license: llama3
library_name: transformers
tags:
- llama3
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65b19c1b098c85365af5a83e/uiZN2FW_kLgUA2f93V_pw.png)

# Badger ι Llama 3 8B Instruct

Badger is a *recursive maximally pairwise disjoint normalized Fourier interpolation* of the following models (see the sketch below the list):
```python
# Badger Iota
models = [
 'L3-TheSpice-8b-v0.8.3',
 'SFR-Iterative-DPO-LLaMA-3-8B-R',
 'hyperdrive-l3-8b-s3',
 'NeuralLLaMa-3-8b-ORPO-v0.3',
 'llama-3-cat-8b-instruct-pytorch',
 'meta-llama-3-8b-instruct-hf-ortho-baukit-5fail-3000total-bf16',
 'Llama-3-Instruct-8B-SimPO',
 'opus-v1.2-llama-3-8b-instruct-run3.5-epoch2.5',
 'badger-zeta',
 'badger-eta',
 'Llama-3-8B-Instruct-Gradient-1048k',
 'Mahou-1.0-llama3-8B',
 'badger-l3-instruct-32k',
 'LLaMAntino-3-ANITA-8B-Inst-DPO-ITA',
]
```
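The exact merge script is not included in this card, so the following is only a minimal sketch of what a single pairwise step of a normalized Fourier interpolation of two weight tensors could look like; the function name and the norm-preserving rescaling are illustrative assumptions, not the author's actual implementation:

```python
# Illustrative sketch only: pairwise interpolation of two same-shaped
# weight tensors in the Fourier domain, renormalized to keep the
# magnitude of the inputs. The recursive/disjoint scheduling of model
# pairs is not shown here.
import torch

def fourier_interpolate(a: torch.Tensor, b: torch.Tensor, t: float = 0.5) -> torch.Tensor:
    fa = torch.fft.rfftn(a.float())
    fb = torch.fft.rfftn(b.float())
    # Blend the two spectra, then transform back to weight space.
    merged = torch.fft.irfftn((1 - t) * fa + t * fb, s=a.shape)
    # Rescale so the merged tensor keeps a norm comparable to the inputs.
    target_norm = (1 - t) * a.norm() + t * b.norm()
    return (merged * (target_norm / merged.norm())).to(a.dtype)
```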

### Hyperdrive

Hyperdrive is an in-progress (and still incomplete) Llama 3 base model trained on my hyperdrive dataset.

### Results

I'm really liking this one so far in my tests, and it performs well using the instruct template.
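For reference, a minimal usage sketch with `transformers` and the instruct chat template is shown below; the repository id is a placeholder, so substitute the actual model path for this card:

```python
# Minimal usage sketch; "path/to/badger-iota" is a placeholder repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/badger-iota"  # placeholder, not the real repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize model merging in one sentence."},
]
# Apply the Llama 3 instruct template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```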