Commit cc56afd (parent 56d9833) by athirdpath: Update README.md
<p align="center"><img src="https://iili.io/JzixYiP.png"/></p>
<p align="center"><font size="6"><b><a href="https://iili.io/Jzix7WB.png">NSFW - Erotic(?) Writing Example - NSFW</a></b></font></p>
<p align="center"><font size="3"> <b>(That's not what it's finetuned for, okay? He's a grower.)</b></font></p>
### Training Log
1) Well, I've reduced per-step loss by 0.36 (from 1.57 to 1.21) in a third of an epoch. For comparison, the (meh) 13b Mistral glue LoRA reduced per-step loss by only 0.37 (2.16 to 1.79) over a full 4 epochs!
2) First 2 evals both came in at < 1, MUCH better than the 13b attempt. Verdict is that Mistral (or 7Bs in general, I'd guess) can't meaningfully survive more than one cut. Per-step loss reduced by 0.47 @ 1 epoch.
3) Halfway there. Eval loss < 0.85 for the last 2 evals, promising. Per-step loss down to ~1.07, a reduction of ~33%!
4) 80% done. Curve is greatly flattened, so 3 epochs seems like it was the right call. Eval down to 0.81 and per-step loss down to 0.93. Can't wait to test!
5) Done! Testing time.
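As a quick sanity check on the figures quoted above, the relative loss reductions work out as follows (the `rel_reduction` helper below is just for illustration, not part of the training code):

```python
def rel_reduction(start: float, end: float) -> float:
    """Fractional reduction in per-step loss between two checkpoints."""
    return (start - end) / start

# Entry 1: 1.57 -> 1.21 is an absolute drop of 0.36
assert abs((1.57 - 1.21) - 0.36) < 1e-9

# Entry 3: 1.57 -> ~1.07 is a relative reduction of roughly a third
print(f"{rel_reduction(1.57, 1.07):.0%}")  # prints 32%
```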
### Dataset