Update README.md
README.md CHANGED
@@ -37,7 +37,7 @@ PPO_Pygway combines `ppo_hh_gpt-j`, `Janeway-6b` and `Pygmalion-6b`; all three m
 ```
 With X & Y being the model weights, and A/B being how strongly they are represented within the final value.
 The intent of this is to elevate the end-model by borrowing the strongly represented aspects out of each base model,
-but may
+but may also weaken other facets of each model, which can be desirable if the base models have problematic traits that need to be worked on.
 
 Blend was done in FP32 and output saved in FP16 for reduced storage needs.
 
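The weighted blend the README describes (each parameter of the merged model is `A*X + B*Y`, computed in full precision) can be sketched as below. This is a hypothetical illustration, not the repository's actual merge script: plain Python dicts of floats stand in for real model state dicts, and the `blend` helper name is an assumption.

```python
def blend(x, y, a, b):
    """Blend two parameter sets: out = a*X + b*Y, computed per parameter.

    x, y: dicts mapping parameter names to values (stand-ins for model
    state dicts); a, b: how strongly each model is represented in the
    final value.
    """
    # Both models must share the same architecture / parameter names.
    assert x.keys() == y.keys(), "models must share the same parameters"
    # Blend in full precision (the README notes FP32); the real merge
    # would then cast the result down to FP16 for storage.
    return {name: a * x[name] + b * y[name] for name in x}

# Toy example with scalar "parameters":
model_x = {"w": 1.0, "b": 0.0}
model_y = {"w": 3.0, "b": 2.0}
merged = blend(model_x, model_y, 0.5, 0.5)
# An equal 50/50 blend lands halfway between the two models' values.
```

With real checkpoints the same per-tensor arithmetic would run over `state_dict()` tensors instead of scalars, with the coefficients chosen per source model.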