Update README.md
README.md CHANGED
@@ -31,7 +31,7 @@ Or any other topic, and the model will carry on in this back and forth style.
 For more details, check out the related source models, especially [Pygmalion/Pygmalion-6b](https://huggingface.co/Pygmalion/Pygmalion-6b), which documents the chat formatting the model expects.
 
 In a similar manner to fine-tuning, merging weights does not add information but transforms it; it is therefore important to consider trade-offs.
 
-PPO_Pygway combines `ppo_hh_gpt-j`, `Janeway-6b` and `Pygmalion-6b`; all three models were blended in a two-step process using
+PPO_Pygway combines `ppo_hh_gpt-j`, `Janeway-6b` and `Pygmalion-6b`; all three models were blended in a two-step process using a simple weighted-parameter method
 ```
 (X*A + Y*B)
 ```
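For readers unfamiliar with weighted-parameter merging, here is a minimal sketch of a single `(X*A + Y*B)` step in PyTorch. The model IDs, the 0.6/0.4 split, and the output path are illustrative assumptions, not the actual PPO_Pygway recipe; the full two-step blend described above would repeat this pairwise merge with the third source model.

```python
# Minimal sketch of one (X*A + Y*B) weighted-parameter merge step.
# NOTE: the model IDs and blend weights below are illustrative assumptions,
# not the exact recipe used to build PPO_Pygway.
import torch
from transformers import AutoModelForCausalLM

X, Y = 0.6, 0.4  # blend weights; X + Y = 1.0 keeps parameter scale unchanged

# Both checkpoints share the GPT-J architecture, so their state dicts
# have identical keys and tensor shapes.
model_a = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6b", torch_dtype=torch.float32
)
model_b = AutoModelForCausalLM.from_pretrained(
    "KoboldAI/GPT-J-6B-Janeway", torch_dtype=torch.float32
)

state_a = model_a.state_dict()
state_b = model_b.state_dict()

# Merge tensor-by-tensor; leave non-float entries (e.g. boolean attention
# mask buffers) untouched, since averaging them is meaningless.
merged = {
    name: X * state_a[name] + Y * state_b[name]
    if state_a[name].is_floating_point()
    else state_a[name]
    for name in state_a
}

model_a.load_state_dict(merged)
model_a.save_pretrained("merged-gpt-j-6b")  # hypothetical output path
```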