Update README.md
README.md CHANGED

@@ -14,7 +14,7 @@ This is a merged model, using a weighted parameter blend strategy at a (20:20
 
 By their respective authors.
 
-**Warning:
+**Warning: PPO_Pygway-6b may generate NSFW or inappropriate content due to the base models (mainly [Pygmalion/Pygmalion-6b](https://huggingface.co/Pygmalion/Pygmalion-6b)) being trained on general user logs and internet archives.**
 
 ### Intended Use:
 
@@ -28,7 +28,7 @@ You: "I am doing just fine, thank you."
 Or any other topic, and the model will carry on in this back and forth style.
 
 ## Information:
-For more details, check out the related source models, especially [Pygmalion-6b](https://huggingface.co/Pygmalion/Pygmalion-6b) for more information on how to utilize the expected chat bot formatting.
+For more details, check out the related source models, especially [Pygmalion/Pygmalion-6b](https://huggingface.co/Pygmalion/Pygmalion-6b) for more information on how to utilize the expected chat bot formatting.
 
 In a similar manner to fine-tuning, merging weights does not add information but transforms it; therefore it is important to consider trade-offs.
 PPO_Pygway combines `ppo_hh_gpt-j`, `Janeway-6b` and `Pygmalion-6b`; all three models were blended in a two-step process using a simple weighted parameter method.
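The two-step weighted parameter blend the README describes can be sketched roughly as follows. This is a minimal illustration only: it uses plain Python floats in place of real model tensors, and the `blend` helper, the toy one-parameter "models", and the blend ratios are all hypothetical, not the authors' exact procedure or weights.

```python
def blend(sd_a, sd_b, weight_a):
    """Weighted parameter blend of two state dicts:
    out[k] = weight_a * a[k] + (1 - weight_a) * b[k]."""
    return {k: weight_a * sd_a[k] + (1.0 - weight_a) * sd_b[k] for k in sd_a}

# Hypothetical toy "models": a single shared parameter each, as plain floats.
ppo_hh = {"w": 1.0}
janeway = {"w": 3.0}
pygmalion = {"w": 5.0}

# Step 1: blend the first two models (illustrative 50/50 ratio).
step1 = blend(ppo_hh, janeway, 0.5)    # {"w": 2.0}

# Step 2: blend the intermediate result with the third model
# (illustrative 40/60 ratio).
merged = blend(step1, pygmalion, 0.4)  # {"w": 0.4*2.0 + 0.6*5.0 = 3.8}
```

In practice the same element-wise averaging would be applied to every tensor in the models' state dicts, with the two per-step ratios chosen so the final contribution of each source model matches the intended overall split.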