add bench
README.md CHANGED
@@ -39,7 +39,7 @@ This is abliterated model of [google/gemma-2-2b-jpn-it](https://huggingface.co/g
 described by mlabonne.

 Layer 24 of the original model was chosen for abliteration.
-I also created models with layer
+I also created models with layer 17 and 18 abliterated respectively for comparison.
 These three layers were chosen because they all produce an uncensored response
 after the respective layer is abliterated.

@@ -57,7 +57,8 @@ Click on the model name to go to the raw score json generated by Open LLM Leaderboard.
 | [gemma-2-2b-jpn-it](https://huggingface.co/datasets/open-llm-leaderboard/results/blob/main/google/gemma-2-2b-jpn-it/results_2024-10-15T15-21-39.173019.json) | 30.82 | 54.11 | 41.43 | 0.0 | 27.52 | 37.17 | 24.67 |
 | [gemma-2-2b-jpn-it-abliterated-17](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-17/results_2024-10-18T15-18-46.821674.json) | 30.29 | 52.65 | 40.46 | 0.0 | 27.18 | 36.90 | 24.55 |
 | [gemma-2-2b-jpn-it-abliterated-18](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-18/results_2024-10-18T15-41-42.399571.json) | 30.61 | 53.02 | 40.96 | 0.0 | 27.35 | 37.30 | 25.05 |
-| gemma-2-2b-jpn-it-abliterated-24 |
+| [gemma-2-2b-jpn-it-abliterated-24](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-24/results_2024-10-25T16-29-46.542899.json) | 30.61 | 51.37 | 40.77 | 0.0 | 27.77 | 39.02 | 24.73 |
+

 It is only slightly dumber than the original.
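For context, the abliteration method the README attributes to mlabonne works by estimating a "refusal direction" in the residual stream at a chosen layer (here layer 24) and projecting that direction out of the model's weights, so the model can no longer write along it. A minimal NumPy sketch of the projection step only — the function name and shapes are illustrative assumptions, not taken from the commit, and estimating the refusal direction itself is out of scope:

```python
import numpy as np

def ablate_direction(W: np.ndarray, refusal_dir: np.ndarray) -> np.ndarray:
    """Remove a direction from a weight matrix's output space: (I - r r^T) W.

    Assumes `refusal_dir` was already estimated elsewhere (e.g. as the mean
    activation difference between harmful and harmless prompts at the layer).
    """
    r = refusal_dir / np.linalg.norm(refusal_dir)  # unit refusal direction
    return W - np.outer(r, r @ W)                  # subtract the r-component

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))      # toy stand-in for a layer's weights
r = rng.standard_normal(8)           # toy stand-in for the refusal direction
W_abl = ablate_direction(W, r)

# The ablated weights produce no output component along r for any input:
print(np.allclose((r / np.linalg.norm(r)) @ W_abl, 0.0))  # → True
```

Applying this to a single layer (17, 18, or 24 here) while leaving the rest of the network untouched is why the benchmark scores above change only slightly.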