Click on the model name to go to the raw score json generated by Open LLM Leaderboard.

| Model | Average | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO |
| ----- | ------- | ------ | --- | ---------- | ---- | ---- | -------- |
| [gemma-2-2b-jpn-it](https://huggingface.co/datasets/open-llm-leaderboard/results/blob/main/google/gemma-2-2b-jpn-it/results_2024-10-15T15-21-39.173019.json) | 30.82 | 54.11 | 41.43 | 0.0 | 27.52 | 37.17 | 24.67 |
| [gemma-2-2b-ORPO-jpn-it-abliterated-18-merge (5 epochs)](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge/results_2024-10-30T17-06-58.119904.json) | 29.26 | 49.16 | 38.15 | 2.49 | 28.19 | 33.07 | 24.51 |
| [gemma-2-2b-ORPO-jpn-it-abliterated-18-merge (10 epochs)](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge/results_2024-11-18T07-53-54.972969.json) | 30.65 | 53.81 | 41.21 | 0.83 | 28.36 | 35.05 | 24.61 |
| [gemma-2-2b-ORPO-jpn-it-abliterated-18 (5 epochs)](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18/results_2024-10-30T22-19-29.202883.json) | 29.57 | 48.05 | 41.26 | 0.0 | 27.18 | 36.51 | 24.43 |
| [gemma-2-2b-ORPO-jpn-it-abliterated-18 (10 epochs)](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18/results_2024-11-18T08-02-58.149334.json) | 29.68 | 47.76 | 40.20 | 0.38 | 28.86 | 37.43 | 23.45 |
| [gemma-2-2b-jpn-it-abliterated-17](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-17/results_2024-10-18T15-18-46.821674.json) | 30.29 | 52.65 | 40.46 | 0.0 | 27.18 | 36.90 | 24.55 |
| [gemma-2-2b-jpn-it-abliterated-18](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-18/results_2024-10-18T15-41-42.399571.json) | 30.61 | 53.02 | 40.96 | 0.0 | 27.35 | 37.30 | 25.05 |
| [gemma-2-2b-jpn-it-abliterated-24](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-24/results_2024-10-25T16-29-46.542899.json) | 30.61 | 51.37 | 40.77 | 0.0 | 27.77 | 39.02 | 24.73 |
| [gemma-2-2b-jpn-it-abliterated-17-18-24](https://huggingface.co/datasets/open-llm-leaderboard/results/raw/main/ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24/results_2024-11-06T19-05-49.169139.json) | 29.17 | 51.33 | 37.82 | 0.0 | 28.10 | 34.92 | 22.82 |

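The Average column in each row is just the plain mean of the six per-benchmark scores, so any row is easy to re-check, for example the first one:

```py
# Recompute the leaderboard Average for the gemma-2-2b-jpn-it row above.
scores = [54.11, 41.43, 0.0, 27.52, 37.17, 24.67]  # IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO
print(round(sum(scores) / len(scores), 2))  # 30.82, matching the Average column
```
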
The abliterated-18-merge model (30.65 average) is slightly better than the abliterated-18 model (29.68) but still slightly worse than the original gemma-2-2b-jpn-it instruct model (30.82).

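Each model name above links to its raw score json in the `open-llm-leaderboard/results` dataset, so the per-benchmark numbers can also be pulled programmatically. Below is a minimal sketch using `huggingface_hub`; the `results` key layout inside the json is an assumption based on typical leaderboard output, not something stated in this card.

```py
# Download one raw result file linked in the table above and print its scores.
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/results",
    repo_type="dataset",
    filename="ymcki/gemma-2-2b-jpn-it-abliterated-18/results_2024-10-18T15-41-42.399571.json",
)

with open(path, encoding="utf-8") as f:
    data = json.load(f)

# Assumed layout: a top-level "results" dict keyed by benchmark/task name.
for task, metrics in data.get("results", {}).items():
    print(task, metrics)
```

The printed per-task metrics should line up with the benchmark columns shown in the table above.
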
## How to run this model
```py