End of training
- README.md +11 -11
- adapter_model.bin +1 -1
README.md
CHANGED
@@ -107,7 +107,7 @@ xformers_attention: null
 
 This model is a fine-tuned version of [unsloth/codegemma-7b-it](https://huggingface.co/unsloth/codegemma-7b-it) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
+- Loss: 0.3544
 
 ## Model description
 
@@ -142,16 +142,16 @@ The following hyperparameters were used during training:
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:------:|:----:|:---------------:|
 | 0.5845 | 0.0008 | 1 | 0.6241 |
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| 0.5454 | 0.0041 | 5 | 0.5515 |
+| 0.4878 | 0.0082 | 10 | 0.4380 |
+| 0.3675 | 0.0122 | 15 | 0.3925 |
+| 0.3737 | 0.0163 | 20 | 0.3816 |
+| 0.367 | 0.0204 | 25 | 0.3707 |
+| 0.3414 | 0.0245 | 30 | 0.3651 |
+| 0.3589 | 0.0286 | 35 | 0.3596 |
+| 0.5255 | 0.0327 | 40 | 0.3551 |
+| 0.3261 | 0.0367 | 45 | 0.3554 |
+| 0.3163 | 0.0408 | 50 | 0.3544 |
 
 
 ### Framework versions
adapter_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:d33702a9cb3c4ef029a259e39d5215c79c5e4e9a2b4d10db7cff8715ef7db258
 size 200157226
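For context, here is a minimal sketch of how an adapter checkpoint like the adapter_model.bin updated in this commit is typically loaded on top of the unsloth/codegemma-7b-it base model with PEFT. The adapter repo id below is a placeholder rather than a name taken from this commit, and the loading flow is an assumption based on the file layout, not something this diff confirms.

```python
# Minimal sketch, assuming the commit updates a LoRA/PEFT adapter
# (adapter_model.bin) trained on top of unsloth/codegemma-7b-it.
# "your-username/codegemma-7b-it-lora" is a placeholder repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/codegemma-7b-it"
adapter_id = "your-username/codegemma-7b-it-lora"  # placeholder, not from this commit

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# PeftModel reads the adapter weights (adapter_model.bin) from the adapter repo
# and applies them on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If a standalone checkpoint is preferred, `model.merge_and_unload()` can fold the adapter into the base weights; that step is optional and not implied by this commit.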