Update README.md
README.md CHANGED

```diff
@@ -21,13 +21,11 @@ The finetuning session got completed in 1 hour and 30 minutes and costed us only
 #### Hyperparameters & Run details:
 - Model Path: meta-llama/Llama-2-7b-hf
 - Dataset: garage-bAInd/Open-Platypus
-- Learning rate: 0.
+- Learning rate: 0.0002
 - Number of epochs: 5
 - Data split: Training: 90% / Validation: 10%
 - Gradient accumulation steps: 1
 
-Loss metrics:
-![training loss](train-loss.png "Training loss")
 
 ---
 license: apache-2.0
```
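The run details in the diff above can be captured as a small config sketch. This is illustrative only, assuming a Hugging Face-style training setup; the class and function names here are hypothetical and not taken from the repository's actual training script.

```python
from dataclasses import dataclass


@dataclass
class FinetuneConfig:
    # Values below come from the README's run details; the structure is assumed.
    model_path: str = "meta-llama/Llama-2-7b-hf"
    dataset: str = "garage-bAInd/Open-Platypus"
    learning_rate: float = 0.0002  # the corrected value from the diff
    num_epochs: int = 5
    train_fraction: float = 0.9    # 90% training / 10% validation
    gradient_accumulation_steps: int = 1


def split_sizes(n_examples: int, cfg: FinetuneConfig) -> tuple[int, int]:
    """Return (train, validation) example counts for the 90/10 data split."""
    n_train = int(n_examples * cfg.train_fraction)
    return n_train, n_examples - n_train
```

For example, `split_sizes(1000, FinetuneConfig())` yields `(900, 100)`, matching the 90%/10% split stated in the run details.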