Update README.md
README.md
@@ -7,7 +7,8 @@ language:
 ---
 This repo contains a low-rank adapter for LLaMA-13b fit on the Stanford Alpaca dataset.
 
-This version of the weights was trained on dual RTX3090
+This version of the weights was trained on a dual RTX3090 system, powered by solar energy. Total training time was about 24 hours.
+The following hyperparameters were used:
 
 Epochs: 10
 Batch size: 128
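For context, a minimal sketch of how a low-rank adapter like this one is typically attached to the LLaMA-13b base model with the PEFT library. The base-model path and adapter repo ID below are placeholders, not identifiers given in this README, and the exact loading setup is an assumption.

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

# Placeholder identifiers: substitute the actual base checkpoint and this adapter's repo ID.
BASE_MODEL = "path/to/llama-13b-hf"      # assumed base model, not named in this README
ADAPTER = "path/or/repo-id-of-this-adapter"

# Load the base LLaMA-13b weights (fp16 to fit on consumer GPUs such as the RTX 3090s
# mentioned above; device_map="auto" requires the accelerate package).
base_model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)

# Attach the low-rank adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, ADAPTER)
model.eval()
```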