Update README.md

README.md CHANGED

@@ -8,8 +8,8 @@ This model is a causal language model based on the `microsoft/phi-1_5` and has b
 - **Base Model**: `microsoft/phi-1_5`
 - **Fine-tuning Dataset**: `vicgalle/alpaca-gpt4`
 - **Hardware**: NVIDIA 3090 Ti
-- **Training Duration**:
+- **Training Duration**: 14 hours
 - **VRAM Consumption**: Approx. 20 GB
 - **Token Max Length**: 2048
 - **Model Size**: 1.5 billion parameters, with QLoRA weights merged
 
@@ -82,3 +82,11 @@ Because the base model is microsoft phi-1.5b model, this fine-tuned model is pro
 I am a medical doctor interested in the ML/NLP field.
 If you have any advice, suggestions, or opportunities, or simply want to discuss the fascinating intersection of medicine and technology, please don't hesitate to reach out.
+
+---
+language:
+- English
+license: MICROSOFT RESEARCH LICENSE
+datasets:
+- vicgalle/alpaca-gpt4
+---
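The hardware numbers in the first hunk hang together: with QLoRA, the quantized base weights are only a small slice of the ~20 GB budget, which is why a 1.5B-parameter model fine-tunes on a single 3090 Ti. A back-of-the-envelope sketch, assuming the standard QLoRA 4-bit (NF4) quantization — the README does not state the bit width, so that part is my assumption:

```python
# Rough VRAM arithmetic for QLoRA fine-tuning of a 1.5B-parameter model.
# Assumption (not stated in the diff above): base weights quantized to 4 bits,
# as in standard QLoRA. The remainder of the ~20 GB goes to LoRA adapters,
# optimizer state, and activations at the 2048-token max length.

PARAMS = 1.5e9          # base model parameters
BITS_PER_WEIGHT = 4     # NF4 quantization
FP16_BITS = 16          # an unquantized fp16 copy, for contrast

quantized_gb = PARAMS * BITS_PER_WEIGHT / 8 / 1e9
fp16_gb = PARAMS * FP16_BITS / 8 / 1e9

print(f"4-bit base weights: {quantized_gb:.2f} GB")  # 0.75 GB
print(f"fp16 base weights:  {fp16_gb:.2f} GB")       # 3.00 GB
```

So the frozen base model itself accounts for well under 1 GB in 4-bit form; the bulk of the reported ~20 GB is training state and activations, not weights.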
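Since the fine-tuning data is `vicgalle/alpaca-gpt4`, inference prompts presumably follow the standard Alpaca instruction template. The diff above does not show the template actually used, so the sketch below is an assumption based on the dataset's usual format, not the author's confirmed prompt:

```python
# Sketch of the standard Alpaca prompt template, which the vicgalle/alpaca-gpt4
# dataset follows. Whether this exact template was used for this fine-tune is
# an assumption; it is not stated in the README diff.

def build_prompt(instruction: str, input_text: str = "") -> str:
    """Format an instruction (and optional input) in Alpaca style."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Explain what QLoRA is."))
```

The model's completion would then be read from the text generated after the `### Response:` marker.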