---
library_name: transformers
license: apache-2.0
base_model: PharMolix/BioMedGPT-LM-7B
language:
- en
---

16-bit (`float16`) version of the weights from `PharMolix/BioMedGPT-LM-7B`, for easier downloading, finetuning, and model merging.

Code to load the original weights in 16-bit:

```python
import torch
from transformers import AutoModelForCausalLM

# Load the original checkpoint, casting the weights to float16 on load
# and letting accelerate place them across available devices.
model = AutoModelForCausalLM.from_pretrained(
    "PharMolix/BioMedGPT-LM-7B",
    torch_dtype=torch.float16,
    device_map="auto",
)
```
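To see why the `float16` copy is roughly half the download size, a minimal sketch with a plain tensor (the shapes here are illustrative, not taken from the model):

```python
import torch

# float32 stores 4 bytes per element; float16 stores 2.
w32 = torch.zeros(1024, 1024, dtype=torch.float32)
w16 = w32.to(torch.float16)

bytes32 = w32.element_size() * w32.nelement()  # 4 * 1024 * 1024
bytes16 = w16.element_size() * w16.nelement()  # 2 * 1024 * 1024
print(bytes32, bytes16)  # the float16 copy is exactly half the size
```

The same 2x saving applies to every weight matrix in the checkpoint, which is what makes the 16-bit repo lighter to download and merge.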