---
library_name: transformers
license: apache-2.0
base_model: PharMolix/BioMedGPT-LM-7B
language:
- en
---
A 16-bit (float16) copy of the weights from `PharMolix/BioMedGPT-LM-7B`, for easier downloading, fine-tuning, and model merging.
## Code
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and the model weights in 16-bit (float16) precision,
# placing layers across available devices automatically.
tokenizer = AutoTokenizer.from_pretrained("PharMolix/BioMedGPT-LM-7B")
model = AutoModelForCausalLM.from_pretrained(
    "PharMolix/BioMedGPT-LM-7B",
    torch_dtype=torch.float16,
    device_map="auto",
)
```
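The conversion itself amounts to casting the loaded weights to float16 and calling `save_pretrained`. The sketch below demonstrates this with a tiny, randomly initialised Llama configuration so it runs without downloading the 7B checkpoint; the config sizes and the output directory name are illustrative only — in practice you would load `PharMolix/BioMedGPT-LM-7B` as shown above and save that.

```python
import torch
from transformers import LlamaConfig, LlamaForCausalLM

# Tiny randomly initialised Llama stand-in (illustrative sizes) so the
# sketch runs quickly; substitute the real 7B model in practice.
config = LlamaConfig(
    hidden_size=64,
    intermediate_size=128,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=4,
    vocab_size=256,
)
model = LlamaForCausalLM(config)

# Cast the float32 weights down to float16 and save the half-precision copy.
model = model.half()
model.save_pretrained("biomedgpt-fp16-demo")  # hypothetical output directory

# Every parameter is now stored in 2 bytes instead of 4.
print(all(p.dtype == torch.float16 for p in model.parameters()))
```

Saving after the cast is what roughly halves the on-disk size relative to a float32 checkpoint, which is the point of publishing a 16-bit copy.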