chaoyi-wu/PMC_LLAMA_7B_10_epoch
Tags: Text Generation · Transformers · PyTorch · llama · medical · text-generation-inference · Inference Endpoints
Dataset: allenai/s2orc
License: apache-2.0
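
The tags above mark this as a LLaMA-family causal LM served for text generation through the Transformers library. A minimal loading sketch under that assumption, using only the stock `from_pretrained` API (the prompt and generation settings are illustrative, not taken from the model card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "chaoyi-wu/PMC_LLAMA_7B_10_epoch"

# Assumption: the repo follows the standard LLaMA layout visible in the
# file list below (config.json, tokenizer.model, sharded weight files).
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto")

prompt = "The mechanism of action of metformin is"  # illustrative medical prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```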
Files and versions (viewing ref: refs/pr/5)
1 contributor · History: 10 commits
Latest commit: 3223398 (verified, 30 days ago) by SFconvertbot: "Adding `safetensors` variant of this model"
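
The latest commit adds a `safetensors` variant alongside the original pickle-based `.bin` shards. A minimal sketch of opting into that format at load time; `use_safetensors` and `revision` are standard Transformers/Hub kwargs, and the assumption here is that this ref (refs/pr/5) is where the safetensors files live:

```python
from transformers import AutoModelForCausalLM

# Assumption: with use_safetensors=True, from_pretrained resolves
# model.safetensors.index.json and fetches only the three
# model-0000*-of-00003.safetensors shards, never the pickle .bin files.
model = AutoModelForCausalLM.from_pretrained(
    "chaoyi-wu/PMC_LLAMA_7B_10_epoch",
    revision="refs/pr/5",  # the ref shown above, carrying the safetensors variant
    use_safetensors=True,
)
```

Safetensors is a flat tensor format with no embedded code, so loading it cannot execute anything, unlike unpickling the `.bin` shards listed below.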
| File | Size | Last commit message | Last modified |
|------|------|---------------------|---------------|
| .gitattributes | 1.48 kB | initial commit | over 1 year ago |
| README.md | 1.08 kB | Update README.md | over 1 year ago |
| config.json | 601 Bytes | Upload 7 files | over 1 year ago |
| generation_config.json | 137 Bytes | Upload 7 files | over 1 year ago |
| model-00001-of-00003.safetensors | 9.88 GB (LFS) | Adding `safetensors` variant of this model | 30 days ago |
| model-00002-of-00003.safetensors | 9.89 GB (LFS) | Adding `safetensors` variant of this model | 30 days ago |
| model-00003-of-00003.safetensors | 7.18 GB (LFS) | Adding `safetensors` variant of this model | 30 days ago |
| model.safetensors.index.json | 28.1 kB | Adding `safetensors` variant of this model | 30 days ago |
| pytorch_model-00001-of-00003.bin | 9.88 GB (LFS, pickle) | Upload 3 files | over 1 year ago |
| pytorch_model-00002-of-00003.bin | 9.89 GB (LFS, pickle) | Upload 3 files | over 1 year ago |
| pytorch_model-00003-of-00003.bin | 7.18 GB (LFS, pickle) | Upload 3 files | over 1 year ago |
| pytorch_model.bin.index.json | 26.8 kB | Upload 7 files | over 1 year ago |
| special_tokens_map.json | 2 Bytes | Upload 7 files | over 1 year ago |
| tokenizer.model | 500 kB (LFS) | Upload 7 files | over 1 year ago |
| tokenizer_config.json | 141 Bytes | Update tokenizer_config.json | over 1 year ago |

The Hub's pickle scanner flags each of the three `pytorch_model-*.bin` shards with the same four detected pickle imports: `torch.BFloat16Storage`, `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`, and `torch.FloatStorage`.
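
A "pickle import" is a global (`module.name`) that unpickling the file would import and potentially call; the Hub scans for these statically because `pickle.load` on an untrusted checkpoint can execute arbitrary code. A rough sketch of the same idea using the standard library's `pickletools`, which decodes opcodes without executing them (the `.pkl`-member lookup assumes the usual PyTorch zip-checkpoint layout, and the STACK_GLOBAL pairing is a simplification):

```python
import pickletools
import zipfile

def pickle_imports(checkpoint_path: str) -> set[str]:
    """Statically list the globals a PyTorch zip checkpoint would import."""
    with zipfile.ZipFile(checkpoint_path) as zf:
        # Assumption: torch.save archives keep their pickle stream in a
        # member named like <archive>/data.pkl.
        pkl_member = next(n for n in zf.namelist() if n.endswith(".pkl"))
        stream = zf.read(pkl_member)

    imports: set[str] = set()
    recent_strings: list[str] = []  # candidate args for a STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(stream):
        if opcode.name == "GLOBAL":
            # Protocol <= 3: arg is "module qualname" in one string.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(recent_strings) >= 2:
            # Protocol 4+: module and name were pushed as the two
            # preceding strings (a heuristic, good enough for a scan).
            imports.add(".".join(recent_strings[-2:]))
        if isinstance(arg, str):
            recent_strings.append(arg)
    return imports

# For these shards the Hub scan reports exactly:
# torch.BFloat16Storage, torch._utils._rebuild_tensor_v2,
# collections.OrderedDict, torch.FloatStorage
print(sorted(pickle_imports("pytorch_model-00001-of-00003.bin")))
```

All four imports detected here are ordinary `torch`/`collections` reconstruction helpers, which is what a plain `torch.save` state dict looks like; imports outside those namespaces would be worth treating as suspicious.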