facebook/nllb-200-3.3B
by AI at Meta
Tags: Translation · Transformers · PyTorch · flores-200 · 196 languages · m2m_100 · text2text-generation · nllb
License: cc-by-nc-4.0
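
The tags above (Transformers, PyTorch, text2text-generation, flores-200) describe how the checkpoint is meant to be consumed: as a sequence-to-sequence translation model whose source and target languages are selected with FLORES-200 codes. A minimal sketch, assuming the `transformers` library and PyTorch are installed; "eng_Latn" and "fra_Latn" are illustrative language codes, not the only supported pair:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "facebook/nllb-200-3.3B"

# src_lang selects the FLORES-200 code of the input language.
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("The weather is nice today.", return_tensors="pt")
generated = model.generate(
    **inputs,
    # Force decoding to start with the target-language token.
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_length=64,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```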
Files and versions (branch: refs/pr/4)
3 contributors · History: 7 commits
Latest commit: lbourdois, "Add multilingual to the language tag" (7e6e9c7), almost 2 years ago
| File | Size | Last commit | Age |
|---|---|---|---|
| .gitattributes | 1.22 kB | Add tokenizer files | over 2 years ago |
| README.md | 7.66 kB | Add multilingual to the language tag | almost 2 years ago |
| config.json | 808 Bytes | Update config.json | over 2 years ago |
| pytorch_model-00001-of-00003.bin | 6.93 GB (LFS) | Add modeling files | over 2 years ago |
| pytorch_model-00002-of-00003.bin | 8.55 GB (LFS) | Add modeling files | over 2 years ago |
| pytorch_model-00003-of-00003.bin | 2.1 GB (LFS) | Add modeling files | over 2 years ago |
| pytorch_model.bin.index.json | 90 kB | Add modeling files | over 2 years ago |
| sentencepiece.bpe.model | 4.85 MB (LFS) | Add tokenizer files | over 2 years ago |
| special_tokens_map.json | 3.55 kB | Add tokenizer files | over 2 years ago |
| tokenizer.json | 17.3 MB (LFS) | Add tokenizer files | over 2 years ago |
| tokenizer_config.json | 564 Bytes | Add tokenizer files | over 2 years ago |

The three pytorch_model-*.bin shards are pickled PyTorch checkpoints; the Hub's pickle scanner detects three imports in each (collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage).
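
The checkpoint is split across the three .bin shards plus pytorch_model.bin.index.json, which maps every parameter name to the shard that contains it; from_pretrained reads this index to download and reassemble the weights. A minimal sketch for inspecting that index, assuming `huggingface_hub` is installed; it fetches only the 90 kB index file, not the roughly 17.6 GB of sharded weights it describes:

```python
import json
from huggingface_hub import hf_hub_download

# Download just the shard index from the Hub.
index_path = hf_hub_download("facebook/nllb-200-3.3B", "pytorch_model.bin.index.json")
with open(index_path) as f:
    index = json.load(f)

# weight_map records which pytorch_model-0000X-of-00003.bin shard holds each tensor.
print(index["metadata"]["total_size"])            # total checkpoint size in bytes
print(sorted(set(index["weight_map"].values())))  # the shard filenames
```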