pipeline_tag (stringclasses, 48 values) | library_name (stringclasses, 198 values) | text (stringlengths, 1–900k) | metadata (stringlengths, 2–438k) | id (stringlengths, 5–122) | last_modified (null) | tags (sequencelengths, 1–1.84k) | sha (null) | created_at (stringlengths, 25–25) | arxiv (sequencelengths, 0–201) | languages (sequencelengths, 0–1.83k) | tags_str (stringlengths, 17–9.34k) | text_str (stringlengths, 0–389k) | text_lists (sequencelengths, 0–722) | processed_texts (sequencelengths, 1–723)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fill-mask | transformers |
# bert-base-en-no-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-no-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-no-cased")
```
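For masked-token prediction, the same checkpoint can also be queried through the `fill-mask` pipeline. The sketch below is not part of the original card and the example sentence is arbitrary:
```python
from transformers import pipeline

# Illustrative sketch: query the masked-language-modelling head of the checkpoint above.
unmasker = pipeline("fill-mask", model="Geotrend/bert-base-en-no-cased")
for prediction in unmasker("Oslo is the capital of [MASK]."):  # arbitrary example prompt
    print(prediction["token_str"], round(prediction["score"], 3))
```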
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-en-no-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-no-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-en-no-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-no-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-pl-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-pl-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-pl-cased")
```
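If you need fixed-size sentence vectors rather than a language-model head, one common option is to mean-pool the last hidden state. This is a minimal sketch, not part of the original card; the Polish example sentence is arbitrary:
```python
import torch
from transformers import AutoTokenizer, AutoModel

# Minimal sketch: mean-pool the final hidden states to get one vector per sentence.
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-pl-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-pl-cased")

inputs = tokenizer("Warszawa jest stolicą Polski.", return_tensors="pt")  # arbitrary example
with torch.no_grad():
    outputs = model(**inputs)

sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768]) for a bert-base model
```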
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-en-pl-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-pl-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-en-pl-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-pl-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-pt-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-pt-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-pt-cased")
```
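The model tags list TensorFlow weights (`tf`), so the checkpoint should also load through the TensorFlow classes. A minimal sketch, assuming the TF weights are available on the Hub; the Portuguese example sentence is arbitrary:
```python
from transformers import AutoTokenizer, TFAutoModel

# Minimal sketch: load the TensorFlow weights advertised in the model tags.
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-pt-cased")
model = TFAutoModel.from_pretrained("Geotrend/bert-base-en-pt-cased")

inputs = tokenizer("Lisboa é a capital de Portugal.", return_tensors="tf")  # arbitrary example
outputs = model(inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```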
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": ["multilingual", "en", "pt"], "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-en-pt-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"en",
"pt",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"pt"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #en #pt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-pt-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-en-pt-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #en #pt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-pt-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-ro-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ro-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-ro-cased")
```
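Since these models are advertised as producing the same representations as the original model, a quick sanity check is to compare hidden states against `bert-base-multilingual-cased` on text in one of the covered languages. This is a rough sketch, not part of the original card; it only compares outputs when both tokenizers segment the input identically:
```python
import torch
from transformers import AutoTokenizer, AutoModel

text = "Bucharest is the capital of Romania."  # arbitrary example in a covered language

small_tok = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ro-cased")
small = AutoModel.from_pretrained("Geotrend/bert-base-en-ro-cased")
full_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
full = AutoModel.from_pretrained("bert-base-multilingual-cased")

# Only compare hidden states when the two vocabularies tokenize the text identically.
if small_tok.tokenize(text) == full_tok.tokenize(text):
    with torch.no_grad():
        small_out = small(**small_tok(text, return_tensors="pt")).last_hidden_state
        full_out = full(**full_tok(text, return_tensors="pt")).last_hidden_state
    print("hidden states match:", torch.allclose(small_out, full_out, atol=1e-5))
```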
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-en-ro-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-ro-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-en-ro-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-ro-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-ru-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ru-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-ru-cased")
```
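The widget prompts declared in the card metadata (e.g. "Paris is the capital of [MASK].") can also be scored by hand with the masked-LM head. A minimal sketch, not part of the original card:
```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Minimal sketch: score one of the widget prompts with the masked-LM head.
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ru-cased")
model = AutoModelForMaskedLM.from_pretrained("Geotrend/bert-base-en-ru-cased")

inputs = tokenizer("Paris is the capital of [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and print the five most likely replacement tokens.
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```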
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-ru-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-ru-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-ru-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-ru-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-sw-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-sw-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-sw-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-sw-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-sw-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-sw-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-sw-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-th-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-th-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-th-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-th-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-th-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-th-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-th-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-tr-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-tr-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-tr-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-tr-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-tr-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-tr-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-tr-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-uk-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-uk-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-uk-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-en-uk-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-uk-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-en-uk-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-uk-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-ur-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ur-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-ur-cased")
```
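For offline use, the checkpoint can be saved to disk after the first download and reloaded from the local path. A minimal sketch, not part of the original card; the directory name is hypothetical:
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ur-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-ur-cased")

local_dir = "./bert-base-en-ur-cased"  # hypothetical local directory
tokenizer.save_pretrained(local_dir)
model.save_pretrained(local_dir)

# Later runs can load from disk without contacting the Hub.
tokenizer = AutoTokenizer.from_pretrained(local_dir)
model = AutoModel.from_pretrained(local_dir)
```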
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-ur-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-ur-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-ur-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-ur-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-vi-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-vi-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-vi-cased")
```
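Batched inference works the same way as with any BERT checkpoint: pad the batch and pass the attention mask through. A minimal sketch, not part of the original card; the example sentences are arbitrary:
```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-vi-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-vi-cased")

sentences = [
    "Hanoi is the capital of Vietnam.",   # arbitrary examples
    "Hà Nội là thủ đô của Việt Nam.",
]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**batch)
print(outputs.last_hidden_state.shape)  # (2, padded_sequence_length, 768)
```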
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-vi-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-vi-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-vi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-vi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-zh-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-zh-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-zh-cased")
```
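To see how the reduced English/Chinese vocabulary segments mixed-language input, you can inspect the tokenizer output directly. A minimal sketch, not part of the original card; the input string is arbitrary:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-zh-cased")
# BERT-style tokenizers segment Chinese into single characters and English into WordPieces.
print(tokenizer.tokenize("Beijing is the capital of 中国."))
```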
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": ["multilingual", "en", "zh"], "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}]} | Geotrend/bert-base-en-zh-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"en",
"zh",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"zh"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #en #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-zh-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-en-zh-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #multilingual #en #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-zh-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-en-zh-hi-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-zh-hi-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-zh-hi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-en-zh-hi-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-en-zh-hi-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-en-zh-hi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-en-zh-hi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-es-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-es-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-es-cased")
```
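The size reduction comes from shrinking the multilingual vocabulary (and its embedding matrix) to the selected languages, which you can verify by comparing vocabulary sizes. A rough sketch, not part of the original card:
```python
from transformers import AutoTokenizer

small_tok = AutoTokenizer.from_pretrained("Geotrend/bert-base-es-cased")
full_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

print("Geotrend/bert-base-es-cased vocab size:", len(small_tok))
print("bert-base-multilingual-cased vocab size:", len(full_tok))
```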
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "es", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-es-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"es",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-es-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-es-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-es-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-fr-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-fr-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-fr-cased")
```
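For illustration, the model can also be queried through the `fill-mask` pipeline. The sketch below is a minimal example: the masked sentence is the one used in this card's widget configuration, and `top_k=3` is an arbitrary choice.

```python
from transformers import pipeline

# Minimal sketch: query the reduced French model through the fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="Geotrend/bert-base-fr-cased")

# The masked sentence below comes from this card's widget configuration.
for prediction in fill_mask("Paris est la [MASK] de la France.", top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```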
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "fr", "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Paris est la [MASK] de la France."}, {"text": "Paris est la capitale de la [MASK]."}, {"text": "L'\u00e9lection am\u00e9ricaine a eu [MASK] en novembre 2020."}]} | Geotrend/bert-base-fr-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"fr",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #fr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-fr-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-fr-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #fr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-fr-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-hi-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-hi-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-hi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "hi", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-hi-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"hi",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"hi"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #hi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-hi-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-hi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #hi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-hi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-it-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-it-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-it-cased")
```
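A minimal sketch of how the "same representations" claim can be checked, assuming the input sentence is fully covered by the reduced vocabulary (the Italian sentence below is only an illustrative choice):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative sentence, assumed to be fully covered by the reduced vocabulary.
sentence = "Roma è la capitale d'Italia."

small_tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-it-cased")
small_model = AutoModel.from_pretrained("Geotrend/bert-base-it-cased")

full_tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
full_model = AutoModel.from_pretrained("bert-base-multilingual-cased")

with torch.no_grad():
    small_hidden = small_model(**small_tokenizer(sentence, return_tensors="pt")).last_hidden_state
    full_hidden = full_model(**full_tokenizer(sentence, return_tensors="pt")).last_hidden_state

# Both tokenizers should split the sentence into the same subwords, so the
# hidden states are expected to match up to floating-point noise.
print(torch.allclose(small_hidden, full_hidden, atol=1e-5))
```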
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "it", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-it-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"it",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"it"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #it #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-it-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-it-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #it #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-it-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-ja-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ja-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-ja-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ja", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-ja-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"ja",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ja"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #ja #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-ja-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-ja-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #ja #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-ja-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-lt-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-lt-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-lt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "lt", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-lt-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"lt",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"lt"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #lt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-lt-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-lt-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #lt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-lt-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-nl-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-nl-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-nl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "nl", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-nl-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"nl",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"nl"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #nl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-nl-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-nl-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #nl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-nl-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-no-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-no-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-no-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "no", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-no-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"no",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"no"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #no #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-no-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-no-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #no #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-no-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-pl-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-pl-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-pl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "pl", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-pl-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"pl",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"pl"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #pl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-pl-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-pl-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #pl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-pl-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-pt-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-pt-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-pt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "pt", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-pt-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"pt",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"pt"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #pt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-pt-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-pt-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #pt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-pt-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-ro-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ro-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-ro-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Mutlilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ro", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-ro-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ro",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ro"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #ro #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-ro-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-ro-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #ro #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-ro-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-ru-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ru-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-ru-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "ru", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-ru-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"ru",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ru"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #ru #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# bert-base-ru-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-ru-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #ru #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# bert-base-ru-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-sw-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-sw-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-sw-cased")
```
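Since the tags also list TensorFlow weights, here is a minimal sketch of loading them (the Swahili input is only an illustrative example):

```python
from transformers import AutoTokenizer, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-sw-cased")
model = TFAutoModel.from_pretrained("Geotrend/bert-base-sw-cased")

# Illustrative Swahili input; any sentence works.
inputs = tokenizer("Habari ya dunia", return_tensors="tf")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```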
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "sw", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-sw-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"sw",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"sw"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #sw #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-sw-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-sw-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #sw #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-sw-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-th-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-th-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-th-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "th", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-th-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"th",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"th"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #th #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-th-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-th-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #th #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-th-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-tr-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-tr-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-tr-cased")
```
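To get a rough sense of the size reduction, one can compare parameter counts with the full multilingual model. This is only an illustrative sketch; the exact numbers depend on the size of the reduced vocabulary.

```python
from transformers import AutoModel

small = AutoModel.from_pretrained("Geotrend/bert-base-tr-cased")
full = AutoModel.from_pretrained("bert-base-multilingual-cased")

# The reduction comes from shrinking the vocabulary, hence the embedding matrix.
print("reduced Turkish model parameters:", small.num_parameters())
print("full mBERT parameters:", full.num_parameters())
```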
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "tr", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-tr-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"tr",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"tr"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #tr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-tr-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-tr-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #tr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-tr-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-uk-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-uk-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-uk-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "uk", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-uk-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"uk",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"uk"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #uk #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-uk-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# bert-base-uk-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #uk #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-uk-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-ur-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-ur-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-ur-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "ur", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-ur-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"ur",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ur"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #ur #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-ur-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-ur-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #ur #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-ur-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-vi-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-vi-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-vi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "vi", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-vi-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"vi",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"vi"
] | TAGS
#transformers #pytorch #tf #jax #bert #fill-mask #vi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-vi-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-vi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #fill-mask #vi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-vi-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# bert-base-zh-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-zh-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-zh-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request.
| {"language": "zh", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/bert-base-zh-cased | null | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"zh",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"zh"
] | TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-zh-cased
We are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.
Unlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request.
| [
"# bert-base-zh-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-zh-cased\n\nWe are sharing smaller versions of bert-base-multilingual-cased that handle a custom number of languages.\n\nUnlike distilbert-base-multilingual-cased, our versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-25lang-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
Handled languages: en, fr, es, de, zh, ar, ru, vi, el, bg, th, tr, hi, ur, sw, nl, uk, ro, pt, it, lt, no, pl, da and ja.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-25lang-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-25lang-cased")
```
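As an illustrative follow-up (not part of the original card), the same checkpoint can also be queried through the standard fill-mask pipeline; the prompt below is taken from the card's widget examples:

```python
from transformers import pipeline

# Fill-mask pipeline on the reduced 25-language checkpoint.
unmasker = pipeline("fill-mask", model="Geotrend/distilbert-base-25lang-cased")
for prediction in unmasker("Paris is the capital of [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```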
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": ["multilingual", "en", "fr", "es", "de", "zh", "ar", "ru", "vi", "el", "bg", "th", "tr", "hi", "ur", "sw", "nl", "uk", "ro", "pt", "it", "lt", false, "pl", "da", "ja"], "license": "apache-2.0", "datasets": "wikipedia", "widget": [{"text": "Google generated 46 billion [MASK] in revenue."}, {"text": "Paris is the capital of [MASK]."}, {"text": "Algiers is the largest city in [MASK]."}, {"text": "Paris est la [MASK] de la France."}, {"text": "Paris est la capitale de la [MASK]."}, {"text": "L'\u00e9lection am\u00e9ricaine a eu [MASK] en novembre 2020."}, {"text": "\u062a\u0642\u0639 \u0633\u0648\u064a\u0633\u0631\u0627 \u0641\u064a [MASK] \u0623\u0648\u0631\u0648\u0628\u0627"}, {"text": "\u0625\u0633\u0645\u064a \u0645\u062d\u0645\u062f \u0648\u0623\u0633\u0643\u0646 \u0641\u064a [MASK]."}]} | Geotrend/distilbert-base-25lang-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"en",
"fr",
"es",
"de",
"zh",
"ar",
"ru",
"vi",
"el",
"bg",
"th",
"tr",
"hi",
"ur",
"sw",
"nl",
"uk",
"ro",
"pt",
"it",
"lt",
"no",
"pl",
"da",
"ja",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"fr",
"es",
"de",
"zh",
"ar",
"ru",
"vi",
"el",
"bg",
"th",
"tr",
"hi",
"ur",
"sw",
"nl",
"uk",
"ro",
"pt",
"it",
"lt",
"no",
"pl",
"da",
"ja"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #fr #es #de #zh #ar #ru #vi #el #bg #th #tr #hi #ur #sw #nl #uk #ro #pt #it #lt #no #pl #da #ja #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-25lang-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
Handled languages: en, fr, es, de, zh, ar, ru, vi, el, bg, th, tr, hi, ur, sw, nl, uk, ro, pt, it, lt, no, pl, da and ja.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-25lang-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nHandled languages: en, fr, es, de, zh, ar, ru, vi, el, bg, th, tr, hi, ur, sw, nl, uk, ro, pt, it, lt, no, pl, da and ja.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #fr #es #de #zh #ar #ru #vi #el #bg #th #tr #hi #ur #sw #nl #uk #ro #pt #it #lt #no #pl #da #ja #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-25lang-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\nHandled languages: en, fr, es, de, zh, ar, ru, vi, el, bg, th, tr, hi, ur, sw, nl, uk, ro, pt, it, lt, no, pl, da and ja.\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-ar-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-ar-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-ar-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ar", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-ar-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"ar",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ar"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #ar #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-ar-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #ar #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-bg-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-bg-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-bg-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "bg", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-bg-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"bg",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"bg"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #bg #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-bg-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-bg-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #bg #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-bg-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-da-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-da-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-da-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "da", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-da-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"da",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"da"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #da #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-da-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-da-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #da #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-da-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-de-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-de-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-de-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "de", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-de-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"de",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"de"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #de #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-de-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-de-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #de #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-de-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-el-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-el-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-el-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "el", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-el-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"el",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"el"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #el #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-el-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-el-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #el #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-el-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-ar-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ar-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ar-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-ar-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-ar-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-bg-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-bg-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-bg-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-bg-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-bg-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-bg-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-bg-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "en", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"en",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #en #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #en #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-da-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-da-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-da-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-da-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-da-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-da-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-da-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-de-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-de-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-de-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-de-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-de-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-de-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-de-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-el-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-el-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-el-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-el-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-el-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-el-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-el-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-el-ru-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-el-ru-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-el-ru-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-el-ru-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-el-ru-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-el-ru-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-el-ru-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-es-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-es-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-es-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-es-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-es-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-es-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-es-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-es-it-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-es-it-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-es-it-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-es-it-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-es-it-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-es-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-es-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-es-pt-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-es-pt-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-es-pt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-es-pt-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-es-pt-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-es-pt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-es-pt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-es-zh-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-es-zh-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-es-zh-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": ["multilingual", "en", "es", "zh"], "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-es-zh-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"en",
"es",
"zh",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"es",
"zh"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #es #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-es-zh-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-es-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #es #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-es-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-ar-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-ar-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-ar-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-ar-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-ar-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-cased")
```
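The smaller model is meant to give exactly the same representations as the original `distilbert-base-multilingual-cased`. A rough sanity-check sketch of that claim (assuming PyTorch is installed and that the example sentence only uses wordpieces kept in the reduced English/French vocabulary):

```python
import torch
from transformers import AutoTokenizer, AutoModel

full_tokenizer = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
full_model = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

sentence = "Paris est la capitale de la France."
with torch.no_grad():
    small_out = model(**tokenizer(sentence, return_tensors="pt")).last_hidden_state
    full_out = full_model(**full_tokenizer(sentence, return_tensors="pt")).last_hidden_state

# The hidden states should match up to numerical precision.
print(torch.allclose(small_out, full_out, atol=1e-5))
```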
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": ["multilingual", "en", "fr"], "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"en",
"fr",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"fr"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #fr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #fr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-da-ja-vi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-da-ja-vi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-da-ja-vi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-da-ja-vi-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-da-ja-vi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-da-ja-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-da-ja-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-de-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-de-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-de-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-de-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-de-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-de-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-de-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-de-no-da-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-de-no-da-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-de-no-da-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-de-no-da-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-de-no-da-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-de-no-da-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-de-no-da-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-es-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-es-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-es-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-es-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-es-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-es-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-es-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-es-de-zh-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-es-de-zh-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-es-de-zh-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-es-de-zh-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-es-de-zh-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-es-de-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-es-de-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-es-pt-it-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-es-pt-it-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-es-pt-it-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
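For intuition, the paper's reduction keeps the Transformer layers untouched and only selects the embedding rows of a reduced vocabulary, which is why representations are preserved. A conceptual sketch of that idea (not the repository's actual script; the token list is hypothetical and the matching tokenizer rebuild is skipped):

```python
import torch
from transformers import AutoTokenizer, AutoModel

full_tokenizer = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
full_model = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

# Hypothetical reduced vocabulary: special tokens plus wordpieces seen in the target languages.
kept_tokens = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]", "hello", "bonjour", "hola"]
kept_ids = full_tokenizer.convert_tokens_to_ids(kept_tokens)

# Keep only the surviving embedding rows; their order defines the new token ids.
old_weights = full_model.embeddings.word_embeddings.weight.data
new_weights = old_weights[kept_ids].clone()
full_model.embeddings.word_embeddings = torch.nn.Embedding.from_pretrained(new_weights, freeze=False)
full_model.config.vocab_size = new_weights.size(0)
# A WordPiece tokenizer matching the order of kept_tokens must also be rebuilt (handled by the repo).
```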
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": ["multilingual", "en", "fr", "es", "pt", "it"], "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-es-pt-it-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"en",
"fr",
"es",
"pt",
"it",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"fr",
"es",
"pt",
"it"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #en #fr #es #pt #it #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-es-pt-it-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-es-pt-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #en #fr #es #pt #it #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-es-pt-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-it-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-it-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-it-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-it-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-it-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-lt-no-pl-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-lt-no-pl-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-lt-no-pl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-lt-no-pl-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-lt-no-pl-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-lt-no-pl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-lt-no-pl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-nl-ru-ar-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-nl-ru-ar-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-nl-ru-ar-cased")
```
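The loaded `AutoModel` is a bare encoder, so a minimal sketch of extracting token representations (our own illustration, not from the original card) could look like this:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-nl-ru-ar-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-nl-ru-ar-cased")

# Encode one sentence and read out the per-token hidden states.
inputs = tokenizer("Paris est la capitale de la France.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```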
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-nl-ru-ar-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-nl-ru-ar-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-nl-ru-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-nl-ru-ar-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-uk-el-ro-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-uk-el-ro-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-uk-el-ro-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-uk-el-ro-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-uk-el-ro-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-uk-el-ro-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-uk-el-ro-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-zh-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-zh-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-zh-cased")
```
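Since the card claims the reduced model reproduces the representations of the original checkpoint, a rough comparison script (a sketch under our own assumptions, not an official test) might be:

```python
import torch
from transformers import AutoTokenizer, AutoModel

text = "Nous partageons des modèles multilingues plus petits."

small_tok = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-zh-cased")
small = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-zh-cased")
full_tok = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
full = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

with torch.no_grad():
    small_out = small(**small_tok(text, return_tensors="pt")).last_hidden_state
    full_out = full(**full_tok(text, return_tensors="pt")).last_hidden_state

# When both tokenizers segment the sentence identically, the hidden states
# should match up to numerical noise; otherwise the shapes differ.
if small_out.shape == full_out.shape:
    print("max abs diff:", (small_out - full_out).abs().max().item())
else:
    print("tokenizations differ:", small_out.shape, full_out.shape)
```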
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-zh-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-zh-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-fr-zh-ja-vi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-fr-zh-ja-vi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-fr-zh-ja-vi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-fr-zh-ja-vi-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-fr-zh-ja-vi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-fr-zh-ja-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-fr-zh-ja-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-hi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-hi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-hi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-hi-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-hi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-hi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-hi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-it-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-it-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-it-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-it-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-it-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-ja-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ja-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ja-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-ja-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-ja-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-ja-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-ja-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-lt-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-lt-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-lt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-lt-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-lt-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-lt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-lt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-nl-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-nl-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-nl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-nl-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-nl-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-nl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-nl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-no-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-no-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-no-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-no-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-no-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-no-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-no-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-pl-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-pl-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-pl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-pl-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-pl-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-pl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-pl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-pt-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-pt-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-pt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-pt-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-pt-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-pt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-pt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-ro-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ro-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ro-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-ro-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-ro-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-ro-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-ro-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-ru-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ru-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ru-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-ru-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-ru-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-ru-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-ru-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-sw-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-sw-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-sw-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": ["multilingual", "en", "sw"], "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-sw-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"en",
"sw",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual",
"en",
"sw"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #sw #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-sw-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-sw-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #en #sw #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-sw-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-th-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-th-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-th-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-th-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-th-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-th-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-th-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-tr-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-tr-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-tr-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-tr-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-tr-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-tr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-tr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-uk-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-uk-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-uk-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-uk-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-uk-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-uk-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-uk-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-ur-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-ur-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-ur-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-ur-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-ur-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-ur-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-ur-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-vi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-vi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-vi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-vi-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-vi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-zh-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-zh-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-zh-cased")
```
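As an illustrative addition that is not part of the original card, the bilingual checkpoint can batch-encode English and Chinese sentences together; the example sentences are arbitrary.
```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-zh-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-zh-cased")

# Batch-encode one English and one Chinese sentence with padding.
batch = tokenizer(["Hello world.", "你好,世界。"], padding=True, return_tensors="pt")
with torch.no_grad():
    hidden_states = model(**batch).last_hidden_state
print(hidden_states.shape)  # (2, sequence_length, hidden_size)
```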
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-zh-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-zh-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-en-zh-hi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-zh-hi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-zh-hi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "multilingual", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-en-zh-hi-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"multilingual"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-en-zh-hi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-en-zh-hi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #multilingual #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-en-zh-hi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-es-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-es-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-es-cased")
```
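As a quick sanity check that is not part of the original card, you can compare the reduced vocabulary with the full multilingual checkpoint (roughly 119k entries); exact sizes may vary between releases.
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-es-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-es-cased")

# The smaller model keeps the hidden size of distilbert-base-multilingual-cased
# but ships a reduced vocabulary, which is where the parameter savings come from.
print("tokenizer vocab size:", len(tokenizer))
print("config vocab size:", model.config.vocab_size)
print("hidden size:", model.config.dim)
```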
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "es", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-es-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"es",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-es-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-es-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-es-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-fr-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-fr-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-fr-cased")
```
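As an illustrative addition that is not part of the original card, the checkpoint can also be used through the fill-mask pipeline; the French sentence is only an example.
```python
from transformers import pipeline

# Print the top predictions for the masked French token.
fill_mask = pipeline("fill-mask", model="Geotrend/distilbert-base-fr-cased")
for prediction in fill_mask("Paris est la [MASK] de la France."):
    print(prediction["token_str"], round(prediction["score"], 3))
```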
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "fr", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-fr-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"fr",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #fr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-fr-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-fr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #fr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-fr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-hi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-hi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-hi-cased")
```
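As an illustrative aside that is not part of the original card, you can inspect how the reduced tokenizer segments Hindi text; the sentence is an arbitrary example.
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-hi-cased")

# Show the WordPiece segmentation produced by the reduced vocabulary.
tokens = tokenizer.tokenize("नमस्ते दुनिया")
print(tokens)
print(tokenizer.convert_tokens_to_ids(tokens))
```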
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "hi", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-hi-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"hi",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"hi"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #hi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-hi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-hi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #hi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-hi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-it-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-it-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-it-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "it", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-it-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"it",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"it"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #it #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-it-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #it #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-it-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-ja-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-ja-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-ja-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ja", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-ja-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"ja",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ja"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #ja #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-ja-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-ja-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #ja #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-ja-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-lt-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-lt-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-lt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "lt", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-lt-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"lt",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"lt"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #lt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-lt-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-lt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #lt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-lt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-nl-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-nl-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-nl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "nl", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-nl-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"nl",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"nl"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #nl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-nl-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-nl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #nl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-nl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-no-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-no-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-no-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "no", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-no-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"no",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"no"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #no #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-no-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-no-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #no #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-no-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-pl-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-pl-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-pl-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "pl", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-pl-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"pl",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"pl"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #pl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-pl-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-pl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #pl #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-pl-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-pt-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-pt-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-pt-cased")
```
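Like any DistilBERT checkpoint, the reduced model can also serve as a starting point for fine-tuning. The sketch below is our own illustration (not from the original card): the task, label count and Portuguese sentences are assumptions, and the classification head is freshly initialised.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Illustrative sketch: load the reduced Portuguese model with a new
# (untrained) classification head, ready for fine-tuning.
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-pt-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "Geotrend/distilbert-base-pt-cased", num_labels=2
)

batch = tokenizer(
    ["Adorei este filme.", "Que filme terrível."],
    padding=True,
    return_tensors="pt",
)
print(model(**batch).logits.shape)  # torch.Size([2, 2]) before any fine-tuning
```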
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "pt", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-pt-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"pt",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"pt"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #pt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-pt-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-pt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #pt #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-pt-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-ro-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-ro-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-ro-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ro", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-ro-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"ro",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ro"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #ro #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-ro-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-ro-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #ro #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-ro-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-ru-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-ru-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-ru-cased")
```
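If you only need sentence representations, one common pattern is to mean-pool the last hidden state. The snippet below is a sketch we added for illustration (not part of the original card); the Russian example sentence and the pooling strategy are our assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-ru-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-ru-cased")

# Example sentence (ours): "Moscow is the capital of Russia."
inputs = tokenizer("Москва является столицей России.", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)

# Mean-pool over real (non-padding) tokens to get one vector per sentence.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # expected: torch.Size([1, 768])
```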
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ru", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-ru-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"ru",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ru"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #ru #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-ru-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-ru-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #ru #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-ru-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-sw-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-sw-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-sw-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "sw", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-sw-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"sw",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"sw"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #sw #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-sw-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-sw-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #sw #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-sw-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-th-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-th-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-th-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "th", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-th-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"th",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"th"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #th #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-th-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-th-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #th #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-th-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-tr-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-tr-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-tr-cased")
```
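The main difference from the original checkpoint is the trimmed vocabulary. As a quick check (our sketch, not from the original card), you can compare the tokenizer sizes; the exact numbers depend on the release you download.

```python
from transformers import AutoTokenizer

# Illustrative sketch: the reduced model keeps only the vocabulary needed for
# Turkish (plus shared tokens), so its tokenizer is much smaller.
small = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-tr-cased")
full = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
print(len(small), len(full))  # e.g. a few tens of thousands vs. ~119k entries
```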
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "tr", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-tr-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"tr",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"tr"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #tr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-tr-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-tr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #tr #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-tr-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-uk-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-uk-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-uk-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "uk", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-uk-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"uk",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"uk"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #uk #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-uk-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-uk-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #uk #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-uk-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-ur-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-ur-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-ur-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "ur", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-ur-cased | null | [
"transformers",
"pytorch",
"distilbert",
"fill-mask",
"ur",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"ur"
] | TAGS
#transformers #pytorch #distilbert #fill-mask #ur #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-ur-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-ur-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #distilbert #fill-mask #ur #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-ur-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-vi-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-vi-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-vi-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "vi", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-vi-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"vi",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"vi"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #vi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-vi-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #vi #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-vi-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |
fill-mask | transformers |
# distilbert-base-zh-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-zh-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-zh-cased")
```
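To sanity-check the claim that the reduced model reproduces the original representations, you can compare hidden states against the full multilingual checkpoint. This is our own verification sketch (it downloads both models); the Chinese example sentence is an assumption.

```python
import torch
from transformers import AutoTokenizer, AutoModel

small_tok = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-zh-cased")
small_model = AutoModel.from_pretrained("Geotrend/distilbert-base-zh-cased")
full_tok = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
full_model = AutoModel.from_pretrained("distilbert-base-multilingual-cased")

sentence = "巴黎是法国的首都。"  # "Paris is the capital of France." (our example)
with torch.no_grad():
    small_out = small_model(**small_tok(sentence, return_tensors="pt")).last_hidden_state
    full_out = full_model(**full_tok(sentence, return_tensors="pt")).last_hidden_state

# Token ids differ (smaller vocabulary), but the vectors should be the same.
print(torch.allclose(small_out, full_out, atol=1e-5))  # expected: True
```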
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any question, feedback or request. | {"language": "zh", "license": "apache-2.0", "datasets": "wikipedia"} | Geotrend/distilbert-base-zh-cased | null | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"zh",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2022-03-02T23:29:04+00:00 | [] | [
"zh"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilbert-base-zh-cased
We are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.
Our versions give exactly the same representations produced by the original model which preserves the original accuracy.
For more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.
## How to use
To generate other smaller versions of multilingual transformers please visit our Github repo.
### How to cite
## Contact
Please contact amine@URL for any question, feedback or request. | [
"# distilbert-base-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #zh #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilbert-base-zh-cased\n\nWe are sharing smaller versions of distilbert-base-multilingual-cased that handle a custom number of languages.\n\nOur versions give exactly the same representations produced by the original model which preserves the original accuracy.\n\n\nFor more information please visit our paper: Load What You Need: Smaller Versions of Multilingual BERT.",
"## How to use\n\n\n\nTo generate other smaller versions of multilingual transformers please visit our Github repo.",
"### How to cite",
"## Contact \n\nPlease contact amine@URL for any question, feedback or request."
] |