sha (null) | last_modified (null) | library_name (stringclasses, 154 values) | text (stringlengths 1-900k) | metadata (stringlengths 2-348k) | pipeline_tag (stringclasses, 45 values) | id (stringlengths 5-122) | tags (sequencelengths 1-1.84k) | created_at (stringlengths 25-25) | arxiv (sequencelengths 0-201) | languages (sequencelengths 0-1.83k) | tags_str (stringlengths 17-9.34k) | text_str (stringlengths 0-389k) | text_lists (sequencelengths 0-722) | processed_texts (sequencelengths 1-723) | tokens_length (sequencelengths 1-723) | input_texts (sequencelengths 1-61) | embeddings (sequencelengths 768-768) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
null | null | transformers |
# Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Part of Speech (POS) dataset
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
  - [Variable and metrics](#variable-and-metrics)
  - [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citing information](#citing-information)
  - [Disclaimer](#disclaimer)
</details>
## Model description
The **roberta-base-bne-capitel-pos** is a Part-of-Speech (POS) tagging model for the Spanish language, fine-tuned from the [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-base-bne-capitel-pos** model can be used for Part-of-Speech (POS) tagging of Spanish text. The model is limited by its training dataset and may not generalize well for all use cases.
## How to use
Here is how to use this model:
```python
from transformers import pipeline
from pprint import pprint
nlp = pipeline("token-classification", model="PlanTL-GOB-ES/roberta-base-bne-capitel-pos")
example = "El alcalde de Vigo, Abel Caballero, ha comenzado a colocar las luces de Navidad en agosto."
pos_results = nlp(example)
pprint(pos_results)
```
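The pipeline returns one entry per (sub)word, each with its predicted POS tag, confidence score, and character offsets in the input sentence.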
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
### Training data
The dataset used is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 2).
### Training procedure
The model was trained with a batch size of 32 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
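For illustration only, the sketch below shows how these hyperparameters could be wired up with the `transformers` `Trainer`. The label set, the tokenized CAPITEL splits (`num_pos_labels`, `train_dataset`, `dev_dataset`) and the metric function (`compute_f1`) are placeholders; the fine-tuning scripts in the official GitHub repository remain the reference implementation.
```python
from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    TrainingArguments,
    Trainer,
)

base_model = "PlanTL-GOB-ES/roberta-base-bne"

tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_pos_labels, train_dataset, dev_dataset and compute_f1 are placeholders:
# the CAPITEL splits must be tokenized with word/label alignment beforehand.
model = AutoModelForTokenClassification.from_pretrained(base_model, num_labels=num_pos_labels)

training_args = TrainingArguments(
    output_dir="roberta-base-bne-capitel-pos",
    per_device_train_batch_size=32,   # batch size reported in this card
    learning_rate=5e-5,
    num_train_epochs=5,
    evaluation_strategy="epoch",      # score the development set every epoch
    save_strategy="epoch",
    load_best_model_at_end=True,      # keep the best checkpoint by F1
    metric_for_best_model="f1",
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=dev_dataset,
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
    compute_metrics=compute_f1,       # placeholder returning {"f1": ...}
)

trainer.train()
```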
## Evaluation
### Variable and metrics
This model was fine-tuned maximizing the F1 score.
## Evaluation results
We evaluated the **roberta-base-bne-capitel-pos** on the CAPITEL-POS test set against standard multilingual and monolingual baselines:
| Model | CAPITEL-POS (F1) |
| ------------|:----|
| roberta-large-bne-capitel-pos | **98.56** |
| roberta-base-bne-capitel-pos | 98.46 |
| BETO | 98.36 |
| mBERT | 98.39 |
| BERTIN | 98.47 |
| ELECTRA | 98.16 |
For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish).
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a
Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial
Intelligence (SEDIA) within the framework of the Plan-TL.},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
title = {MarIA: Spanish Language Models},
volume = {68},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
year = {2022},
}
```
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "capitel", "pos"], "datasets": ["bne", "capitel"], "metrics": ["f1"], "inference": {"parameters": {"aggregation_strategy": "first"}}, "widget": [{"text": "Festival de San Sebasti\u00e1n: Johnny Depp recibir\u00e1 el premio Donostia en pleno rifirrafe judicial con Amber Heard"}, {"text": "El alcalde de Vigo, Abel Caballero, ha comenzado a colocar las luces de Navidad en agosto."}, {"text": "Gracias a los datos de la BNE, se ha podido lograr este modelo del lenguaje."}], "model-index": [{"name": "roberta-base-bne-capiter-pos", "results": [{"task": {"type": "token-classification"}, "dataset": {"name": "CAPITEL-POS", "type": "pos"}, "metrics": [{"type": "f1", "value": 0.9846, "name": "F1"}]}]}]} | token-classification | PlanTL-GOB-ES/roberta-base-bne-capitel-pos | [
"transformers",
"pytorch",
"roberta",
"token-classification",
"national library of spain",
"spanish",
"bne",
"capitel",
"pos",
"es",
"dataset:bne",
"dataset:capitel",
"arxiv:1907.11692",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #pos #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Part of Speech (POS) dataset
======================================================================================
Table of contents
-----------------
Click to expand
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
+ Training data
+ Training procedure
* Evaluation
+ Variable and metrics
+ Evaluation results
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citing information
+ Disclaimer
Model description
-----------------
The roberta-base-bne-capitel-pos is a Part-of-Speech (POS) tagging model for the Spanish language, fine-tuned from the roberta-base-bne model, a RoBERTa base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
The roberta-base-bne-capitel-pos model can be used for Part-of-Speech (POS) tagging of Spanish text. The model is limited by its training dataset and may not generalize well for all use cases.
How to use
----------
Here is how to use this model:
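A minimal example with the `transformers` token-classification pipeline:
```python
from transformers import pipeline
from pprint import pprint

# Load the fine-tuned POS tagger from the Hugging Face Hub
nlp = pipeline("token-classification", model="PlanTL-GOB-ES/roberta-base-bne-capitel-pos")

example = "El alcalde de Vigo, Abel Caballero, ha comenzado a colocar las luces de Navidad en agosto."

pos_results = nlp(example)
pprint(pos_results)
```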
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
Training
--------
### Training data
The dataset used is the one from the CAPITEL competition at IberLEF 2020 (sub-task 2).
### Training procedure
The model was trained with a batch size of 32 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
Evaluation
----------
### Variable and metrics
This model was fine-tuned maximizing the F1 score.
Evaluation results
------------------
We evaluated the roberta-base-bne-capitel-pos on the CAPITEL-POS test set against standard multilingual and monolingual baselines:
For more details, check the fine-tuning and evaluation scripts in the official GitHub repository.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our paper:
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
| [
"### Training procedure\n\n\nThe model was trained with a batch size of 32 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------",
"### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-pos on the CAPITEL-POS test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.",
"### Citing information\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
"TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #pos #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training procedure\n\n\nThe model was trained with a batch size of 32 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------",
"### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-pos on the CAPITEL-POS test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.",
"### Citing information\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
90,
65,
92,
28,
37,
22,
12,
34,
16,
363
] | [
"passage: TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #pos #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training procedure\n\n\nThe model was trained with a batch size of 32 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-pos on the CAPITEL-POS test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.### Citing information\n\n\nIf you use this model, please cite our paper:"
] | [
-0.08255180716514587,
0.22281205654144287,
-0.0072755152359604836,
0.06183510273694992,
0.1288072168827057,
-0.0220012404024601,
0.055591657757759094,
0.10997007042169571,
-0.016017867252230644,
0.11675001680850983,
-0.010182260535657406,
0.03928397223353386,
0.1033327728509903,
0.13161049783229828,
0.048959046602249146,
-0.17940443754196167,
-0.0135072972625494,
-0.08836732804775238,
0.0019805384799838066,
0.10949632525444031,
0.10219231992959976,
-0.05882569029927254,
0.04955660551786423,
-0.03004014864563942,
-0.005926990415900946,
0.07089582830667496,
-0.06083671376109123,
-0.08165401965379715,
0.059261519461870193,
0.0624752901494503,
0.05187615379691124,
0.022582007572054863,
0.04596875235438347,
-0.2171839326620102,
0.015752824023365974,
0.05182112753391266,
0.004511090461164713,
0.04763169214129448,
0.10474265366792679,
-0.03542061522603035,
0.18402084708213806,
-0.10788217931985855,
0.017139818519353867,
0.04745231568813324,
-0.09835561364889145,
-0.12441109865903854,
-0.12449011206626892,
0.07035490870475769,
0.0801386758685112,
0.04601045697927475,
-0.03426649421453476,
0.06401245296001434,
-0.055641211569309235,
0.02107561007142067,
0.06760115921497345,
-0.17405055463314056,
-0.040354494005441666,
0.030090125277638435,
0.02409999817609787,
0.11872488260269165,
-0.07769265025854111,
-0.01084920670837164,
0.0436381958425045,
-0.01206225249916315,
-0.005834994371980429,
-0.0258468110114336,
-0.03523765131831169,
0.017211737111210823,
-0.11736080795526505,
-0.11500649899244308,
0.1505025327205658,
0.0044862860813736916,
-0.096026211977005,
-0.12245924770832062,
-0.01088806614279747,
0.010610864497721195,
0.03591667115688324,
-0.0471021868288517,
0.040840648114681244,
-0.014447323977947235,
0.07357244193553925,
-0.03389081731438637,
-0.09403417259454727,
-0.03223981335759163,
-0.03867410123348236,
0.07197317481040955,
0.013945896178483963,
-0.017770128324627876,
0.0037268484011292458,
0.13944993913173676,
0.03708554431796074,
-0.13066086173057556,
-0.022787395864725113,
-0.003806977067142725,
-0.031259242445230484,
-0.04910631477832794,
0.03516140207648277,
-0.05187646672129631,
0.0985862985253334,
0.1888929009437561,
-0.051232367753982544,
0.010764752514660358,
-0.01895492523908615,
-0.0001946275879163295,
0.0992373377084732,
0.1768771857023239,
-0.0733254924416542,
-0.08228447288274765,
0.0003163653891533613,
0.00783993024379015,
0.018652427941560745,
0.022284753620624542,
-0.0160836149007082,
0.023290535435080528,
0.04845172166824341,
0.1345154196023941,
0.07508865743875504,
-0.03765300661325455,
-0.08412033319473267,
-0.017813200131058693,
0.1574573516845703,
-0.1575159877538681,
0.031102390959858894,
0.0170956589281559,
-0.07004600763320923,
0.03314141556620598,
-0.04212260991334915,
-0.03726353123784065,
-0.11471875011920929,
0.06342335790395737,
-0.04578873887658119,
-0.02810138463973999,
-0.07378128916025162,
-0.04254093021154404,
0.07163022458553314,
-0.04371510073542595,
-0.0005385102122090757,
-0.08808721601963043,
-0.08437790721654892,
-0.0855831578373909,
0.0424279160797596,
-0.11671550571918488,
0.004436511546373367,
-0.020072102546691895,
-0.0015368115855380893,
0.0045121279545128345,
-0.04657452553510666,
0.04002340883016586,
-0.06128085032105446,
0.059482138603925705,
0.03234667330980301,
0.02069387584924698,
0.0860103964805603,
0.004908525384962559,
-0.11012563854455948,
0.0025198794901371002,
-0.14441117644309998,
0.09928971529006958,
-0.09780114889144897,
0.01833726279437542,
-0.19196486473083496,
-0.053632356226444244,
0.002400017809122801,
0.009992063976824284,
0.044450853019952774,
0.18926399946212769,
-0.10370569676160812,
-0.038268886506557465,
0.14983156323432922,
-0.04773293808102608,
-0.056545183062553406,
0.11625105142593384,
0.0017962786369025707,
0.06874378770589828,
0.062296077609062195,
0.06277921050786972,
0.13473671674728394,
-0.174388587474823,
-0.05184551700949669,
-0.005754116456955671,
-0.02628539316356182,
0.0764831006526947,
0.12518680095672607,
-0.09609480947256088,
0.06555438786745071,
0.0355774350464344,
-0.13329926133155823,
-0.031000884249806404,
-0.019792642444372177,
-0.03839986398816109,
0.0366184301674366,
0.014105383306741714,
-0.042299915105104446,
-0.005484842695295811,
-0.029779143631458282,
-0.0333465151488781,
-0.10917572677135468,
-0.02073979750275612,
0.04651683568954468,
0.018653307110071182,
0.00449199415743351,
-0.09938649088144302,
0.0757988840341568,
-0.03890243545174599,
0.007390403188765049,
-0.18627777695655823,
-0.0077998754568398,
0.05172357335686684,
-0.08110720664262772,
0.10285693407058716,
-0.06949520856142044,
0.014834949746727943,
0.012477942742407322,
-0.025034110993146896,
-0.014538612216711044,
-0.035237424075603485,
-0.060319479554891586,
0.011244970373809338,
-0.11230085045099258,
0.0038616599049419165,
-0.03213728591799736,
0.0671989917755127,
-0.08305231481790543,
0.010745570063591003,
0.12954019010066986,
0.08561649173498154,
-0.0197673998773098,
-0.03946934640407562,
0.022539226338267326,
0.008532613515853882,
-0.023673968389630318,
-0.0686994194984436,
0.028138449415564537,
0.009703320451080799,
-0.022529400885105133,
-0.02845664508640766,
-0.03690327703952789,
-0.031813349574804306,
0.06517393887042999,
0.11346589028835297,
-0.04908369109034538,
-0.03785288706421852,
-0.043341103941202164,
0.00542951887473464,
-0.06368927657604218,
-0.01778801530599594,
0.21773914992809296,
0.02911647967994213,
0.06829346716403961,
-0.1372482180595398,
-0.07872099429368973,
-0.0025039513129740953,
-0.04384550079703331,
-0.07921584695577621,
0.14071707427501678,
0.03621304780244827,
-0.07683388143777847,
0.0677146166563034,
-0.004403956234455109,
0.05678112432360649,
0.1778469979763031,
-0.0011022844118997455,
-0.10631965100765228,
-0.04151871055364609,
0.08883086591959,
0.03067508526146412,
0.05390455573797226,
-0.047844696789979935,
0.009420113638043404,
0.05036148801445961,
0.00865316204726696,
0.06482376903295517,
-0.13258546590805054,
0.0325937457382679,
0.009088323451578617,
-0.05579017475247383,
-0.005209979601204395,
0.023904409259557724,
-0.02371744066476822,
0.08293907344341278,
0.04035473242402077,
0.06287846714258194,
-0.054828450083732605,
-0.041919466108083725,
-0.10830371826887131,
0.1479717195034027,
-0.06713517010211945,
-0.2774192690849304,
-0.2438400685787201,
0.035156574100255966,
-0.030953289940953255,
0.020414995029568672,
0.04568696767091751,
-0.10069528222084045,
-0.07810822874307632,
-0.06405531615018845,
0.013567034155130386,
0.07419465482234955,
-0.08069289475679398,
0.004847404081374407,
0.059448204934597015,
0.004970472771674395,
-0.10855776071548462,
-0.00422071386128664,
0.05270267650485039,
-0.0452083982527256,
-0.022799337282776833,
0.021656623110175133,
0.11996079236268997,
0.08718660473823547,
0.017257260158658028,
-0.010799243114888668,
-0.0060060890391469,
0.1810576468706131,
-0.14237840473651886,
0.04434872791171074,
0.23242956399917603,
0.04924993962049484,
0.03404981270432472,
0.1532103419303894,
0.01572353020310402,
-0.06262748688459396,
0.0326625294983387,
0.009174137376248837,
-0.03729315102100372,
-0.27959689497947693,
-0.056424833834171295,
-0.030070820823311806,
-0.040293119847774506,
0.06106395646929741,
0.09045317023992538,
-0.028569787740707397,
0.014142045751214027,
-0.050091702491045,
-0.06392473727464676,
0.05924779921770096,
0.09934135526418686,
0.03336639329791069,
0.023188171908259392,
0.022668756544589996,
-0.0622471459209919,
-0.04701096564531326,
0.13464941084384918,
0.08106984943151474,
0.1174941286444664,
0.008625647984445095,
0.15174232423305511,
0.05774983391165733,
0.05919109284877777,
-0.045550134032964706,
0.03248138725757599,
0.048885732889175415,
0.02376953698694706,
-0.031513482332229614,
-0.07307633012533188,
-0.04370708763599396,
-0.00033696385798975825,
0.007721447851508856,
-0.03286973387002945,
-0.030362984165549278,
-0.09085389971733093,
0.08737502992153168,
0.12411589920520782,
-0.02346191741526127,
-0.15863753855228424,
-0.055598679929971695,
0.03777493163943291,
-0.08100972324609756,
-0.07217171788215637,
-0.03012656234204769,
0.027918299660086632,
-0.16745804250240326,
0.02536645345389843,
-0.011990442872047424,
0.10694193094968796,
-0.06671721488237381,
-0.02600729465484619,
0.003205563873052597,
0.042846277356147766,
0.00238583842292428,
0.11537278443574905,
-0.12878544628620148,
0.1406620740890503,
-0.0013273174408823252,
0.10512128472328186,
-0.0440819226205349,
0.06409876048564911,
-0.027556587010622025,
-0.01923178695142269,
0.15998809039592743,
-0.009451755322515965,
-0.005103548057377338,
-0.04444282129406929,
-0.0644569844007492,
0.003804535372182727,
0.0762270838022232,
-0.13727109134197235,
0.11586117744445801,
-0.011520653031766415,
-0.01487799920141697,
-0.11307667195796967,
-0.09627658873796463,
-0.08428288251161575,
-0.15646587312221527,
0.03990292549133301,
-0.12165318429470062,
0.07621061056852341,
-0.04609508812427521,
-0.04867104813456535,
-0.024345524609088898,
0.18707172572612762,
-0.2166934460401535,
-0.08726787567138672,
-0.131024569272995,
0.033038340508937836,
0.12307136505842209,
-0.08166418224573135,
0.030518043786287308,
-0.03236670047044754,
0.09163831919431686,
0.02804376557469368,
-0.01622878946363926,
0.01940084435045719,
-0.0651748850941658,
-0.1341182291507721,
-0.025690879672765732,
0.16220511496067047,
0.05611921474337578,
0.034796182066202164,
-0.004832520615309477,
-0.019170820713043213,
0.009411400184035301,
-0.10191582143306732,
-0.04640683904290199,
0.09480398148298264,
0.14230142533779144,
0.07781699299812317,
-0.012634257785975933,
-0.158008873462677,
-0.1346009522676468,
-0.08154918998479843,
0.05988077446818352,
0.20628415048122406,
-0.007420781534165144,
0.10195767879486084,
0.1720912605524063,
-0.12629519402980804,
-0.15281915664672852,
-0.08157354593276978,
0.058602474629879,
0.0179547518491745,
0.027303004637360573,
-0.17728720605373383,
0.0027520074509084225,
0.0946483239531517,
-0.0069185164757072926,
-0.01365889236330986,
-0.26744359731674194,
-0.12021999061107635,
0.008319510146975517,
0.03152956813573837,
-0.09927785396575928,
-0.1291806697845459,
-0.09941831231117249,
-0.05053345486521721,
-0.14623206853866577,
0.047560982406139374,
0.009391055442392826,
0.040264274924993515,
0.0015155744040384889,
0.002727156737819314,
0.03991948440670967,
-0.032190438359975815,
0.18827113509178162,
-0.030026433989405632,
0.025288492441177368,
-0.040639009326696396,
-0.00675196573138237,
0.07708966732025146,
-0.011592584662139416,
0.11562170833349228,
0.009529097937047482,
0.020701605826616287,
-0.11250664293766022,
-0.06097334623336792,
-0.03626013174653053,
0.022877270355820656,
-0.04585570842027664,
-0.011340009048581123,
-0.09429565072059631,
0.09288868308067322,
0.047503773123025894,
-0.013525635004043579,
0.02045755460858345,
-0.07766093313694,
-0.010399464517831802,
0.17992687225341797,
0.1553819328546524,
0.07635635882616043,
-0.043070998042821884,
-0.015338546596467495,
-0.00003646355980890803,
0.02779780514538288,
-0.13324713706970215,
0.0173642598092556,
0.12992437183856964,
0.014316353015601635,
0.08764869719743729,
-0.031190913170576096,
-0.1351352483034134,
0.004093428608030081,
0.1400170922279358,
-0.05361004173755646,
-0.15054263174533844,
-0.017949525266885757,
-0.01041119173169136,
-0.12469618022441864,
-0.012537073343992233,
0.10688450932502747,
0.021690184250473976,
-0.07147674262523651,
0.016672717407345772,
0.07222356647253036,
-0.0047242296859622,
0.13632044196128845,
0.015911797061562538,
0.044074930250644684,
-0.05646314471960068,
0.133909672498703,
0.12377343326807022,
-0.12773248553276062,
-0.023302974179387093,
0.08153130114078522,
-0.06527607887983322,
-0.031147561967372894,
0.054629210382699966,
-0.0107633201405406,
-0.08338871598243713,
-0.07085409015417099,
-0.0742247998714447,
-0.03305225819349289,
0.014706713147461414,
0.05992267280817032,
0.03044680319726467,
0.024080485105514526,
0.011158418841660023,
0.03359448164701462,
-0.03871210664510727,
0.07391733676195145,
0.10770519077777863,
-0.0194531437009573,
-0.08875727653503418,
0.008692564442753792,
0.006022442597895861,
-0.01673319563269615,
-0.018366971984505653,
-0.02064472995698452,
-0.09868130832910538,
0.006852216552942991,
-0.040200743824243546,
0.04697445407509804,
-0.06144535169005394,
-0.00984344631433487,
-0.009966457262635231,
-0.04450320079922676,
-0.04371989518404007,
0.0048524304293096066,
-0.03397778794169426,
-0.04380865395069122,
-0.04862581938505173,
0.14091432094573975,
-0.1402728259563446,
0.04228619113564491,
0.09441228210926056,
-0.0727313905954361,
0.07962892949581146,
-0.004879320040345192,
0.0011304437648504972,
0.08536865562200546,
-0.1649041771888733,
0.04929482191801071,
0.008607622236013412,
0.049962688237428665,
0.024119628593325615,
-0.13113218545913696,
0.04949468746781349,
0.03501596674323082,
-0.05625690892338753,
0.01890735700726509,
0.023509953171014786,
-0.1075751781463623,
0.003708964679390192,
0.020876988768577576,
-0.06903568655252457,
-0.06618896871805191,
0.09303770959377289,
0.12633919715881348,
0.027563568204641342,
0.11533338576555252,
-0.0649752989411354,
0.0037525042425841093,
-0.1493632048368454,
-0.012160911224782467,
0.00495025422424078,
0.01905573345720768,
-0.03141554445028305,
-0.04682512208819389,
0.05103277415037155,
0.026978418231010437,
0.16104665398597717,
0.0632663443684578,
0.11820056289434433,
0.03397907316684723,
0.02570270374417305,
0.0366019606590271,
0.02925451472401619,
0.04974682629108429,
0.006446443498134613,
0.02070964127779007,
-0.02579386904835701,
-0.03142862766981125,
-0.05144865810871124,
-0.1042797788977623,
0.058999497443437576,
0.1307167112827301,
0.10291840881109238,
0.046376679092645645,
-0.01027209684252739,
-0.04002486914396286,
-0.04445235803723335,
0.006179564632475376,
-0.002862042048946023,
0.00656073447316885,
-0.052206579595804214,
0.08993735909461975,
0.20588120818138123,
-0.18803688883781433,
0.10443403571844101,
-0.01971949078142643,
-0.05826159939169884,
-0.06003749743103981,
-0.20766353607177734,
-0.031050752848386765,
-0.07601408660411835,
0.03895393759012222,
-0.09817590564489365,
0.08667083084583282,
0.004851518198847771,
-0.0017953841015696526,
-0.07973583787679672,
0.07885245978832245,
-0.056363753974437714,
-0.13278967142105103,
0.06759925186634064,
0.010784256272017956,
0.0934465155005455,
0.006628986913710833,
0.09604529291391373,
-0.0017550108022987843,
0.08202976733446121,
0.0988779366016388,
0.10782992094755173,
0.0381527878344059,
0.0029613992664963007,
-0.06931494176387787,
-0.03444506973028183,
0.01981355808675289,
-0.005794397555291653,
-0.006881518289446831,
0.1914987415075302,
0.0300860945135355,
-0.01747288554906845,
0.021225357428193092,
0.25312238931655884,
-0.017995471134781837,
-0.04973108693957329,
-0.13515619933605194,
0.13408422470092773,
0.037008993327617645,
0.06392557919025421,
0.03521600365638733,
-0.1328141689300537,
-0.037749335169792175,
0.11530666053295135,
0.08197800070047379,
0.010388242080807686,
-0.02970731258392334,
-0.0038212290965020657,
0.017971469089388847,
0.014282627031207085,
0.0550740621984005,
0.04718009755015373,
0.236362487077713,
-0.05530112236738205,
0.07204350084066391,
-0.035559557378292084,
0.004956385120749474,
-0.03784097358584404,
0.1159847155213356,
-0.04401453584432602,
-0.0060100252740085125,
-0.067498579621315,
0.15820878744125366,
-0.06904561072587967,
-0.2959945797920227,
0.04898010566830635,
-0.030345387756824493,
-0.14251546561717987,
0.001251228735782206,
0.024054545909166336,
0.004536028020083904,
0.05089820548892021,
0.05505800247192383,
-0.03399877995252609,
0.0958043709397316,
0.039507653564214706,
-0.03173435479402542,
-0.047500550746917725,
0.040502820163965225,
-0.0827898234128952,
0.2335638403892517,
-0.00527718011289835,
0.09802227467298508,
0.10019718110561371,
-0.04653692990541458,
-0.15743432939052582,
0.03947073221206665,
0.06023525074124336,
-0.02802756056189537,
0.11969240754842758,
0.0676657184958458,
0.024782001972198486,
-0.002840810688212514,
0.07260449975728989,
0.048148706555366516,
0.03416851535439491,
0.021168550476431847,
0.06848721951246262,
-0.1703590452671051,
0.12743835151195526,
-0.13932110369205475,
0.07740265130996704,
0.09281257539987564,
-0.03906068950891495,
0.06691417843103409,
-0.06762847304344177,
0.08909410238265991,
0.0026177605614066124,
0.18218614161014557,
0.038356468081474304,
-0.16533778607845306,
0.021953599527478218,
-0.031266484409570694,
0.0478278212249279,
-0.2044905126094818,
-0.025643227621912956,
0.043097104877233505,
-0.003245195373892784,
-0.06016167253255844,
0.13459648191928864,
0.0033521323930472136,
0.012508763931691647,
-0.012042338959872723,
-0.16722652316093445,
-0.008043876849114895,
0.08431251347064972,
-0.12003704905509949,
0.00007403700874419883
] |
null | null | transformers |
# Spanish RoBERTa-base trained on BNE finetuned for Spanish Question Answering Corpus (SQAC) dataset.
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
  - [Variable and metrics](#variable-and-metrics)
  - [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citing information](#citing-information)
  - [Disclaimer](#disclaimer)
</details>
## Model description
The **roberta-base-bne-sqac** is a Question Answering (QA) model for the Spanish language fine-tuned from the [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-base-bne-sqac** model can be used for extractive question answering. The model is limited by its training dataset and may not generalize well for all use cases.
## How to use
```python
from transformers import pipeline

# Load the extractive QA model from the Hugging Face Hub
nlp = pipeline("question-answering", model="PlanTL-GOB-ES/roberta-base-bne-sqac")

text = "¿Dónde vivo?"
context = "Me llamo Wolfgang y vivo en Berlin"

qa_results = nlp(question=text, context=context)
print(qa_results)
```
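The pipeline returns the predicted answer span together with its confidence score and the character offsets of the span in the context (`answer`, `score`, `start`, `end`).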
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
### Training data
We used the QA dataset in Spanish called [SQAC corpus](https://huggingface.co/datasets/PlanTL-GOB-ES/SQAC) for training and evaluation.
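As a quick illustration (an assumption about tooling, not part of the official scripts), the corpus can typically be loaded and inspected directly from the Hub with the `datasets` library:
```python
from datasets import load_dataset

# Hypothetical direct load of the SQAC corpus from the Hugging Face Hub;
# the official fine-tuning scripts handle the actual preprocessing.
sqac = load_dataset("PlanTL-GOB-ES/SQAC")

print(sqac)               # expected splits: train / validation / test
print(sqac["train"][0])   # SQuAD-style fields are expected: question, context, answers
```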
### Training procedure
The model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
## Evaluation results
We evaluated the **roberta-base-bne-sqac** on the SQAC test set against standard multilingual and monolingual baselines:
| Model | SQAC (F1) |
| ------------|:----|
| roberta-large-bne-sqac | **82.02** |
| roberta-base-bne-sqac | 79.23|
| BETO | 79.23 |
| mBERT | 75.62 |
| BERTIN | 76.78 |
| ELECTRA | 73.83 |
For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish).
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a
Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial
Intelligence (SEDIA) within the framework of the Plan-TL.},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
title = {MarIA: Spanish Language Models},
volume = {68},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
year = {2022},
}
```
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "qa", "question answering"], "datasets": ["PlanTL-GOB-ES/SQAC"], "metrics": ["f1", "exact match"], "model-index": [{"name": "roberta-base-bne-sqac", "results": [{"task": {"type": "question-answering"}, "dataset": {"name": "SQAC", "type": "PlanTL-GOB-ES/SQAC"}, "metrics": [{"type": "f1", "value": 0.7923, "name": "F1"}]}]}]} | question-answering | PlanTL-GOB-ES/roberta-base-bne-sqac | [
"transformers",
"pytorch",
"roberta",
"question-answering",
"national library of spain",
"spanish",
"bne",
"qa",
"question answering",
"es",
"dataset:PlanTL-GOB-ES/SQAC",
"arxiv:1907.11692",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #question-answering #national library of spain #spanish #bne #qa #question answering #es #dataset-PlanTL-GOB-ES/SQAC #arxiv-1907.11692 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
| Spanish RoBERTa-base trained on BNE finetuned for Spanish Question Answering Corpus (SQAC) dataset.
===================================================================================================
Table of contents
-----------------
Click to expand
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
+ Training data
+ Training procedure
* Evaluation
+ Variable and metrics
+ Evaluation results
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citing information
+ Disclaimer
Model description
-----------------
The roberta-base-bne-sqac is a Question Answering (QA) model for the Spanish language fine-tuned from the roberta-base-bne model, a RoBERTa base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
The roberta-base-bne-sqac model can be used for extractive question answering. The model is limited by its training dataset and may not generalize well for all use cases.
How to use
----------
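A minimal example with the `transformers` question-answering pipeline:
```python
from transformers import pipeline

# Load the extractive QA model from the Hugging Face Hub
nlp = pipeline("question-answering", model="PlanTL-GOB-ES/roberta-base-bne-sqac")

text = "¿Dónde vivo?"
context = "Me llamo Wolfgang y vivo en Berlin"

qa_results = nlp(question=text, context=context)
print(qa_results)
```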
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
Training
--------
### Training data
We used the QA dataset in Spanish called SQAC corpus for training and evaluation.
### Training procedure
The model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
Evaluation results
------------------
We evaluated the roberta-base-bne-sqac on the SQAC test set against standard multilingual and monolingual baselines:
For more details, check the fine-tuning and evaluation scripts in the official GitHub repository.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our paper:
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
| [
"### Training data\n\n\nWe used the QA dataset in Spanish called SQAC corpus for training and evaluation.",
"### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-sqac on the SQAC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.",
"### Citing information\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
"TAGS\n#transformers #pytorch #roberta #question-answering #national library of spain #spanish #bne #qa #question answering #es #dataset-PlanTL-GOB-ES/SQAC #arxiv-1907.11692 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"### Training data\n\n\nWe used the QA dataset in Spanish called SQAC corpus for training and evaluation.",
"### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-sqac on the SQAC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.",
"### Citing information\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
91,
23,
131,
28,
37,
22,
12,
34,
16,
363
] | [
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #national library of spain #spanish #bne #qa #question answering #es #dataset-PlanTL-GOB-ES/SQAC #arxiv-1907.11692 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n### Training data\n\n\nWe used the QA dataset in Spanish called SQAC corpus for training and evaluation.### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-sqac on the SQAC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.### Citing information\n\n\nIf you use this model, please cite our paper:"
] | [
-0.06436365842819214,
0.20910683274269104,
-0.00667588971555233,
0.05515431612730026,
0.13713438808918,
-0.022855816408991814,
0.056986842304468155,
0.11437749117612839,
0.006370719522237778,
0.10568217933177948,
-0.01021106168627739,
0.024036221206188202,
0.09686021506786346,
0.11708657443523407,
0.030307510867714882,
-0.1394128054380417,
-0.023310722783207893,
-0.09831421822309494,
-0.015270084142684937,
0.10492410510778427,
0.09447555243968964,
-0.0671597570180893,
0.04824298992753029,
-0.03424688056111336,
-0.0036300362553447485,
0.0944446250796318,
-0.07591354101896286,
-0.07544726133346558,
0.06269600987434387,
0.07033071666955948,
0.07105719298124313,
0.022913139313459396,
0.04631144925951958,
-0.22972331941127777,
0.01956840418279171,
0.042559314519166946,
0.003657463937997818,
0.04024423286318779,
0.09735164046287537,
-0.02667064405977726,
0.13941124081611633,
-0.09171807020902634,
0.010920746251940727,
0.03597331792116165,
-0.11376896500587463,
-0.0996890664100647,
-0.11416091024875641,
0.0772501528263092,
0.09710747748613358,
0.04161800444126129,
-0.03591655194759369,
0.06609856337308884,
-0.07710625976324081,
0.020169757306575775,
0.04259883239865303,
-0.18313558399677277,
-0.03732654079794884,
0.027613883838057518,
0.043368298560380936,
0.13221052289009094,
-0.08520807325839996,
0.004130152985453606,
0.04214395210146904,
-0.024314291775226593,
-0.014466064982116222,
-0.028053229674696922,
-0.07569658756256104,
0.033575668931007385,
-0.10021626204252243,
-0.12470504641532898,
0.16626964509487152,
0.008427528664469719,
-0.08392353355884552,
-0.10460507869720459,
-0.0062680612318217754,
0.031156091019511223,
0.049069251865148544,
-0.045836981385946274,
0.02795957401394844,
-0.011188113130629063,
0.061455171555280685,
-0.022219020873308182,
-0.1090320497751236,
-0.04288801550865173,
-0.020945901051163673,
0.0749109759926796,
0.020275991410017014,
-0.014112258329987526,
-0.011431118473410606,
0.15066088736057281,
0.06024223566055298,
-0.12551355361938477,
-0.014300663024187088,
-0.010938392952084541,
-0.04506468400359154,
-0.044718675315380096,
0.04507236182689667,
-0.027589058503508568,
0.10512533783912659,
0.19916829466819763,
-0.01344261597841978,
0.004423054400831461,
-0.019108837470412254,
0.005807072855532169,
0.07852553576231003,
0.15965315699577332,
-0.06365601718425751,
-0.07352779060602188,
-0.009442041628062725,
0.024696754291653633,
0.006758348550647497,
0.03257729858160019,
-0.019147619605064392,
0.014630088582634926,
0.015948878601193428,
0.13916589319705963,
0.07645182311534882,
-0.0271395705640316,
-0.08133282512426376,
-0.01591263897716999,
0.17983850836753845,
-0.14883632957935333,
0.027368780225515366,
0.027853241190314293,
-0.08048644661903381,
-0.007766056340187788,
-0.036485705524683,
-0.019378703087568283,
-0.09911580383777618,
0.051631394773721695,
-0.03130533918738365,
-0.02713942527770996,
-0.07438942790031433,
-0.05149981752038002,
0.0663551390171051,
-0.05224858969449997,
0.0005499523831531405,
-0.09291199594736099,
-0.08609296381473541,
-0.08525370061397552,
0.04088199883699417,
-0.1045062392950058,
0.01499013602733612,
-0.013596612960100174,
0.006337686441838741,
0.017482798546552658,
-0.05367906019091606,
0.04085402190685272,
-0.06998561322689056,
0.055807001888751984,
0.029976973310112953,
0.029814139008522034,
0.05970897898077965,
0.0033135346602648497,
-0.08741160482168198,
-0.006496673449873924,
-0.1339162439107895,
0.10917364060878754,
-0.09669090062379837,
0.025736873969435692,
-0.17892006039619446,
-0.022145643830299377,
-0.0012710735900327563,
0.022715969011187553,
0.040466949343681335,
0.1970309615135193,
-0.105125293135643,
-0.02349090576171875,
0.134817436337471,
-0.04215426743030548,
-0.06304015219211578,
0.11156615614891052,
-0.012424799613654613,
0.07683273404836655,
0.055137865245342255,
0.06618138402700424,
0.14422518014907837,
-0.17842261493206024,
-0.057355817407369614,
-0.016368083655834198,
-0.02714165858924389,
0.07364631444215775,
0.12516218423843384,
-0.08905373513698578,
0.09673674404621124,
0.03516339510679245,
-0.1442147046327591,
-0.017747463658452034,
-0.02218654938042164,
-0.04618849977850914,
0.049356020987033844,
-0.0004899562918581069,
-0.038185276091098785,
0.000879586033988744,
-0.02099738083779812,
-0.031465984880924225,
-0.094760961830616,
-0.04309253767132759,
0.04118315875530243,
0.006881121080368757,
-0.01028179656714201,
-0.11422038078308105,
0.06179817020893097,
-0.03888825699687004,
0.009912148118019104,
-0.1819450557231903,
-0.02693660743534565,
0.05051552876830101,
-0.07469136267900467,
0.09456060826778412,
-0.03595014289021492,
0.02210417576134205,
0.003937975037842989,
-0.03668633848428726,
-0.020541062578558922,
-0.03861170634627342,
-0.0715394988656044,
0.0032151404302567244,
-0.10484582930803299,
-0.004490753170102835,
-0.03461817279458046,
0.08238504827022552,
-0.10273116081953049,
0.020095931366086006,
0.1316513568162918,
0.07932613044977188,
-0.014203107915818691,
-0.03249117359519005,
0.029826970770955086,
0.012399361468851566,
-0.016925064846873283,
-0.05511502921581268,
0.0307198204100132,
0.006890926975756884,
-0.027471331879496574,
-0.03562378138303757,
-0.03364662453532219,
-0.029536761343479156,
0.043707143515348434,
0.10146792978048325,
-0.03964081034064293,
-0.03028036653995514,
-0.04945652186870575,
0.0034228411968797445,
-0.0691840648651123,
-0.03778780251741409,
0.23297211527824402,
0.0180815402418375,
0.06030513346195221,
-0.12640567123889923,
-0.08349888771772385,
-0.022938121110200882,
-0.03879682347178459,
-0.05442972853779793,
0.1403380036354065,
0.028905175626277924,
-0.0708746537566185,
0.06848183274269104,
-0.02106749266386032,
0.06456220149993896,
0.19234338402748108,
-0.0037025820929557085,
-0.10267660766839981,
-0.018116740509867668,
0.10822608321905136,
0.023336010053753853,
0.056700415909290314,
-0.03157549723982811,
-0.019131895154714584,
0.05275775119662285,
-0.007065283600240946,
0.0648295134305954,
-0.1381472647190094,
0.029813097789883614,
0.007086030673235655,
-0.062090110033750534,
-0.030593764036893845,
0.020825613290071487,
-0.020295923575758934,
0.07359664142131805,
0.04371464625000954,
0.06544230878353119,
-0.04490918293595314,
-0.0526399165391922,
-0.09879810363054276,
0.12686705589294434,
-0.05223271623253822,
-0.2968417704105377,
-0.24958793818950653,
0.052938539534807205,
-0.03231823444366455,
0.022491125389933586,
0.05512883886694908,
-0.11453098803758621,
-0.08452941477298737,
-0.0460810661315918,
-0.0027119778096675873,
0.11230463534593582,
-0.09029565751552582,
0.015595766715705395,
0.06352014094591141,
0.015440869145095348,
-0.11687787622213364,
0.004933515563607216,
0.053356707096099854,
-0.03522498905658722,
-0.027874313294887543,
0.02584683708846569,
0.13579855859279633,
0.05688890814781189,
0.017698820680379868,
-0.015951529145240784,
-0.003853938076645136,
0.1940915286540985,
-0.14591184258460999,
0.04563063755631447,
0.2139970362186432,
0.019802089780569077,
0.02504127472639084,
0.17884229123592377,
0.014523817226290703,
-0.06305240094661713,
0.040888916701078415,
0.014058547094464302,
-0.02951657585799694,
-0.2848331034183502,
-0.05811905860900879,
-0.037890978157520294,
-0.08572927862405777,
0.04561711475253105,
0.09355879575014114,
-0.03277850151062012,
0.008995798416435719,
-0.059683166444301605,
-0.07096272706985474,
0.06279423832893372,
0.09639213234186172,
0.012212628498673439,
0.05102471634745598,
0.0331856943666935,
-0.07522960752248764,
-0.053484100848436356,
0.12177356332540512,
0.10045132786035538,
0.12878349423408508,
0.022555673494935036,
0.14425110816955566,
0.07485897839069366,
0.06734713912010193,
-0.01783253811299801,
0.03913261741399765,
0.0547771081328392,
0.018819620832800865,
-0.03280319646000862,
-0.06936059147119522,
-0.043106988072395325,
-0.002727117156609893,
0.004937784280627966,
-0.04784892871975899,
-0.01972130872309208,
-0.0891011506319046,
0.09326381981372833,
0.15451204776763916,
-0.02410818636417389,
-0.13785909116268158,
-0.040716320276260376,
0.036719657480716705,
-0.09148526191711426,
-0.05715939402580261,
-0.041676320135593414,
0.04837518930435181,
-0.16070136427879333,
0.015963509678840637,
-0.0036322614178061485,
0.11683239042758942,
-0.0621659979224205,
-0.014982128515839577,
0.0021829702891409397,
-0.004159043543040752,
0.0045039597898721695,
0.10628784447908401,
-0.1374005824327469,
0.17049944400787354,
0.010709022171795368,
0.11694879084825516,
-0.036699797958135605,
0.06061552092432976,
-0.02823072299361229,
-0.01908453181385994,
0.14055263996124268,
-0.012397734448313713,
0.0020144290756434202,
-0.04518570750951767,
-0.05031919479370117,
0.025386415421962738,
0.07050624489784241,
-0.16087184846401215,
0.1327109932899475,
-0.007193167228251696,
-0.012387963943183422,
-0.11171634495258331,
-0.060266800224781036,
-0.07769985496997833,
-0.17129234969615936,
0.01995014399290085,
-0.11497299373149872,
0.107699453830719,
-0.02979234606027603,
-0.03417856991291046,
-0.03931040316820145,
0.15217027068138123,
-0.23087921738624573,
-0.09844079613685608,
-0.11707044392824173,
0.006618180777877569,
0.11355427652597427,
-0.07369127124547958,
0.023914512246847153,
-0.008592084981501102,
0.07989116758108139,
0.04817558825016022,
-0.005243790801614523,
0.007236936129629612,
-0.06882480531930923,
-0.11945807188749313,
-0.014574181288480759,
0.1562676578760147,
0.053869545459747314,
0.02599853090941906,
0.003829095745459199,
-0.03286261856555939,
0.007559264078736305,
-0.09521465003490448,
-0.032825883477926254,
0.09661168605089188,
0.14782801270484924,
0.08707420527935028,
-0.02227834425866604,
-0.15657956898212433,
-0.1453557163476944,
-0.0784887745976448,
0.054331425577402115,
0.22104349732398987,
-0.00009049189247889444,
0.09230416268110275,
0.20501708984375,
-0.1281365156173706,
-0.13713356852531433,
-0.09980633109807968,
0.055442824959754944,
0.023371703922748566,
0.020266693085432053,
-0.18007200956344604,
-0.018565310165286064,
0.08491166681051254,
-0.00611578393727541,
-0.029658349230885506,
-0.24988052248954773,
-0.11650513112545013,
-0.0015253971796482801,
0.0018777659861370921,
-0.0821884423494339,
-0.1328759342432022,
-0.0888592079281807,
-0.06044723093509674,
-0.17270231246948242,
0.038791507482528687,
0.0138306999579072,
0.037321802228689194,
0.0020070478785783052,
0.011885495856404305,
0.03207257390022278,
-0.027042049914598465,
0.18335218727588654,
-0.018359340727329254,
0.0491933599114418,
-0.03657796233892441,
0.006111979018896818,
0.07472988218069077,
-0.009426590986549854,
0.0985681563615799,
0.004696633201092482,
0.01173730380833149,
-0.1337893009185791,
-0.05516652762889862,
-0.030403679236769676,
0.009570575319230556,
-0.03411424160003662,
-0.003096918109804392,
-0.09872166067361832,
0.06991726160049438,
0.05300994962453842,
0.005257890559732914,
0.004731577821075916,
-0.08181090652942657,
0.03379422798752785,
0.16503410041332245,
0.17285022139549255,
0.07144889235496521,
-0.041050586849451065,
-0.020576225593686104,
0.0009188159601762891,
0.043159276247024536,
-0.10261182487010956,
0.03316747397184372,
0.1330236941576004,
-0.006245564203709364,
0.09915599972009659,
-0.03022167645394802,
-0.1151222363114357,
0.020991720259189606,
0.14708122611045837,
-0.06896563619375229,
-0.17343436181545258,
-0.012278844602406025,
-0.03734077513217926,
-0.1136668473482132,
-0.020285990089178085,
0.09410760551691055,
0.03851564601063728,
-0.07399967312812805,
0.017348874360322952,
0.05969644710421562,
0.0023543990682810545,
0.1333521604537964,
0.019757529720664024,
0.056641194969415665,
-0.061567388474941254,
0.13063053786754608,
0.11950671672821045,
-0.11408278346061707,
-0.023378925397992134,
0.07582234591245651,
-0.06551279872655869,
-0.026461774483323097,
0.04383544251322746,
0.017515214160084724,
-0.0721973404288292,
-0.08329489827156067,
-0.08065062016248703,
-0.007053635083138943,
-0.004760283045470715,
0.06265690177679062,
0.016478024423122406,
0.025313524529337883,
0.030635835602879524,
0.040653765201568604,
-0.03497384116053581,
0.07408729195594788,
0.09989386051893234,
-0.033448226749897,
-0.10578955709934235,
0.01789495162665844,
-0.00004709773202193901,
-0.037329912185668945,
-0.017680218443274498,
-0.01563951186835766,
-0.11355407536029816,
0.015086564235389233,
-0.06025910750031471,
0.05607054382562637,
-0.041963353753089905,
-0.006598432548344135,
-0.0032649484928697348,
-0.04885941371321678,
-0.05056959390640259,
0.00431197602301836,
-0.05652854964137077,
-0.03220492973923683,
-0.021082768216729164,
0.1355525255203247,
-0.1445275843143463,
0.028506524860858917,
0.09551519900560379,
-0.059384364634752274,
0.07175865769386292,
-0.0005268760723993182,
-0.01429719291627407,
0.07989921420812607,
-0.16087540984153748,
0.034510765224695206,
0.01693039759993553,
0.052850015461444855,
0.01821061410009861,
-0.09892593324184418,
0.0430319719016552,
0.03834870457649231,
-0.05055695399641991,
0.026172520592808723,
0.022972770035266876,
-0.08523048460483551,
0.007284762803465128,
0.03208708390593529,
-0.06156614050269127,
-0.05936206504702568,
0.08642464131116867,
0.14112041890621185,
0.0513574443757534,
0.11565480381250381,
-0.0804535374045372,
-0.01627228781580925,
-0.13101044297218323,
-0.0030503678135573864,
0.008910102769732475,
0.024015890434384346,
-0.042573925107717514,
-0.05656511336565018,
0.044968441128730774,
0.015283777378499508,
0.14722277224063873,
0.06346310675144196,
0.1279202401638031,
0.0262238048017025,
0.010715995915234089,
0.019669586792588234,
0.025559354573488235,
0.024119025096297264,
0.006347936578094959,
0.01184737030416727,
-0.01907758042216301,
-0.04758920148015022,
-0.06301657110452652,
-0.11206679046154022,
0.05899032577872276,
0.1370968073606491,
0.11154752224683762,
0.040257204324007034,
0.003164202906191349,
-0.06067347526550293,
-0.04623576998710632,
-0.035307761281728745,
-0.020204799249768257,
0.0063638463616371155,
-0.07863065600395203,
0.07674439996480942,
0.20069092512130737,
-0.18452227115631104,
0.09375520795583725,
-0.0384388230741024,
-0.055698979645967484,
-0.06438031047582626,
-0.19100947678089142,
-0.029780898243188858,
-0.07697778195142746,
0.04056128114461899,
-0.09108162671327591,
0.10086791217327118,
0.014773890376091003,
-0.011137410998344421,
-0.07925362139940262,
0.07421562075614929,
-0.05213790759444237,
-0.11965085566043854,
0.05466967448592186,
-0.004794152919203043,
0.10936214029788971,
-0.01859394833445549,
0.10544028133153915,
0.015531312674283981,
0.08292515575885773,
0.08171568065881729,
0.10411370545625687,
0.03532993048429489,
-0.01158760767430067,
-0.07776881009340286,
-0.03395765647292137,
0.015103347599506378,
0.004186098463833332,
0.010354959405958652,
0.19487009942531586,
0.03540216013789177,
-0.016438279300928116,
0.027178898453712463,
0.2650220990180969,
-0.019456228241324425,
-0.04552484676241875,
-0.15229184925556183,
0.08258616924285889,
0.030322687700390816,
0.06516893953084946,
0.023947374895215034,
-0.13604271411895752,
-0.042731545865535736,
0.12087477743625641,
0.06188184395432472,
0.002710101194679737,
-0.018247000873088837,
-0.025630170479416847,
0.02627522498369217,
-0.002937519224360585,
0.07477369904518127,
0.04419320449233055,
0.23009173572063446,
-0.04958652704954147,
0.048772767186164856,
-0.00734289363026619,
0.007089418359100819,
-0.029942873865365982,
0.12629573047161102,
-0.056020691990852356,
-0.016843529418110847,
-0.07558271288871765,
0.14987485110759735,
-0.07902663201093674,
-0.2929273545742035,
0.016289031133055687,
-0.031692638993263245,
-0.14354342222213745,
-0.0002954895026050508,
0.027680592611432076,
0.007338210474699736,
0.04626592621207237,
0.0418144129216671,
-0.04092630743980408,
0.09041713178157806,
0.04172644764184952,
-0.01995830424129963,
-0.05115332081913948,
0.06131906434893608,
-0.08633139729499817,
0.21139279007911682,
-0.004711959045380354,
0.09580731391906738,
0.09422484040260315,
-0.06567087024450302,
-0.1582048088312149,
0.02569758892059326,
0.054690681397914886,
-0.042085159569978714,
0.11257295310497284,
0.0852200835943222,
0.025558248162269592,
-0.01395547017455101,
0.07494038343429565,
0.06310983747243881,
0.0395793579518795,
0.03515734151005745,
0.08517377823591232,
-0.16976121068000793,
0.10453087836503983,
-0.12760384380817413,
0.08023438602685928,
0.0857028141617775,
-0.03094620630145073,
0.08410723507404327,
-0.053039293736219406,
0.09477297216653824,
0.009488897398114204,
0.16791671514511108,
0.02972353622317314,
-0.18011103570461273,
0.025857839733362198,
-0.03335977718234062,
0.042777370661497116,
-0.17780259251594543,
-0.019588448107242584,
0.038856569677591324,
0.0016258028335869312,
-0.053607843816280365,
0.13305190205574036,
0.031532660126686096,
0.0236599612981081,
-0.006496659945696592,
-0.20094960927963257,
-0.01206058170646429,
0.07575643062591553,
-0.12152130901813507,
0.0033157323487102985
] |
null | null | transformers |
# RoBERTa base trained with data from the National Library of Spain (BNE)
## Table of Contents
<details>
<summary>Click to expand</summary>
- [Overview](#overview)
- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Training data](#training-data)
- [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citation Information](#citation-information)
- [Disclaimer](#disclaimer)
</details>
## Overview
- **Architecture:** roberta-base
- **Language:** Spanish
- **Task:** fill-mask
- **Data:** BNE
## Model description
The **roberta-base-bne** is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) base model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-base-bne** model is ready-to-use only for masked language modeling to perform the Fill Mask task (try the inference API or read the next section).
However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition.
You can use the raw model for fill mask or fine-tune it to a downstream task.
## How to use
Here is how to use this model:
```python
>>> from transformers import pipeline
>>> from pprint import pprint
>>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/roberta-base-bne')
>>> pprint(unmasker("Gracias a los datos de la BNE se ha podido <mask> este modelo del lenguaje."))
[{'score': 0.08422081917524338,
'token': 3832,
'token_str': ' desarrollar',
'sequence': 'Gracias a los datos de la BNE se ha podido desarrollar este modelo del lenguaje.'},
{'score': 0.06348305940628052,
'token': 3078,
'token_str': ' crear',
'sequence': 'Gracias a los datos de la BNE se ha podido crear este modelo del lenguaje.'},
{'score': 0.06148449331521988,
'token': 2171,
'token_str': ' realizar',
'sequence': 'Gracias a los datos de la BNE se ha podido realizar este modelo del lenguaje.'},
{'score': 0.056218471378088,
'token': 10880,
'token_str': ' elaborar',
'sequence': 'Gracias a los datos de la BNE se ha podido elaborar este modelo del lenguaje.'},
{'score': 0.05133328214287758,
'token': 31915,
'token_str': ' validar',
'sequence': 'Gracias a los datos de la BNE se ha podido validar este modelo del lenguaje.'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
>>> from transformers import RobertaTokenizer, RobertaModel
>>> tokenizer = RobertaTokenizer.from_pretrained('PlanTL-GOB-ES/roberta-base-bne')
>>> model = RobertaModel.from_pretrained('PlanTL-GOB-ES/roberta-base-bne')
>>> text = "Gracias a los datos de la BNE se ha podido desarrollar este modelo del lenguaje."
>>> encoded_input = tokenizer(text, return_tensors='pt')
>>> output = model(**encoded_input)
>>> print(output.last_hidden_state.shape)
torch.Size([1, 19, 768])
```
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here's an example of how the model can have biased predictions:
```python
>>> from transformers import pipeline, set_seed
>>> from pprint import pprint
>>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/roberta-base-bne')
>>> set_seed(42)
>>> pprint(unmasker("Antonio está pensando en <mask>."))
[{'score': 0.07950365543365479,
'sequence': 'Antonio está pensando en ti.',
'token': 486,
'token_str': ' ti'},
{'score': 0.03375273942947388,
'sequence': 'Antonio está pensando en irse.',
'token': 13134,
'token_str': ' irse'},
{'score': 0.031026942655444145,
'sequence': 'Antonio está pensando en casarse.',
'token': 24852,
'token_str': ' casarse'},
{'score': 0.030703715980052948,
'sequence': 'Antonio está pensando en todo.',
'token': 665,
'token_str': ' todo'},
{'score': 0.02838558703660965,
'sequence': 'Antonio está pensando en ello.',
'token': 1577,
'token_str': ' ello'}]
>>> set_seed(42)
>>> pprint(unmasker("Mohammed está pensando en <mask>."))
[{'score': 0.05433618649840355,
'sequence': 'Mohammed está pensando en morir.',
'token': 9459,
'token_str': ' morir'},
{'score': 0.0400255024433136,
'sequence': 'Mohammed está pensando en irse.',
'token': 13134,
'token_str': ' irse'},
{'score': 0.03705748915672302,
'sequence': 'Mohammed está pensando en todo.',
'token': 665,
'token_str': ' todo'},
{'score': 0.03658654913306236,
'sequence': 'Mohammed está pensando en quedarse.',
'token': 9331,
'token_str': ' quedarse'},
{'score': 0.03329474478960037,
'sequence': 'Mohammed está pensando en ello.',
'token': 1577,
'token_str': ' ello'}]
```
## Training
### Training data
The [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.
To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including, among others, sentence splitting, language detection, filtering of badly formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of clean Spanish corpus. Further global deduplication was then applied to the corpus, resulting in 570GB of text.
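The exact cleaning pipeline is described in the referenced paper and repository; purely as an illustration of the kind of filtering and exact-deduplication steps involved, a minimal, standard-library-only sketch (the file name, length threshold, and heuristics below are hypothetical and far simpler than the real pipeline) could look like this:

```python
import hashlib
import re

def clean_corpus(lines, min_chars=25):
    """Toy illustration: naive sentence filtering plus exact deduplication.

    The real pipeline also performs sentence splitting, language detection,
    and document-level handling, which are omitted here.
    """
    seen = set()
    for line in lines:
        text = line.strip()
        # Drop very short or badly formed lines (toy heuristic).
        if len(text) < min_chars or not re.search(r"[a-záéíóúñ]", text, re.IGNORECASE):
            continue
        # Exact deduplication via a hash of the normalized text.
        digest = hashlib.sha1(text.lower().encode("utf-8")).hexdigest()
        if digest in seen:
            continue
        seen.add(digest)
        yield text

if __name__ == "__main__":
    # Hypothetical corpus shard; the real data comes from WARC files.
    with open("raw_corpus_sample.txt", encoding="utf-8") as f:
        for sentence in clean_corpus(f):
            print(sentence)
```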
Some of the statistics of the corpus:
| Corpora | Number of documents | Number of tokens | Size (GB) |
|---------|---------------------|------------------|-----------|
| BNE | 201,080,084 | 135,733,450,668 | 570GB |
### Training procedure
The training corpus has been tokenized using a byte-level version of Byte-Pair Encoding (BPE), as used in the original [RoBERTa](https://arxiv.org/abs/1907.11692) model, with a vocabulary size of 50,262 tokens.
The **roberta-base-bne** pre-training consists of masked language model training following the approach employed for the RoBERTa base model. Training lasted a total of 48 hours on 16 computing nodes, each with 4 NVIDIA V100 GPUs of 16GB VRAM.
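As a rough sketch of how a byte-level BPE tokenizer with this vocabulary size can be trained using the Hugging Face `tokenizers` library (the corpus file, `min_frequency`, and special tokens below are assumptions, not the exact configuration used for this model):

```python
import os
from tokenizers import ByteLevelBPETokenizer

# Train a byte-level BPE tokenizer, as used by RoBERTa-style models.
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["spanish_corpus.txt"],  # hypothetical plain-text corpus shard
    vocab_size=50262,              # vocabulary size reported above
    min_frequency=2,               # assumed frequency cutoff
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)

os.makedirs("bpe-tokenizer", exist_ok=True)
tokenizer.save_model("bpe-tokenizer")  # writes vocab.json and merges.txt
```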
## Evaluation
When fine-tuned on downstream tasks, this model achieves the following results:
| Dataset | Metric | [**RoBERTa-base**](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) |
|--------------|----------|------------|
| MLDoc | F1 | 0.9664 |
| CoNLL-NERC | F1 | 0.8851 |
| CAPITEL-NERC | F1 | 0.8960 |
| PAWS-X | F1 | 0.9020 |
| UD-POS | F1 | 0.9907 |
| CAPITEL-POS | F1 | 0.9846 |
| SQAC | F1 | 0.7923 |
| STS | Combined | 0.8533 |
| XNLI | Accuracy | 0.8016 |
For more evaluation details visit our [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish) or [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405).
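The actual fine-tuning and evaluation scripts are in the GitHub repository linked above; as a minimal sketch of how the model can be fine-tuned on a generic text-classification task with the `transformers` Trainer (the CSV files, label count, and hyperparameters below are illustrative assumptions, not the settings behind the results above):

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "PlanTL-GOB-ES/roberta-base-bne"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical binary-classification dataset with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "train.csv", "validation": "dev.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="roberta-base-bne-finetuned",
    learning_rate=5e-5,              # illustrative value
    per_device_train_batch_size=16,  # illustrative value
    num_train_epochs=5,              # illustrative value
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
)
trainer.train()
```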
## Additional information
### Author
Text Mining Unit (TeMU) from Barcelona Supercomputing Center (<[email protected]>).
### Contact information
For further information, send an email to <[email protected]>.
### Copyright
Copyright by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://portal.mineco.gob.es/en-us/digitalizacionIA/Pages/sedia.aspx).
### Licensing information
This work is licensed under the [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0).
### Funding
This work was funded by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://portal.mineco.gob.es/en-us/digitalizacionIA/Pages/sedia.aspx) within the framework of the Plan-TL.
### Citation information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
title = {MarIA: Spanish Language Models},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
volume = {68},
year = {2022},
}
```
### Disclaimer
<details>
<summary>Click to expand</summary>
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models (SEDIA) nor the creator (BSC) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de Inteligencia Artificial.
En ningún caso el propietario de los modelos (SEDIA) ni el creador (BSC) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
</details> | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "roberta-base-bne"], "datasets": ["bne"], "metrics": ["ppl"], "widget": [{"text": "Por la ventanilla del coche vi la Giralda y pens\u00e9 que bonita que es la ciudad de <mask>."}, {"text": "M\u00e1s vale <mask> que lamentar."}, {"text": "Caminante no hay camino, se hace camino al <mask>."}, {"text": "Tengo una pelota roja y otra amarilla. Si le doy la roja a Jose, s\u00f3lo me queda la <mask>."}, {"text": "Tengo una pelota roja y otra amarilla. Si le doy la amarilla a Jose, s\u00f3lo me queda la <mask>."}, {"text": "El <mask> es el pico m\u00e1s alto de Espa\u00f1a."}]} | fill-mask | PlanTL-GOB-ES/roberta-base-bne | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"national library of spain",
"spanish",
"bne",
"roberta-base-bne",
"es",
"dataset:bne",
"arxiv:1907.11692",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #fill-mask #national library of spain #spanish #bne #roberta-base-bne #es #dataset-bne #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
| RoBERTa base trained with data from the National Library of Spain (BNE)
=======================================================================
Table of Contents
-----------------
Click to expand
* Overview
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
+ Training data
+ Training procedure
* Evaluation
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citation Information
+ Disclaimer
Overview
--------
* Architecture: roberta-base
* Language: Spanish
* Task: fill-mask
* Data: BNE
Model description
-----------------
The roberta-base-bne is a transformer-based masked language model for the Spanish language. It is based on the RoBERTa base model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
The roberta-base-bne model is ready-to-use only for masked language modeling to perform the Fill Mask task (try the inference API or read the next section).
However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition.
You can use the raw model for fill mask or fine-tune it to a downstream task.
How to use
----------
Here is how to use this model:
Here is how to use this model to get the features of a given text in PyTorch:
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here's an example of how the model can have biased predictions:
Training
--------
### Training data
The National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.
To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including, among others, sentence splitting, language detection, filtering of badly formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of clean Spanish corpus. Further global deduplication was then applied to the corpus, resulting in 570GB of text.
Some of the statistics of the corpus:
### Training procedure
The training corpus has been tokenized using a byte-level version of Byte-Pair Encoding (BPE), as used in the original RoBERTa model, with a vocabulary size of 50,262 tokens.
The roberta-base-bne pre-training consists of masked language model training following the approach employed for the RoBERTa base model. Training lasted a total of 48 hours on 16 computing nodes, each with 4 NVIDIA V100 GPUs of 16GB VRAM.
Evaluation
----------
When fine-tuned on downstream tasks, this model achieves the following results:
Dataset: MLDoc, Metric: F1, RoBERTa-base: 0.9664
Dataset: CoNLL-NERC, Metric: F1, RoBERTa-base: 0.8851
Dataset: CAPITEL-NERC, Metric: F1, RoBERTa-base: 0.8960
Dataset: PAWS-X, Metric: F1, RoBERTa-base: 0.9020
Dataset: UD-POS, Metric: F1, RoBERTa-base: 0.9907
Dataset: CAPITEL-POS, Metric: F1, RoBERTa-base: 0.9846
Dataset: SQAC, Metric: F1, RoBERTa-base: 0.7923
Dataset: STS, Metric: Combined, RoBERTa-base: 0.8533
Dataset: XNLI, Metric: Accuracy, RoBERTa-base: 0.8016
For more evaluation details visit our GitHub repository or paper.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) from Barcelona Supercomputing Center ([bsc-temu@URL](mailto:bsc-temu@URL)).
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL).
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA).
### Licensing information
This work is licensed under the Apache License, Version 2.0.
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citation information
If you use this model, please cite our paper:
### Disclaimer
Click to expand
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models (SEDIA) nor the creator (BSC) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de Inteligencia Artificial.
En ningún caso el propietario de los modelos (SEDIA) ni el creador (BSC) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
| [
"### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:",
"### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe roberta-base-bne pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa base. The training lasted a total of 48 hours with 16 computing nodes, each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nDataset: MLDoc, Metric: F1, RoBERTa-base: 0.9664\nDataset: CoNLL-NERC, Metric: F1, RoBERTa-base: 0.8851\nDataset: CAPITEL-NERC, Metric: F1, RoBERTa-base: 0.8960\nDataset: PAWS-X, Metric: F1, RoBERTa-base: 0.9020\nDataset: UD-POS, Metric: F1, RoBERTa-base: 0.9907\nDataset: CAPITEL-POS, Metric: F1, RoBERTa-base: 0.9846\nDataset: SQAC, Metric: F1, RoBERTa-base: 0.7923\nDataset: STS, Metric: Combined, RoBERTa-base: 0.8533\nDataset: XNLI, Metric: Accuracy, RoBERTa-base: 0.8016\n\n\nFor more evaluation details visit our GitHub repository or paper.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) from Barcelona Supercomputing Center ([bsc-temu@URL](mailto:bsc-temu@URL)).",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL).",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA).",
"### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA) nor the creator (BSC) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de Inteligencia Artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA) ni el creador (BSC) serán responsables de los resultados derivados del uso que hagan terceros de estos models."
] | [
"TAGS\n#transformers #pytorch #roberta #fill-mask #national library of spain #spanish #bne #roberta-base-bne #es #dataset-bne #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:",
"### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe roberta-base-bne pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa base. The training lasted a total of 48 hours with 16 computing nodes, each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nDataset: MLDoc, Metric: F1, RoBERTa-base: 0.9664\nDataset: CoNLL-NERC, Metric: F1, RoBERTa-base: 0.8851\nDataset: CAPITEL-NERC, Metric: F1, RoBERTa-base: 0.8960\nDataset: PAWS-X, Metric: F1, RoBERTa-base: 0.9020\nDataset: UD-POS, Metric: F1, RoBERTa-base: 0.9907\nDataset: CAPITEL-POS, Metric: F1, RoBERTa-base: 0.9846\nDataset: SQAC, Metric: F1, RoBERTa-base: 0.7923\nDataset: STS, Metric: Combined, RoBERTa-base: 0.8533\nDataset: XNLI, Metric: Accuracy, RoBERTa-base: 0.8016\n\n\nFor more evaluation details visit our GitHub repository or paper.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) from Barcelona Supercomputing Center ([bsc-temu@URL](mailto:bsc-temu@URL)).",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL).",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA).",
"### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA) nor the creator (BSC) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de Inteligencia Artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA) ni el creador (BSC) serán responsables de los resultados derivados del uso que hagan terceros de estos models."
] | [
86,
160,
373,
41,
37,
20,
19,
46,
329
] | [
"passage: TAGS\n#transformers #pytorch #roberta #fill-mask #national library of spain #spanish #bne #roberta-base-bne #es #dataset-bne #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:"
] | [
-0.026672355830669403,
0.08837445825338364,
-0.00524394866079092,
0.03905853256583214,
0.09903594106435776,
0.038263458758592606,
0.00974984746426344,
0.08988680690526962,
-0.08972584456205368,
-0.00775254238396883,
0.047148577868938446,
0.13611502945423126,
-0.004443035926669836,
-0.02561848796904087,
0.031087348237633705,
-0.1897028684616089,
0.05789173021912575,
-0.06846750527620316,
-0.10314563661813736,
0.000002590298663562862,
0.045186351984739304,
-0.019752154126763344,
0.007969193160533905,
-0.09645169228315353,
0.015447620302438736,
0.07728633284568787,
-0.051741503179073334,
-0.13178615272045135,
0.1558569222688675,
0.05176534503698349,
0.07374117523431778,
0.005260080564767122,
0.011801308952271938,
-0.05473820120096207,
0.011317438445985317,
0.0399969145655632,
-0.03459421172738075,
0.03268961235880852,
0.012339377775788307,
-0.08038914203643799,
0.11705560237169266,
-0.05613819137215614,
-0.008284766227006912,
0.0037521987687796354,
-0.18284036219120026,
-0.2622779905796051,
-0.07231830805540085,
-0.12970152497291565,
-0.004951252602040768,
0.019253022968769073,
0.015445970930159092,
0.04356960952281952,
-0.08140221238136292,
-0.010501348413527012,
0.09613372385501862,
-0.23288775980472565,
0.005393928848206997,
0.13066880404949188,
0.1180962398648262,
0.13161367177963257,
-0.005872823763638735,
0.030154699459671974,
0.027468016371130943,
0.024332338944077492,
-0.03848070278763771,
-0.11179906874895096,
-0.14507509768009186,
0.0030182499904185534,
-0.048680875450372696,
-0.1130753830075264,
0.17841826379299164,
-0.10346341878175735,
-0.08198393881320953,
-0.006710529327392578,
-0.043575454503297806,
0.05066094174981117,
0.03927532583475113,
0.001200638827867806,
0.05327843129634857,
0.020012445747852325,
0.03672906011343002,
-0.034954436123371124,
-0.0674414336681366,
-0.0479840524494648,
-0.19449906051158905,
0.1256379932165146,
0.025936778634786606,
-0.008034823462367058,
0.02620721235871315,
0.10578808188438416,
0.026103826239705086,
-0.03591340407729149,
0.009927650913596153,
-0.003749330062419176,
-0.017921078950166702,
0.05831069499254227,
-0.07278721779584885,
-0.11803358793258667,
0.03485796973109245,
0.08839283138513565,
-0.07713109254837036,
-0.034116752445697784,
-0.05042283236980438,
0.07347740232944489,
0.08628163486719131,
-0.002626272616907954,
-0.06732521951198578,
-0.037751782685518265,
0.013508581556379795,
-0.12700630724430084,
0.020215751603245735,
0.009938393719494343,
-0.1102035790681839,
-0.09701567888259888,
-0.03536216914653778,
0.12384393066167831,
-0.014409860596060753,
-0.012554649263620377,
-0.012537416070699692,
-0.027748379856348038,
0.046773456037044525,
-0.07410529255867004,
-0.07559143006801605,
-0.036067813634872437,
-0.08538073301315308,
-0.09321469068527222,
0.009517031721770763,
-0.05581473931670189,
-0.05965212732553482,
0.009021780453622341,
-0.0839695855975151,
-0.12698747217655182,
-0.05552194267511368,
-0.1384180337190628,
0.04178612679243088,
-0.1911008656024933,
0.012522873468697071,
-0.15192052721977234,
-0.17957045137882233,
-0.030368618667125702,
0.026839623227715492,
-0.08384345471858978,
0.1200726181268692,
-0.05692498758435249,
0.013194102793931961,
-0.02779512107372284,
-0.06853213906288147,
0.06495250016450882,
-0.06511766463518143,
0.1296224743127823,
-0.013166067190468311,
0.08722541481256485,
-0.1467270702123642,
0.018123725429177284,
-0.13010917603969574,
0.010767594911158085,
-0.11064302176237106,
0.13380488753318787,
-0.02892880327999592,
0.010556289926171303,
-0.09136311709880829,
-0.026576539501547813,
-0.10252249240875244,
0.09169794619083405,
0.03841624781489372,
0.11789549142122269,
-0.15048061311244965,
-0.0633893758058548,
0.2134770005941391,
-0.03226795792579651,
0.005568360444158316,
0.07178492099046707,
-0.06369607150554657,
0.19234015047550201,
0.0825638622045517,
0.2162402719259262,
-0.014493521302938461,
-0.03764396905899048,
0.039713479578495026,
0.022452272474765778,
-0.03464663028717041,
-0.0444442555308342,
0.07535054534673691,
-0.08182613551616669,
0.07050126045942307,
0.03563214838504791,
-0.021105540916323662,
0.04656103253364563,
-0.01877332478761673,
-0.03700513765215874,
0.10619934648275375,
0.022586893290281296,
-0.00465982872992754,
-0.012073226273059845,
0.05812995135784149,
-0.03449320048093796,
0.03799230977892876,
0.0406174436211586,
0.012706798501312733,
-0.042120859026908875,
0.02608971856534481,
-0.004073353484272957,
0.04396698623895645,
-0.0789690613746643,
0.07027152180671692,
-0.13524284958839417,
-0.05034786835312843,
0.014189841225743294,
0.0007429022807627916,
0.09491543471813202,
0.09642480313777924,
0.025455638766288757,
0.053326237946748734,
-0.022062595933675766,
0.01276587974280119,
0.021606342867016792,
-0.08248244225978851,
-0.02851817011833191,
-0.08774019032716751,
0.042993392795324326,
-0.04913874343037605,
-0.052664875984191895,
-0.03296241909265518,
0.0019279210828244686,
-0.06928039342164993,
-0.0004757217247970402,
-0.00804387591779232,
0.019966520369052887,
0.002347987610846758,
0.09312719851732254,
-0.06545712798833847,
-0.04874854534864426,
0.06718053668737411,
-0.0032093157060444355,
-0.04262465983629227,
0.11150532215833664,
-0.10541027784347534,
0.1163458526134491,
0.10301048308610916,
-0.015594793483614922,
-0.023624824360013008,
-0.03351215645670891,
-0.016631735488772392,
0.003813090268522501,
-0.01999754086136818,
-0.07262004166841507,
0.2368285357952118,
-0.023392848670482635,
0.13252726197242737,
-0.1585380882024765,
0.01206271629780531,
0.026281481608748436,
-0.0758604109287262,
-0.04014451429247856,
0.10548191517591476,
-0.08183997869491577,
-0.10331344604492188,
0.09171359241008759,
0.023007815703749657,
-0.03971170261502266,
0.2024378627538681,
0.061188168823719025,
-0.08423835784196854,
-0.007679992355406284,
0.05843617394566536,
0.06677544116973877,
0.09327160567045212,
-0.12919574975967407,
-0.009723043069243431,
0.0356401652097702,
0.08760859817266464,
0.10615666955709457,
-0.07733908295631409,
-0.04594062641263008,
-0.030630571767687798,
-0.09681521356105804,
-0.018927861005067825,
0.05843055248260498,
-0.029513487592339516,
0.12155145406723022,
0.05903351306915283,
-0.02415175922214985,
0.019093181937932968,
-0.027622206136584282,
0.0013725419994443655,
0.19470396637916565,
-0.18074344098567963,
-0.337653249502182,
-0.19475054740905762,
-0.014514374546706676,
-0.017502734437584877,
0.14483578503131866,
0.07191150635480881,
-0.11762207001447678,
-0.018159933388233185,
0.031434375792741776,
0.14231714606285095,
-0.021081795915961266,
-0.06503815948963165,
-0.14796432852745056,
0.12000846117734909,
-0.07296021282672882,
-0.18042129278182983,
0.019249431788921356,
-0.017185593023896217,
-0.11556745320558548,
0.04569234699010849,
-0.06836504489183426,
0.1380574256181717,
0.04309682920575142,
0.012525735422968864,
-0.06447836011648178,
-0.03290839493274689,
0.03242607042193413,
-0.07984011620283127,
-0.1047905907034874,
0.12604787945747375,
0.02508493699133396,
-0.029480427503585815,
0.04505191370844841,
-0.025151558220386505,
-0.12712693214416504,
0.025917144492268562,
0.003666314762085676,
-0.11316370218992233,
-0.26387226581573486,
-0.10921434313058853,
-0.051858510822057724,
-0.009328004904091358,
0.021846190094947815,
0.03578772768378258,
0.04141436144709587,
0.014440316706895828,
0.006785528268665075,
0.07215029001235962,
0.030194135382771492,
0.06466425210237503,
0.14498014748096466,
-0.05759408324956894,
0.059375789016485214,
-0.06673677265644073,
-0.11142295598983765,
0.10214941203594208,
0.18408317863941193,
0.24291417002677917,
-0.03973562642931938,
0.12682747840881348,
0.06057681888341904,
0.029100079089403152,
-0.0053567783907055855,
0.08708255738019943,
0.017839310690760612,
0.038752321153879166,
-0.125475212931633,
-0.03671819716691971,
-0.15231440961360931,
-0.0008743889047764242,
0.001432314282283187,
-0.056547265499830246,
-0.06877439469099045,
-0.13999150693416595,
0.07626130431890488,
0.03536481410264969,
-0.018156783655285835,
-0.18907421827316284,
-0.07321535050868988,
0.06742532551288605,
-0.0171602051705122,
-0.1099110022187233,
0.021251419559121132,
0.14058545231819153,
-0.12396196275949478,
-0.054242782294750214,
0.03455226495862007,
0.07668676227331161,
-0.16973938047885895,
0.033598579466342926,
-0.13429351150989532,
-0.019461974501609802,
0.020015385001897812,
0.07882556319236755,
-0.20530012249946594,
0.20826119184494019,
0.033525098115205765,
0.007856986485421658,
-0.0676923617720604,
-0.016313619911670685,
-0.04910833761096001,
-0.14059576392173767,
0.15168613195419312,
0.016326000913977623,
0.08575516939163208,
-0.0520421639084816,
-0.004929143004119396,
0.06299227476119995,
0.05667923390865326,
-0.03024153970181942,
0.052223749458789825,
0.009936430491507053,
0.04098697006702423,
-0.037424616515636444,
0.018555201590061188,
0.010685956105589867,
-0.13217215240001678,
-0.021932760253548622,
-0.09131838381290436,
0.025798648595809937,
-0.026604972779750824,
0.003613967215642333,
-0.007232221774756908,
0.19724710285663605,
-0.11851004511117935,
-0.04897589609026909,
-0.04150236397981644,
0.02196030505001545,
0.14102990925312042,
-0.02479531802237034,
-0.015875568613409996,
-0.0017967100720852613,
-0.06939693540334702,
-0.03833813592791557,
-0.09329156577587128,
0.11605548113584518,
-0.06901385635137558,
0.00002354097341594752,
-0.11822795122861862,
0.039320603013038635,
0.04222654178738594,
0.029992230236530304,
0.023831550031900406,
-0.0359184704720974,
0.05103611573576927,
-0.05674075707793236,
-0.04065345600247383,
-0.011321292258799076,
0.10655750334262848,
0.1522347778081894,
-0.2213713824748993,
-0.07399142533540726,
-0.01926952600479126,
-0.03822218254208565,
0.1569368839263916,
0.2014949470758438,
0.013104052282869816,
0.1322328746318817,
0.28838643431663513,
-0.16938160359859467,
-0.25257962942123413,
-0.020917031913995743,
0.0026479102671146393,
0.05288933590054512,
-0.057404883205890656,
-0.25702184438705444,
-0.04914234206080437,
0.23682785034179688,
0.0155902449041605,
0.07382302731275558,
-0.38501957058906555,
-0.06477197259664536,
0.07477628439664841,
0.030773118138313293,
0.3724879324436188,
-0.14710423350334167,
-0.08226339519023895,
-0.10690214484930038,
0.022820137441158295,
0.11484729498624802,
-0.09530225396156311,
0.14511246979236603,
0.0062344991602003574,
-0.05563594400882721,
0.017062274739146233,
-0.014301375485956669,
0.2118767350912094,
0.04861642047762871,
0.09374767541885376,
-0.03516535833477974,
0.08383101969957352,
0.3015521764755249,
0.03435409069061279,
0.044390805065631866,
0.11035884916782379,
-0.0071073174476623535,
-0.11992679536342621,
-0.035312313586473465,
-0.05286693945527077,
0.04050308093428612,
0.0009416301036253572,
-0.08721061795949936,
-0.00479157967492938,
0.06916841119527817,
0.015940802171826363,
0.007663366384804249,
0.12164583802223206,
-0.056182779371738434,
0.1151655912399292,
0.06512156873941422,
0.06809987872838974,
0.03223313018679619,
0.08522694557905197,
0.03742632269859314,
0.007053263485431671,
0.06260785460472107,
-0.036018405109643936,
0.0035031051374971867,
0.11028923094272614,
-0.05599348619580269,
0.06018531695008278,
0.05689024180173874,
-0.08277519047260284,
0.0846836119890213,
0.15751837193965912,
-0.07350583374500275,
0.035778265446424484,
0.07989644259214401,
-0.12407071143388748,
0.061680957674980164,
0.03586326912045479,
0.1808585375547409,
0.07069054245948792,
-0.031967055052518845,
0.004339807201176882,
-0.020196232944726944,
-0.05562750995159149,
0.17073512077331543,
-0.032995134592056274,
-0.012974671088159084,
-0.09403035044670105,
0.16011883318424225,
0.0897224172949791,
-0.15097838640213013,
-0.0176664050668478,
0.09199133515357971,
-0.0655355229973793,
-0.04587528854608536,
-0.12391666322946548,
0.037521447986364365,
-0.2091338336467743,
-0.02638613060116768,
-0.10631153732538223,
-0.021469730883836746,
-0.0048487079329788685,
0.07104380428791046,
0.025257108733057976,
0.05447358638048172,
-0.0444568507373333,
0.04018917679786682,
0.006543939001858234,
-0.026712438091635704,
-0.010559052228927612,
0.006906378548592329,
0.02497478947043419,
0.10308757424354553,
-0.007178343366831541,
-0.012875078245997429,
-0.04925786331295967,
-0.06039221212267876,
-0.15813890099525452,
0.04941130429506302,
-0.1406177431344986,
-0.01944933831691742,
-0.08465670049190521,
-0.03730684146285057,
-0.047538306564092636,
0.05119963362812996,
0.0025285077281296253,
-0.07602684199810028,
-0.03104264661669731,
-0.010916526429355145,
-0.04630671814084053,
0.0695960521697998,
-0.041763223707675934,
-0.06000956520438194,
-0.01015117671340704,
-0.03469721972942352,
0.08919662237167358,
0.06371499598026276,
0.008652607910335064,
0.09716490656137466,
-0.11913983523845673,
0.023778390139341354,
0.14934082329273224,
0.04734037071466446,
0.024850623682141304,
0.01928931288421154,
0.03003094159066677,
0.050274454057216644,
0.021488958969712257,
0.04772055894136429,
0.045020196586847305,
-0.12892690300941467,
0.010618356056511402,
-0.028838081285357475,
-0.07296882569789886,
-0.037115149199962616,
0.022814810276031494,
0.12538224458694458,
0.09850804507732391,
0.045486219227313995,
-0.07346338778734207,
-0.02620590664446354,
-0.018090009689331055,
-0.0175449438393116,
-0.03734144940972328,
-0.07265834510326385,
-0.019169669598340988,
-0.004523683339357376,
0.005433073733001947,
0.0380428321659565,
0.28541192412376404,
0.03484450280666351,
0.05628452077507973,
-0.03172244131565094,
-0.033639177680015564,
0.005289619322866201,
-0.0009222709923051298,
0.115540511906147,
0.17159585654735565,
-0.03917902335524559,
-0.15160198509693146,
0.10713951289653778,
0.033446457237005234,
-0.012477540411055088,
0.03481747582554817,
0.1440240889787674,
0.2701897621154785,
0.11264505237340927,
0.02773655392229557,
-0.10902730375528336,
0.02868496999144554,
0.11172225326299667,
0.11230884492397308,
0.01623312011361122,
0.026441724970936775,
-0.06356500834226608,
0.1759931892156601,
-0.14124450087547302,
0.1040227860212326,
0.06060925871133804,
-0.006822200957685709,
-0.07591956108808517,
-0.09340982139110565,
0.03005427122116089,
-0.008967030793428421,
-0.03598463535308838,
-0.11648087948560715,
0.0029707036446779966,
-0.010911311954259872,
0.005647859536111355,
-0.019228292629122734,
0.019292905926704407,
-0.07362616807222366,
-0.19491034746170044,
-0.013290893286466599,
0.02444314770400524,
0.13875743746757507,
0.0029038535431027412,
-0.022872524335980415,
0.0013664707075804472,
0.09723350405693054,
0.0008965859306044877,
0.08932308852672577,
0.09576821327209473,
0.04033904895186424,
-0.12009666115045547,
-0.016353221610188484,
-0.046889159828424454,
0.11026900261640549,
-0.0610731840133667,
0.22953219711780548,
0.07500988990068436,
-0.1187298446893692,
0.05661835893988609,
0.22223010659217834,
-0.0035173387732356787,
-0.008740088902413845,
-0.036371439695358276,
0.21619008481502533,
0.10601827502250671,
0.11935911327600479,
-0.008483304642140865,
-0.055990032851696014,
-0.04775194823741913,
0.1471635103225708,
0.33599939942359924,
-0.0962725281715393,
-0.044088490307331085,
0.09049075841903687,
-0.0023251441307365894,
0.07504913955926895,
0.12503200769424438,
0.07578481733798981,
0.43325385451316833,
-0.048865437507629395,
-0.04515475779771805,
-0.01115079689770937,
0.08893096446990967,
-0.04598744213581085,
0.07904458791017532,
-0.08868851512670517,
-0.03860785812139511,
0.007649407256394625,
0.06947584450244904,
-0.21084168553352356,
-0.28664156794548035,
0.06797746568918228,
-0.1107838824391365,
-0.10489623248577118,
-0.037209510803222656,
-0.11169377714395523,
0.016314471140503883,
0.09913980215787888,
0.026591036468744278,
-0.003456793026998639,
-0.002718148287385702,
0.019345570355653763,
-0.1257120817899704,
-0.14311309158802032,
0.018839793279767036,
0.14447221159934998,
0.08544960618019104,
-0.03881075233221054,
0.03103139065206051,
0.05353008583188057,
0.07652252912521362,
-0.06536504626274109,
0.0628238394856453,
0.002903153421357274,
0.05426407232880592,
0.043457865715026855,
-0.0565444752573967,
-0.018265895545482635,
0.0682116150856018,
0.07091841101646423,
-0.061382729560136795,
0.0761985033750534,
0.08437929302453995,
-0.015049099922180176,
-0.09015614539384842,
0.01930275559425354,
-0.09532050788402557,
0.053152896463871,
0.15087567269802094,
0.059863001108169556,
0.04483754560351372,
-0.05326174199581146,
0.006650166120380163,
0.07133586704730988,
0.0031875185668468475,
-0.04225756227970123,
-0.14363442361354828,
-0.028175804764032364,
-0.06200394406914711,
-0.04215789586305618,
-0.27702608704566956,
0.018194889649748802,
0.022605933248996735,
-0.007624723017215729,
-0.03466073051095009,
0.047788165509700775,
-0.06248258799314499,
0.03692711889743805,
0.006938077509403229,
-0.08615981787443161,
0.01105883065611124,
0.03789618983864784,
-0.09561031311750412,
-0.10288017243146896
] |
null | null | transformers |
# BERTa: RoBERTa-based Catalan language model
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citing information](#citing-information)
- [Disclaimer](#disclaimer)
</details>
## Model description
BERTa is a transformer-based masked language model for the Catalan language.
It is based on the [RoBERTA](https://github.com/pytorch/fairseq/tree/master/examples/roberta) base model
and has been trained on a medium-size corpus collected from publicly available corpora and crawlers.
This model was originally published as [bsc/roberta-base-ca-cased](https://huggingface.co/bsc/roberta-base-ca-cased).
## Intended uses and limitations
The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section).
However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification or Named Entity Recognition.
## How to use
### Load model and tokenizer
``` python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/roberta-base-ca-cased")
model = AutoModelForMaskedLM.from_pretrained("PlanTL-GOB-ES/roberta-base-ca-cased")
```
### Fill Mask task
Below is an example of how to use the masked language modelling task with a pipeline.
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/roberta-base-ca-cased')
>>> unmasker("Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana formada "
"entre els deltes de les desembocadures dels rius Llobregat, al sud-oest, "
"i Besòs, al nord-est, i limitada pel sud-est per la línia de costa,"
"i pel nord-oest per la serralada de Collserola "
"(amb el cim del Tibidabo, 516,2 m, com a punt més alt) que segueix paral·lela "
"la línia de costa encaixant la ciutat en un perímetre molt definit.")
[
{
"sequence": " Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana formada "
"entre els deltes de les desembocadures dels rius Llobregat, al sud-oest, "
"i Besòs, al nord-est, i limitada pel sud-est per la línia de costa,"
"i pel nord-oest per la serralada de Collserola "
"(amb el cim del Tibidabo, 516,2 m, com a punt més alt) que segueix paral·lela "
"la línia de costa encaixant la ciutat en un perímetre molt definit.",
"score": 0.4177263379096985,
"token": 734,
"token_str": " Barcelona"
},
{
"sequence": " Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana formada "
"entre els deltes de les desembocadures dels rius Llobregat, al sud-oest, "
"i Besòs, al nord-est, i limitada pel sud-est per la línia de costa,"
"i pel nord-oest per la serralada de Collserola "
"(amb el cim del Tibidabo, 516,2 m, com a punt més alt) que segueix paral·lela "
"la línia de costa encaixant la ciutat en un perímetre molt definit.",
"score": 0.10696165263652802,
"token": 3849,
"token_str": " Badalona"
},
{
"sequence": " Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana formada "
"entre els deltes de les desembocadures dels rius Llobregat, al sud-oest, "
"i Besòs, al nord-est, i limitada pel sud-est per la línia de costa,"
"i pel nord-oest per la serralada de Collserola "
"(amb el cim del Tibidabo, 516,2 m, com a punt més alt) que segueix paral·lela "
"la línia de costa encaixant la ciutat en un perímetre molt definit.",
"score": 0.08135009557008743,
"token": 19349,
"token_str": " Collserola"
},
{
"sequence": " Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana formada "
"entre els deltes de les desembocadures dels rius Llobregat, al sud-oest, "
"i Besòs, al nord-est, i limitada pel sud-est per la línia de costa,"
"i pel nord-oest per la serralada de Collserola "
"(amb el cim del Tibidabo, 516,2 m, com a punt més alt) que segueix paral·lela "
"la línia de costa encaixant la ciutat en un perímetre molt definit.",
"score": 0.07330769300460815,
"token": 4974,
"token_str": " Terrassa"
},
{
"sequence": " Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana formada "
"entre els deltes de les desembocadures dels rius Llobregat, al sud-oest, "
"i Besòs, al nord-est, i limitada pel sud-est per la línia de costa,"
"i pel nord-oest per la serralada de Collserola "
"(amb el cim del Tibidabo, 516,2 m, com a punt més alt) que segueix paral·lela "
"la línia de costa encaixant la ciutat en un perímetre molt definit.",
"score": 0.03317456692457199,
"token": 14333,
"token_str": " Gavà"
}
]
```
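For reference, below is a minimal sketch of what the `fill-mask` pipeline does under the hood, assuming a PyTorch backend; the shortened example sentence and the top-5 cut-off are illustrative choices, not part of the original card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/roberta-base-ca-cased")
model = AutoModelForMaskedLM.from_pretrained("PlanTL-GOB-ES/roberta-base-ca-cased")
model.eval()

text = "Situada a la costa de la mar Mediterrània, <mask> s'assenta en una plana."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the position of the <mask> token and rank the vocabulary at that position.
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
probs = logits[0, mask_positions].softmax(dim=-1)
top = torch.topk(probs, k=5, dim=-1)

for score, token_id in zip(top.values[0], top.indices[0]):
    print(f"{tokenizer.decode([int(token_id)]):>12} {score.item():.4f}")
```

The pipeline shown above additionally takes care of post-processing, such as assembling the filled sequences and output dictionaries.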
## Limitations and bias
## Training
### Training corpora and preprocessing
The training corpus consists of several corpora gathered from web crawling and public corpora.
The publicly available corpora are:
1. the Catalan part of the [DOGC](http://opus.nlpl.eu/DOGC-v2.php) corpus, a set of documents from the Official Gazette of the Catalan Government
2. the [Catalan Open Subtitles](http://opus.nlpl.eu/download.php?f=OpenSubtitles/v2018/mono/OpenSubtitles.raw.ca.gz), a collection of translated movie subtitles
3. the non-shuffled version of the Catalan part of the [OSCAR](https://traces1.inria.fr/oscar/) corpus (Suárez et al., 2019),
a collection of monolingual corpora, filtered from [Common Crawl](https://commoncrawl.org/about/)
4. the non-deduplicated version of the [CaWac](http://nlp.ffzg.hr/resources/corpora/cawac/) corpus, a web corpus of Catalan built from the .cat top-level domain in late 2013
5. the [Catalan Wikipedia articles](https://ftp.acc.umu.se/mirror/wikimedia.org/dumps/cawiki/20200801/) downloaded on 18-08-2020.
The crawled corpora are:
6. The Catalan General Crawling, obtained by crawling the 500 most popular .cat and .ad domains
7. the Catalan Government Crawling, obtained by crawling the .gencat domain and subdomains, belonging to the Catalan Government
8. the ACN corpus with 220k news items from March 2015 until October 2020, crawled from the [Catalan News Agency](https://www.acn.cat/)
To obtain a high-quality training corpus, each corpus was preprocessed with a pipeline of operations including, among others,
sentence splitting, language detection, filtering of badly-formed sentences and deduplication of repetitive content.
Document boundaries were kept throughout the process.
Finally, the corpora were concatenated and a further global deduplication across corpora was applied.
The final training corpus consists of about 1.8B tokens.
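The cleaning pipeline itself is not reproduced in this card. As a rough, illustrative sketch only, the snippet below shows the general shape of two of the steps described above (language filtering and exact-duplicate removal at document level); `detect_language` is a hypothetical placeholder for a real language identifier, not an actual library call.

```python
import hashlib

def detect_language(text: str) -> str:
    # Hypothetical placeholder: in practice a proper language identifier
    # (e.g. a fastText or langid model) would be used here.
    return "ca" if " els " in f" {text} " or " amb " in f" {text} " else "unknown"

def clean_corpus(documents):
    """Keep Catalan documents and drop exact duplicates, preserving document boundaries."""
    seen_hashes = set()
    for doc in documents:
        doc = doc.strip()
        if not doc or detect_language(doc) != "ca":
            continue  # language filtering
        digest = hashlib.sha1(doc.encode("utf-8")).hexdigest()
        if digest in seen_hashes:
            continue  # deduplication of repeated content
        seen_hashes.add(digest)
        yield doc

docs = [
    "Barcelona s'assenta en una plana amb els rius Llobregat i Besòs.",
    "Barcelona s'assenta en una plana amb els rius Llobregat i Besòs.",
    "This sentence is not Catalan.",
]
print(list(clean_corpus(docs)))  # only the first document survives
```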
### Tokenization and pretraining
The training corpus has been tokenized using a byte-level version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2),
as used in the original [RoBERTa](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model, with a vocabulary size of 52,000 tokens.
The BERTa pretraining consists of a masked language model training that follows the approach employed for the RoBERTa base model
with the same hyperparameters as in the original work.
The training lasted a total of 48 hours on 16 NVIDIA V100 GPUs with 16GB of RAM each.
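As an illustration of the tokenization step, a byte-level BPE tokenizer with the same vocabulary size could be trained with the HuggingFace `tokenizers` library along the following lines; the corpus file paths and the list of special tokens are assumptions made for the example, not the exact configuration used for BERTa.

```python
from tokenizers import ByteLevelBPETokenizer

# Plain-text files of the cleaned training corpus; these paths are illustrative.
corpus_files = ["catalan_corpus_part1.txt", "catalan_corpus_part2.txt"]

tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=corpus_files,
    vocab_size=52000,  # matches the vocabulary size reported above
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)

# Saves vocab.json and merges.txt, which a RoBERTa-style tokenizer can load.
tokenizer.save_model("berta-tokenizer")
```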
## Evaluation
### CLUB benchmark
The BERTa model has been fine-tuned on the downstream tasks of the Catalan Language Understanding Evaluation benchmark (CLUB),
that has been created along with the model.
It contains the following tasks and their related datasets:
1. Part-of-Speech Tagging (POS)
Catalan-Ancora: from the [Universal Dependencies treebank](https://github.com/UniversalDependencies/UD_Catalan-AnCora) of the well-known Ancora corpus
2. Named Entity Recognition (NER)
**[AnCora Catalan 2.0.0](https://zenodo.org/record/4762031#.YKaFjqGxWUk)**: extracted named entities from the original [Ancora](https://doi.org/10.5281/zenodo.4762030) version,
filtering out some unconventional ones, like book titles, and transcribed them into a standard CONLL-IOB format
3. Text Classification (TC)
**[TeCla](https://doi.org/10.5281/zenodo.4627197)**: consisting of 137k news pieces from the Catalan News Agency ([ACN](https://www.acn.cat/)) corpus
4. Semantic Textual Similarity (STS)
**[Catalan semantic textual similarity](https://doi.org/10.5281/zenodo.4529183)**: consisting of more than 3000 sentence pairs, annotated with the semantic similarity between them,
scraped from the [Catalan Textual Corpus](https://doi.org/10.5281/zenodo.4519349)
5. Question Answering (QA):
**[ViquiQuAD](https://doi.org/10.5281/zenodo.4562344)**: consisting of more than 15,000 questions sourced from Catalan Wikipedia, randomly chosen from a set of 596 articles that were originally written in Catalan.
**[XQuAD](https://doi.org/10.5281/zenodo.4526223)**: the Catalan translation of XQuAD, a multilingual collection of manual translations of 1,190 question-answer pairs from English Wikipedia used only as a _test set_
Here are the train/dev/test splits of the datasets:
| Task (Dataset) | Total | Train | Dev | Test |
|:--|:--|:--|:--|:--|
| NER (Ancora) |13,581 | 10,628 | 1,427 | 1,526 |
| POS (Ancora)| 16,678 | 13,123 | 1,709 | 1,846 |
| STS | 3,073 | 2,073 | 500 | 500 |
| TC (TeCla) | 137,775 | 110,203 | 13,786 | 13,786|
| QA (ViquiQuAD) | 14,239 | 11,255 | 1,492 | 1,429 |
_The fine-tuning on downstream tasks has been performed with the HuggingFace [**Transformers**](https://github.com/huggingface/transformers) library._
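As a sketch of what such a fine-tuning setup can look like with the Transformers `Trainer` API (here for a text-classification task), see below; the CSV file names, the number of labels and the hyperparameters are illustrative placeholders rather than the exact configuration used for the CLUB experiments.

```python
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import load_dataset

model_name = "PlanTL-GOB-ES/roberta-base-ca-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels is illustrative and should match the classification dataset used.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=19)

# Any dataset with a "text" column and an integer "label" column will do;
# these file names are placeholders.
dataset = load_dataset("csv", data_files={"train": "tc_train.csv", "validation": "tc_dev.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="berta-tc",
    num_train_epochs=5,
    per_device_train_batch_size=16,
    learning_rate=3e-5,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```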
### Results
Below are the evaluation results on the CLUB tasks, compared with the multilingual mBERT and XLM-RoBERTa models and
the Catalan WikiBERT-ca model:
| Task | NER (F1) | POS (F1) | STS (Pearson) | TC (accuracy) | QA (ViquiQuAD) (F1/EM) | QA (XQuAD) (F1/EM) |
| ------------|:-------------:| -----:|:------|:-------|:------|:----|
| BERTa | **88.13** | **98.97** | **79.73** | **74.16** | **86.97/72.29** | **68.89/48.87** |
| mBERT | 86.38 | 98.82 | 76.34 | 70.56 | 86.97/72.22 | 67.15/46.51 |
| XLM-RoBERTa | 87.66 | 98.89 | 75.40 | 71.68 | 85.50/70.47 | 67.10/46.42 |
| WikiBERT-ca | 77.66 | 97.60 | 77.18 | 73.22 | 85.45/70.75 | 65.21/36.60 |
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our latest paper:
```bibtex
@inproceedings{armengol-estape-etal-2021-multilingual,
title = "Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? {A} Comprehensive Assessment for {C}atalan",
author = "Armengol-Estap{\'e}, Jordi and
Carrino, Casimiro Pio and
Rodriguez-Penagos, Carlos and
de Gibert Bonet, Ona and
Armentano-Oller, Carme and
Gonzalez-Agirre, Aitor and
Melero, Maite and
Villegas, Marta",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.437",
doi = "10.18653/v1/2021.findings-acl.437",
pages = "4933--4946",
}
```
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. | {"language": "ca", "license": "apache-2.0", "tags": ["masked-lm", "BERTa", "catalan"], "widget": [{"text": "El Catal\u00e0 \u00e9s una llengua molt <mask>."}, {"text": "Salvador Dal\u00ed va viure a <mask>."}, {"text": "La Costa Brava t\u00e9 les millors <mask> d'Espanya."}, {"text": "El cacaolat \u00e9s un batut de <mask>."}, {"text": "<mask> \u00e9s la capital de la Garrotxa."}, {"text": "Vaig al <mask> a buscar bolets."}, {"text": "Antoni Gaud\u00ed vas ser un <mask> molt important per la ciutat."}, {"text": "Catalunya \u00e9s una refer\u00e8ncia en <mask> a nivell europeu."}]} | fill-mask | PlanTL-GOB-ES/roberta-base-ca | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"BERTa",
"catalan",
"ca",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"ca"
] | TAGS
#transformers #pytorch #roberta #fill-mask #masked-lm #BERTa #catalan #ca #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| BERTa: RoBERTa-based Catalan language model
===========================================
Table of contents
-----------------
Click to expand
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
* Evaluation
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citing information
+ Disclaimer
Model description
-----------------
BERTa is a transformer-based masked language model for the Catalan language.
It is based on the RoBERTA base model
and has been trained on a medium-size corpus collected from publicly available corpora and crawlers.
This model was originally published as bsc/roberta-base-ca-cased.
Intended uses and limitations
-----------------------------
The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section).
However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification or Named Entity Recognition.
How to use
----------
### Load model and tokenizer
### Fill Mask task
Below is an example of how to use the masked language modelling task with a pipeline.
Limitations and bias
--------------------
Training
--------
### Training corpora and preprocessing
The training corpus consists of several corpora gathered from web crawling and public corpora.
The publicly available corpora are:
1. the Catalan part of the DOGC corpus, a set of documents from the Official Gazette of the Catalan Government
2. the Catalan Open Subtitles, a collection of translated movie subtitles
3. the non-shuffled version of the Catalan part of the OSCAR corpus (Suárez et al., 2019),
a collection of monolingual corpora, filtered from Common Crawl
4. the non-deduplicated version of the CaWac corpus, a web corpus of Catalan built from the .cat top-level domain in late 2013
5. the Catalan Wikipedia articles downloaded on 18-08-2020.
The crawled corpora are:
6. The Catalan General Crawling, obtained by crawling the 500 most popular .cat and .ad domains
7. the Catalan Government Crawling, obtained by crawling the .gencat domain and subdomains, belonging to the Catalan Government
8. the ACN corpus with 220k news items from March 2015 until October 2020, crawled from the Catalan News Agency
To obtain a high-quality training corpus, each corpus was preprocessed with a pipeline of operations including, among others,
sentence splitting, language detection, filtering of badly-formed sentences and deduplication of repetitive content.
Document boundaries were kept throughout the process.
Finally, the corpora were concatenated and a further global deduplication across corpora was applied.
The final training corpus consists of about 1.8B tokens.
### Tokenization and pretraining
The training corpus has been tokenized using a byte-level version of Byte-Pair Encoding (BPE),
as used in the original RoBERTa model, with a vocabulary size of 52,000 tokens.
The BERTa pretraining consists of a masked language model training that follows the approach employed for the RoBERTa base model
with the same hyperparameters as in the original work.
The training lasted a total of 48 hours on 16 NVIDIA V100 GPUs with 16GB of RAM each.
Evaluation
----------
### CLUB benchmark
The BERTa model has been fine-tuned on the downstream tasks of the Catalan Language Understanding Evaluation benchmark (CLUB),
that has been created along with the model.
It contains the following tasks and their related datasets:
1. Part-of-Speech Tagging (POS)
Catalan-Ancora: from the Universal Dependencies treebank of the well-known Ancora corpus
2. Named Entity Recognition (NER)
AnCora Catalan 2.0.0: extracted named entities from the original Ancora version,
filtering out some unconventional ones, like book titles, and transcribed them into a standard CONLL-IOB format
3. Text Classification (TC)
TeCla: consisting of 137k news pieces from the Catalan News Agency (ACN) corpus
4. Semantic Textual Similarity (STS)
Catalan semantic textual similarity: consisting of more than 3000 sentence pairs, annotated with the semantic similarity between them,
scraped from the Catalan Textual Corpus
5. Question Answering (QA):
ViquiQuAD: consisting of more than 15,000 questions sourced from Catalan Wikipedia, randomly chosen from a set of 596 articles that were originally written in Catalan.
XQuAD: the Catalan translation of XQuAD, a multilingual collection of manual translations of 1,190 question-answer pairs from English Wikipedia used only as a *test set*
Here are the train/dev/test splits of the datasets:
*The fine-tuning on downstream tasks has been performed with the HuggingFace Transformers library.*
### Results
Below are the evaluation results on the CLUB tasks, compared with the multilingual mBERT and XLM-RoBERTa models and
the Catalan WikiBERT-ca model:
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our latest paper:
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
| [
"### Load model and tokenizer",
"### Fill Mask task\n\n\nBelow, an example of how to use the masked language modelling task with a pipeline.\n\n\nLimitations and bias\n--------------------\n\n\nTraining\n--------",
"### Training corpora and preprocessing\n\n\nThe training corpus consists of several corpora gathered from web crawling and public corpora.\n\n\nThe publicly available corpora are:\n\n\n1. the Catalan part of the DOGC corpus, a set of documents from the Official Gazette of the Catalan Government\n2. the Catalan Open Subtitles, a collection of translated movie subtitles\n3. the non-shuffled version of the Catalan part of the OSCAR corpus \\\\cite{suarez2019asynchronous},\na collection of monolingual corpora, filtered from Common Crawl\n4. The CaWac corpus, a web corpus of Catalan built from the .cat top-level-domain in late 2013\nthe non-deduplicated version\n5. the Catalan Wikipedia articles downloaded on 18-08-2020.\n\n\nThe crawled corpora are:\n\n\n6. The Catalan General Crawling, obtained by crawling the 500 most popular .cat and .ad domains\n7. the Catalan Government Crawling, obtained by crawling the .gencat domain and subdomains, belonging to the Catalan Government\n8. the ACN corpus with 220k news items from March 2015 until October 2020, crawled from the Catalan News Agency\n\n\nTo obtain a high-quality training corpus, each corpus have preprocessed with a pipeline of operations, including among the others,\nsentence splitting, language detection, filtering of bad-formed sentences and deduplication of repetitive contents.\nDuring the process, we keep document boundaries are kept.\nFinally, the corpora are concatenated and further global deduplication among the corpora is applied.\nThe final training corpus consists of about 1,8B tokens.",
"### Tokenization and pretraining\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE)\nused in the original RoBERTA model with a vocabulary size of 52,000 tokens.\n\n\nThe BERTa pretraining consists of a masked language model training that follows the approach employed for the RoBERTa base model\nwith the same hyperparameters as in the original work.\n\n\nThe training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM.\n\n\nEvaluation\n----------",
"### CLUB benchmark\n\n\nThe BERTa model has been fine-tuned on the downstream tasks of the Catalan Language Understanding Evaluation benchmark (CLUB),\nthat has been created along with the model.\n\n\nIt contains the following tasks and their related datasets:\n\n\n1. Part-of-Speech Tagging (POS)\n\n\nCatalan-Ancora: from the Universal Dependencies treebank of the well-known Ancora corpus\n2. Named Entity Recognition (NER)\n\n\nAnCora Catalan 2.0.0: extracted named entities from the original Ancora version,\nfiltering out some unconventional ones, like book titles, and transcribed them into a standard CONLL-IOB format\n3. Text Classification (TC)\n\n\nTeCla: consisting of 137k news pieces from the Catalan News Agency (ACN) corpus\n4. Semantic Textual Similarity (STS)\n\n\nCatalan semantic textual similarity: consisting of more than 3000 sentence pairs, annotated with the semantic similarity between them,\nscraped from the Catalan Textual Corpus\n5. Question Answering (QA):\n\n\nViquiQuAD: consisting of more than 15,000 questions outsourced from Catalan Wikipedia randomly chosen from a set of 596 articles that were originally written in Catalan.\n\n\nXQuAD: the Catalan translation of XQuAD, a multilingual collection of manual translations of 1,190 question-answer pairs from English Wikipedia used only as a *test set*\n\n\nHere are the train/dev/test splits of the datasets:\n\n\n\n*The fine-tuning on downstream tasks have been performed with the HuggingFace Transformers library*",
"### Results\n\n\nBelow the evaluation results on the CLUB tasks compared with the multilingual mBERT, XLM-RoBERTa models and\nthe Catalan WikiBERT-ca model\n\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.",
"### Citing information\n\n\nIf you use this model, please cite our latest paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
"TAGS\n#transformers #pytorch #roberta #fill-mask #masked-lm #BERTa #catalan #ca #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Load model and tokenizer",
"### Fill Mask task\n\n\nBelow, an example of how to use the masked language modelling task with a pipeline.\n\n\nLimitations and bias\n--------------------\n\n\nTraining\n--------",
"### Training corpora and preprocessing\n\n\nThe training corpus consists of several corpora gathered from web crawling and public corpora.\n\n\nThe publicly available corpora are:\n\n\n1. the Catalan part of the DOGC corpus, a set of documents from the Official Gazette of the Catalan Government\n2. the Catalan Open Subtitles, a collection of translated movie subtitles\n3. the non-shuffled version of the Catalan part of the OSCAR corpus \\\\cite{suarez2019asynchronous},\na collection of monolingual corpora, filtered from Common Crawl\n4. The CaWac corpus, a web corpus of Catalan built from the .cat top-level-domain in late 2013\nthe non-deduplicated version\n5. the Catalan Wikipedia articles downloaded on 18-08-2020.\n\n\nThe crawled corpora are:\n\n\n6. The Catalan General Crawling, obtained by crawling the 500 most popular .cat and .ad domains\n7. the Catalan Government Crawling, obtained by crawling the .gencat domain and subdomains, belonging to the Catalan Government\n8. the ACN corpus with 220k news items from March 2015 until October 2020, crawled from the Catalan News Agency\n\n\nTo obtain a high-quality training corpus, each corpus have preprocessed with a pipeline of operations, including among the others,\nsentence splitting, language detection, filtering of bad-formed sentences and deduplication of repetitive contents.\nDuring the process, we keep document boundaries are kept.\nFinally, the corpora are concatenated and further global deduplication among the corpora is applied.\nThe final training corpus consists of about 1,8B tokens.",
"### Tokenization and pretraining\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE)\nused in the original RoBERTA model with a vocabulary size of 52,000 tokens.\n\n\nThe BERTa pretraining consists of a masked language model training that follows the approach employed for the RoBERTa base model\nwith the same hyperparameters as in the original work.\n\n\nThe training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM.\n\n\nEvaluation\n----------",
"### CLUB benchmark\n\n\nThe BERTa model has been fine-tuned on the downstream tasks of the Catalan Language Understanding Evaluation benchmark (CLUB),\nthat has been created along with the model.\n\n\nIt contains the following tasks and their related datasets:\n\n\n1. Part-of-Speech Tagging (POS)\n\n\nCatalan-Ancora: from the Universal Dependencies treebank of the well-known Ancora corpus\n2. Named Entity Recognition (NER)\n\n\nAnCora Catalan 2.0.0: extracted named entities from the original Ancora version,\nfiltering out some unconventional ones, like book titles, and transcribed them into a standard CONLL-IOB format\n3. Text Classification (TC)\n\n\nTeCla: consisting of 137k news pieces from the Catalan News Agency (ACN) corpus\n4. Semantic Textual Similarity (STS)\n\n\nCatalan semantic textual similarity: consisting of more than 3000 sentence pairs, annotated with the semantic similarity between them,\nscraped from the Catalan Textual Corpus\n5. Question Answering (QA):\n\n\nViquiQuAD: consisting of more than 15,000 questions outsourced from Catalan Wikipedia randomly chosen from a set of 596 articles that were originally written in Catalan.\n\n\nXQuAD: the Catalan translation of XQuAD, a multilingual collection of manual translations of 1,190 question-answer pairs from English Wikipedia used only as a *test set*\n\n\nHere are the train/dev/test splits of the datasets:\n\n\n\n*The fine-tuning on downstream tasks have been performed with the HuggingFace Transformers library*",
"### Results\n\n\nBelow the evaluation results on the CLUB tasks compared with the multilingual mBERT, XLM-RoBERTa models and\nthe Catalan WikiBERT-ca model\n\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.",
"### Citing information\n\n\nIf you use this model, please cite our latest paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
58,
9,
36,
353,
120,
356,
45,
28,
37,
22,
12,
34,
17,
363
] | [
"passage: TAGS\n#transformers #pytorch #roberta #fill-mask #masked-lm #BERTa #catalan #ca #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Load model and tokenizer### Fill Mask task\n\n\nBelow, an example of how to use the masked language modelling task with a pipeline.\n\n\nLimitations and bias\n--------------------\n\n\nTraining\n--------### Training corpora and preprocessing\n\n\nThe training corpus consists of several corpora gathered from web crawling and public corpora.\n\n\nThe publicly available corpora are:\n\n\n1. the Catalan part of the DOGC corpus, a set of documents from the Official Gazette of the Catalan Government\n2. the Catalan Open Subtitles, a collection of translated movie subtitles\n3. the non-shuffled version of the Catalan part of the OSCAR corpus \\\\cite{suarez2019asynchronous},\na collection of monolingual corpora, filtered from Common Crawl\n4. The CaWac corpus, a web corpus of Catalan built from the .cat top-level-domain in late 2013\nthe non-deduplicated version\n5. the Catalan Wikipedia articles downloaded on 18-08-2020.\n\n\nThe crawled corpora are:\n\n\n6. The Catalan General Crawling, obtained by crawling the 500 most popular .cat and .ad domains\n7. the Catalan Government Crawling, obtained by crawling the .gencat domain and subdomains, belonging to the Catalan Government\n8. the ACN corpus with 220k news items from March 2015 until October 2020, crawled from the Catalan News Agency\n\n\nTo obtain a high-quality training corpus, each corpus have preprocessed with a pipeline of operations, including among the others,\nsentence splitting, language detection, filtering of bad-formed sentences and deduplication of repetitive contents.\nDuring the process, we keep document boundaries are kept.\nFinally, the corpora are concatenated and further global deduplication among the corpora is applied.\nThe final training corpus consists of about 1,8B tokens.",
"passage: ### Tokenization and pretraining\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE)\nused in the original RoBERTA model with a vocabulary size of 52,000 tokens.\n\n\nThe BERTa pretraining consists of a masked language model training that follows the approach employed for the RoBERTa base model\nwith the same hyperparameters as in the original work.\n\n\nThe training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM.\n\n\nEvaluation\n----------### CLUB benchmark\n\n\nThe BERTa model has been fine-tuned on the downstream tasks of the Catalan Language Understanding Evaluation benchmark (CLUB),\nthat has been created along with the model.\n\n\nIt contains the following tasks and their related datasets:\n\n\n1. Part-of-Speech Tagging (POS)\n\n\nCatalan-Ancora: from the Universal Dependencies treebank of the well-known Ancora corpus\n2. Named Entity Recognition (NER)\n\n\nAnCora Catalan 2.0.0: extracted named entities from the original Ancora version,\nfiltering out some unconventional ones, like book titles, and transcribed them into a standard CONLL-IOB format\n3. Text Classification (TC)\n\n\nTeCla: consisting of 137k news pieces from the Catalan News Agency (ACN) corpus\n4. Semantic Textual Similarity (STS)\n\n\nCatalan semantic textual similarity: consisting of more than 3000 sentence pairs, annotated with the semantic similarity between them,\nscraped from the Catalan Textual Corpus\n5. Question Answering (QA):\n\n\nViquiQuAD: consisting of more than 15,000 questions outsourced from Catalan Wikipedia randomly chosen from a set of 596 articles that were originally written in Catalan.\n\n\nXQuAD: the Catalan translation of XQuAD, a multilingual collection of manual translations of 1,190 question-answer pairs from English Wikipedia used only as a *test set*\n\n\nHere are the train/dev/test splits of the datasets:\n\n\n\n*The fine-tuning on downstream tasks have been performed with the HuggingFace Transformers library*### Results\n\n\nBelow the evaluation results on the CLUB tasks compared with the multilingual mBERT, XLM-RoBERTa models and\nthe Catalan WikiBERT-ca model\n\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0"
] | [
-0.031014464795589447,
0.09975560009479523,
-0.0069292522966861725,
0.041322626173496246,
0.032383985817432404,
-0.014784992672502995,
-0.018423866480588913,
0.04715178534388542,
-0.05983324721455574,
0.104066863656044,
-0.018548933789134026,
0.013641489669680595,
0.09446610510349274,
-0.09610432386398315,
0.030404269695281982,
-0.16539990901947021,
0.06214684247970581,
-0.09405415505170822,
0.013883557170629501,
0.049127690494060516,
0.061896540224552155,
-0.023857302963733673,
0.06362093985080719,
-0.02619175985455513,
0.003130745142698288,
0.0234384685754776,
-0.031071320176124573,
-0.07090722024440765,
0.07585527002811432,
0.0664837509393692,
0.09485449641942978,
0.017457764595746994,
0.03231355547904968,
-0.12900352478027344,
0.01522392313927412,
0.04840211570262909,
-0.03654836490750313,
0.034795068204402924,
0.15059420466423035,
-0.04338061064481735,
0.158281147480011,
-0.07354041934013367,
0.05465526878833771,
0.029012534767389297,
-0.1417410969734192,
-0.08168180286884308,
-0.10485692322254181,
-0.04285178333520889,
0.09151341021060944,
0.004764333367347717,
-0.01865643635392189,
-0.0017372667789459229,
-0.11686704307794571,
-0.006589286960661411,
0.05288030952215195,
-0.19508087635040283,
-0.061598073691129684,
-0.0005038809031248093,
0.1318449079990387,
0.14320680499076843,
-0.010582605376839638,
0.020314130932092667,
0.02962888404726982,
-0.0010405927896499634,
-0.053945161402225494,
-0.04365071654319763,
-0.06248066574335098,
-0.07502669095993042,
-0.051031626760959625,
-0.10155592858791351,
0.04330435395240784,
-0.03818286210298538,
-0.07284978032112122,
-0.08143250644207001,
-0.01624995656311512,
0.0522599071264267,
-0.0005783503875136375,
-0.029008064419031143,
0.021307066082954407,
0.017472783103585243,
0.0561474934220314,
-0.030974077060818672,
-0.07960863411426544,
-0.012881163507699966,
-0.05021476745605469,
0.05156824737787247,
0.03903888538479805,
0.0004113010363653302,
0.07705210894346237,
0.132042795419693,
0.07785746455192566,
-0.031039781868457794,
-0.030121073126792908,
0.0016628913581371307,
-0.12046489119529724,
0.006256564985960722,
-0.028678666800260544,
-0.013452230952680111,
0.004478637129068375,
0.16683819890022278,
-0.12576203048229218,
0.01342470571398735,
-0.06600349396467209,
0.031116150319576263,
0.005398163106292486,
0.09724096953868866,
-0.03422113507986069,
-0.008546154946088791,
-0.01982303336262703,
-0.06849443167448044,
0.02814999781548977,
-0.02126816287636757,
-0.06919637322425842,
0.002735668793320656,
0.00208671810105443,
0.09993337094783783,
0.05236706882715225,
-0.041374776512384415,
-0.05783369019627571,
-0.0478358194231987,
0.1681242436170578,
-0.0394180566072464,
0.004712420515716076,
0.03949578106403351,
0.005016188137233257,
-0.046490855515003204,
-0.027605140581727028,
0.0321987047791481,
-0.07296669483184814,
0.013049110770225525,
-0.03191443532705307,
-0.05026906728744507,
-0.00758749432861805,
-0.07638964056968689,
0.03414241969585419,
-0.1235373467206955,
-0.08923567831516266,
-0.09885448962450027,
-0.052807193249464035,
-0.09109444916248322,
0.031302645802497864,
-0.09385043382644653,
0.018291953951120377,
-0.048516519367694855,
0.02189234271645546,
-0.0359329916536808,
0.002709640422835946,
-0.05380280315876007,
-0.06291395425796509,
-0.002852322068065405,
-0.08858280628919601,
0.028662458062171936,
-0.0677175521850586,
0.009242028929293156,
-0.06948550045490265,
0.050689563155174255,
-0.1506727635860443,
0.12569019198417664,
-0.019007734954357147,
0.032868146896362305,
-0.0938943475484848,
-0.022799793630838394,
-0.05441867560148239,
0.03348081558942795,
-0.01094374805688858,
0.0630873516201973,
-0.1337805539369583,
-0.054793521761894226,
0.06120707839727402,
-0.06341905891895294,
0.061356253921985626,
0.14714190363883972,
-0.011651833541691303,
0.04314182326197624,
0.10978104174137115,
0.08037428557872772,
0.056341953575611115,
-0.027136407792568207,
-0.03741173446178436,
-0.07371090352535248,
0.029007233679294586,
0.10514532774686813,
0.09230285882949829,
-0.10556545108556747,
0.07772571593523026,
0.006177879404276609,
-0.07992278784513474,
-0.06634397804737091,
0.008291391655802727,
-0.04081752523779869,
0.044291071593761444,
0.018203532323241234,
-0.03715929388999939,
0.00460739666596055,
-0.004144251346588135,
-0.014540350995957851,
-0.07088236510753632,
-0.12238774448633194,
0.0009452097583562136,
-0.01540803350508213,
0.0832926481962204,
-0.0425301119685173,
0.09532494097948074,
-0.08746211230754852,
-0.0038582112174481153,
-0.14751148223876953,
-0.030299056321382523,
0.08194105327129364,
-0.13040615618228912,
0.10201440006494522,
0.010626014322042465,
-0.037138696759939194,
0.05898474156856537,
-0.025734353810548782,
0.030080819502472878,
0.006040386855602264,
-0.03623279929161072,
-0.03404746204614639,
-0.11681672930717468,
0.010504372417926788,
-0.03332275152206421,
0.0969565287232399,
-0.06001421809196472,
0.0015513855032622814,
0.08412809669971466,
0.08058200776576996,
0.03290506452322006,
-0.053062960505485535,
-0.028384506702423096,
0.10578913986682892,
-0.011282887309789658,
-0.022408228367567062,
0.04987006634473801,
0.0011735016014426947,
-0.022461792454123497,
0.09254255890846252,
-0.09647722542285919,
-0.09058746695518494,
0.0382029265165329,
0.049258582293987274,
-0.06631097197532654,
0.04303690418601036,
-0.02607503905892372,
0.00700178649276495,
-0.03539089486002922,
-0.05736036226153374,
0.27669304609298706,
0.0069457474164664745,
0.09093505144119263,
-0.14058682322502136,
-0.05272260680794716,
-0.003520177211612463,
0.020985227078199387,
-0.026092981919646263,
0.10688181221485138,
0.07148861885070801,
-0.047496113926172256,
0.033801645040512085,
0.013288717716932297,
0.07082908600568771,
0.1384182870388031,
0.05584218353033066,
-0.09441043436527252,
-0.034088604152202606,
0.04005932807922363,
0.058265894651412964,
0.030501589179039,
0.01005987636744976,
-0.00941777415573597,
-0.0044466350227594376,
0.03423905372619629,
0.056887075304985046,
-0.10257405042648315,
0.04504356533288956,
0.008338694460690022,
-0.05491923913359642,
-0.033945806324481964,
-0.04575931280851364,
0.01239550206810236,
0.09531956911087036,
0.09837007522583008,
-0.033131591975688934,
-0.023284578695893288,
-0.0683027058839798,
-0.0795915424823761,
0.10329344868659973,
-0.09696679562330246,
-0.2867029011249542,
-0.1601337045431137,
0.09716343134641647,
-0.03390426188707352,
0.04538685828447342,
0.0345175601541996,
-0.055035196244716644,
-0.03774288296699524,
-0.06267479807138443,
0.016922635957598686,
0.08349733799695969,
-0.04931895434856415,
-0.06888048350811005,
0.03910725191235542,
-0.05273688957095146,
-0.10799099504947662,
0.029093338176608086,
-0.06298977136611938,
-0.14468449354171753,
-0.0645982101559639,
-0.057003214955329895,
0.1441427767276764,
0.08693891763687134,
0.05513813719153404,
-0.06333814561367035,
-0.030135657638311386,
0.10691329836845398,
-0.11289516091346741,
0.02149765007197857,
-0.02096213772892952,
-0.039645966142416,
0.008358900435268879,
0.14426110684871674,
0.03135087341070175,
-0.05299295485019684,
-0.008125176653265953,
0.003172962460666895,
-0.07257787883281708,
-0.2193630188703537,
-0.10394631326198578,
-0.05106775462627411,
-0.015251874923706055,
0.056173667311668396,
0.062063854187726974,
0.04238395765423775,
-0.03439917787909508,
-0.08479594439268112,
-0.011675475165247917,
0.08523109555244446,
0.0612921267747879,
0.044217128306627274,
0.03197421878576279,
0.04140881448984146,
-0.07365452498197556,
-0.016979847103357315,
0.13200487196445465,
0.08613985776901245,
0.22678238153457642,
0.022986989468336105,
0.1674424409866333,
0.036467187106609344,
0.03251991793513298,
0.021651387214660645,
0.017124660313129425,
0.02318061701953411,
0.06721501052379608,
-0.06615056842565536,
-0.023321721702814102,
0.007434197701513767,
0.03600601106882095,
0.11635076254606247,
-0.06157006323337555,
-0.00978797022253275,
-0.15622112154960632,
0.09655257314443588,
0.24867235124111176,
0.02371075749397278,
-0.09370271116495132,
-0.07769738882780075,
0.0400918573141098,
-0.09885268658399582,
-0.03451886773109436,
0.012159489095211029,
0.11241979151964188,
-0.11178436875343323,
-0.010193949565291405,
0.0055369967594742775,
0.08163012564182281,
-0.13414444029331207,
0.01753033883869648,
0.02317075803875923,
-0.011232176795601845,
-0.0006492361426353455,
0.010390081442892551,
-0.07214383035898209,
0.12581492960453033,
0.02163451723754406,
0.07951977849006653,
-0.03283730149269104,
0.009439773857593536,
-0.008492615073919296,
-0.04240391030907631,
0.1469610035419464,
0.01229808758944273,
-0.12500283122062683,
-0.04377935826778412,
-0.1348823606967926,
0.06208841875195503,
0.05724136531352997,
-0.16468310356140137,
0.08584842085838318,
0.019173096865415573,
-0.02673802524805069,
-0.0533050112426281,
-0.09605275094509125,
-0.0649326965212822,
-0.15994790196418762,
0.06547222286462784,
-0.09704574942588806,
-0.0018880071584135294,
-0.04048041254281998,
-0.03807542473077774,
-0.04200517386198044,
0.11766742169857025,
-0.24072794616222382,
-0.05097951740026474,
-0.04139170050621033,
-0.1345575451850891,
0.08393610268831253,
-0.007584147155284882,
0.05094391852617264,
0.006350256036967039,
0.04713868349790573,
-0.013869940303266048,
-0.035654712468385696,
0.05607684329152107,
-0.06413345038890839,
-0.17111077904701233,
-0.06410564482212067,
0.08657437562942505,
0.1126459389925003,
0.04997336119413376,
0.018380412831902504,
0.08426153659820557,
0.05006680637598038,
-0.07741197943687439,
-0.05849490687251091,
0.055338241159915924,
-0.019525161013007164,
0.11663135886192322,
-0.05784115195274353,
-0.17110389471054077,
-0.0944293886423111,
-0.009390448220074177,
0.04007777199149132,
0.12084821611642838,
-0.015347518026828766,
0.17662012577056885,
0.24255073070526123,
-0.15000450611114502,
-0.16039517521858215,
-0.032893504947423935,
0.004043878987431526,
-0.02554880455136299,
-0.03513861075043678,
-0.16091370582580566,
0.0026768483221530914,
0.15777090191841125,
-0.006350582465529442,
0.02463112212717533,
-0.29846909642219543,
-0.11882603913545609,
0.00014664116315543652,
-0.01458217017352581,
0.0893440991640091,
-0.06871951371431351,
-0.09347966313362122,
-0.03549434617161751,
-0.022684959694743156,
0.13682153820991516,
-0.023077258840203285,
0.07067149877548218,
-0.015154710970818996,
-0.1480078399181366,
-0.0018920855363830924,
0.01950232684612274,
0.165113165974617,
0.02685382589697838,
0.00048796666669659317,
-0.037076011300086975,
0.025807509198784828,
0.08070790767669678,
-0.0067448001354932785,
-0.0029870010912418365,
0.05417434126138687,
-0.04380853474140167,
-0.08109356462955475,
-0.01740289479494095,
-0.06806867569684982,
0.020842649042606354,
-0.06139202043414116,
0.0419684462249279,
-0.08502380549907684,
0.06664425134658813,
0.07573607563972473,
0.0524304062128067,
0.016535423696041107,
-0.06494157016277313,
0.1014474481344223,
0.07085573673248291,
0.15441599488258362,
0.039960507303476334,
0.0059263454750180244,
0.000025741523131728172,
-0.0070227161049842834,
0.07857504487037659,
0.03860122710466385,
0.017494847998023033,
0.1199946403503418,
-0.043390627950429916,
0.05972908437252045,
0.008704578503966331,
-0.118955597281456,
0.05878959596157074,
0.0890398919582367,
-0.04498118907213211,
-0.09319064766168594,
0.013388308696448803,
-0.0962035059928894,
-0.002746524289250374,
-0.05485387146472931,
0.10047727078199387,
0.037071410566568375,
-0.06385570019483566,
0.012577338144183159,
0.04677202180027962,
0.027656883001327515,
0.07649222016334534,
-0.00832981988787651,
-0.020095065236091614,
-0.07236596196889877,
0.09894444048404694,
0.13979752361774445,
-0.1558360457420349,
-0.03418165072798729,
0.15641453862190247,
-0.07503083348274231,
-0.02136705256998539,
-0.1160861998796463,
0.05828070640563965,
-0.0927075669169426,
-0.09214749932289124,
-0.11169604957103729,
-0.11369144916534424,
0.055858224630355835,
0.14730560779571533,
-0.000054416945204138756,
0.029016200453042984,
-0.003967637661844492,
0.027271104976534843,
0.008133907802402973,
0.049375519156455994,
0.02860679291188717,
0.008895576000213623,
0.049961671233177185,
-0.02524782344698906,
0.022424541413784027,
-0.062063831835985184,
-0.004320287145674229,
-0.039568811655044556,
-0.07740003615617752,
0.0005472886841744184,
-0.15059806406497955,
0.04265942424535751,
-0.061655934900045395,
-0.006465582177042961,
-0.010011887177824974,
-0.0045112911611795425,
-0.03373674675822258,
-0.013839478604495525,
-0.025762949138879776,
-0.017126258462667465,
-0.07100865244865417,
0.0821647047996521,
-0.08650901913642883,
0.013352373614907265,
0.04852387309074402,
-0.07438687980175018,
0.060764044523239136,
-0.05169122666120529,
-0.04863379895687103,
0.023234425112605095,
-0.17636321485042572,
0.037968236953020096,
-0.02595827728509903,
0.020362375304102898,
0.003867853432893753,
-0.05186738446354866,
0.0185780581086874,
0.045532938092947006,
-0.013033615425229073,
0.017481982707977295,
0.09435334801673889,
-0.041583746671676636,
0.04884129390120506,
0.025189168751239777,
-0.08697739243507385,
-0.0048108575865626335,
0.0660933405160904,
0.10553811490535736,
0.038925230503082275,
0.06758466362953186,
-0.07933329790830612,
0.02677217684686184,
-0.047705117613077164,
-0.02297908067703247,
0.00847054272890091,
-0.004329552873969078,
-0.02697693556547165,
-0.015846187248826027,
0.01416032761335373,
0.061297982931137085,
0.1596653163433075,
-0.005216633901000023,
0.07214930653572083,
0.021654454991221428,
0.11164399236440659,
-0.15349990129470825,
0.06226231902837753,
-0.0830039381980896,
0.07583379745483398,
-0.007934927940368652,
-0.07016120851039886,
0.008001855574548244,
-0.014105364680290222,
-0.07580220699310303,
0.1085515171289444,
0.07652805745601654,
0.31312453746795654,
0.07970527559518814,
0.05757759138941765,
-0.049564436078071594,
-0.0027199829928576946,
0.1861964762210846,
-0.02856002375483513,
0.04263714700937271,
-0.004737982526421547,
-0.04126264899969101,
0.09983798861503601,
-0.16625165939331055,
0.08348982036113739,
0.02982962504029274,
-0.015113135799765587,
-0.030712511390447617,
-0.11529579758644104,
-0.008535156957805157,
-0.014094289392232895,
-0.04802033305168152,
-0.05877993255853653,
0.021686449646949768,
0.060000523924827576,
0.017749454826116562,
-0.010190826840698719,
0.06242160126566887,
-0.11861279606819153,
-0.09822431206703186,
0.10875551402568817,
0.011524710804224014,
0.1536748707294464,
0.03141317889094353,
0.01105274073779583,
0.00328793004155159,
0.12293682247400284,
0.06684164702892303,
0.10657641291618347,
0.05193546786904335,
0.01975247636437416,
-0.05988471582531929,
-0.04359646141529083,
0.052264899015426636,
0.006815762259066105,
-0.02060302533209324,
0.24099332094192505,
0.07055039703845978,
-0.04709361493587494,
0.07447308301925659,
0.08969399333000183,
0.0179380401968956,
0.03625722974538803,
-0.1525786817073822,
0.12980331480503082,
-0.008927518501877785,
0.0049652718007564545,
0.05576447397470474,
-0.10835134983062744,
0.028574103489518166,
0.12631827592849731,
0.14635545015335083,
0.021084221079945564,
0.0016343288589268923,
0.013206161558628082,
0.00026245415210723877,
0.05931473523378372,
0.04416114464402199,
0.0064776563085615635,
0.31862667202949524,
-0.036428194493055344,
-0.025247760117053986,
0.03646896779537201,
0.015681279823184013,
-0.06080847606062889,
0.03138720244169235,
-0.06035850569605827,
-0.00985187292098999,
-0.07336445152759552,
0.1333901733160019,
-0.10190355777740479,
-0.28165775537490845,
0.13972215354442596,
-0.08218219876289368,
-0.09223867952823639,
-0.017143620178103447,
0.006355304270982742,
0.0010260473936796188,
0.06927888840436935,
0.036035917699337006,
-0.0743996798992157,
0.11100008338689804,
0.014715151861310005,
-0.00973515398800373,
-0.09085133671760559,
0.08195038139820099,
-0.08366499841213226,
0.25274452567100525,
-0.01779267005622387,
0.02410256490111351,
0.06732004880905151,
0.018687386065721512,
-0.08296144753694534,
-0.0028562135994434357,
0.025264713913202286,
0.019037490710616112,
0.04096575081348419,
0.06927299499511719,
-0.006918728351593018,
0.13615518808364868,
0.0929957702755928,
0.016165336593985558,
0.11363804340362549,
-0.017922816798090935,
-0.02355550229549408,
-0.08573485910892487,
0.07232091575860977,
-0.11153483390808105,
0.10475096106529236,
0.12329846620559692,
0.017143715173006058,
0.057710543274879456,
-0.006233495194464922,
-0.03283064439892769,
-0.0380055233836174,
0.09998392313718796,
0.008434947580099106,
-0.15663498640060425,
0.06641457974910736,
-0.06510314345359802,
0.08652335405349731,
-0.15808339416980743,
-0.027390984818339348,
0.03766575828194618,
-0.03099336288869381,
-0.013345056213438511,
0.08017508685588837,
0.019876405596733093,
-0.014575740322470665,
-0.020337004214525223,
-0.052496377378702164,
0.0009853604715317488,
0.04516205936670303,
-0.03728707507252693,
-0.024130528792738914
] |
null | null | transformers |
# Spanish RoBERTa-large trained on BNE finetuned for CAPITEL Named Entity Recognition (NER) dataset.
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Training data](#training-data)
- [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
- [Variable and metrics](#variable-and-metrics)
- [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citing information](#citing-information)
- [Disclaimer](#disclaimer)
</details>
## Model description
The **roberta-large-bne-capitel-ner** is a Named Entity Recognition (NER) model for the Spanish language fine-tuned from the [roberta-large-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-large-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) large model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-large-bne-capitel-ner** model can be used to recognize Named Entities (NE). The model is limited by its training dataset and may not generalize well for all use cases.
## How to use
```python
from transformers import pipeline
from pprint import pprint
nlp = pipeline("ner", model="PlanTL-GOB-ES/roberta-large-bne-capitel-ner")
example = "Me llamo Francisco Javier y vivo en Madrid."
ner_results = nlp(example)
pprint(ner_results)
```
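The inference configuration declared in this card's metadata uses `aggregation_strategy: "first"`, which groups subword pieces back into whole entity spans. A minimal sketch of calling the pipeline that way on the example sentence above (the printed fields are the standard aggregated-pipeline output):

```python
from transformers import pipeline

# "first" aggregation gives each word the label of its first subword piece,
# so the pipeline returns whole entity spans instead of individual tokens.
nlp = pipeline(
    "ner",
    model="PlanTL-GOB-ES/roberta-large-bne-capitel-ner",
    aggregation_strategy="first",
)

for entity in nlp("Me llamo Francisco Javier y vivo en Madrid."):
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```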
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
The dataset used is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 1).
### Training procedure
The model was trained with a batch size of 32 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
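The exact fine-tuning code is in the GitHub repository referenced below; purely as an illustration, the hyperparameters stated above (batch size 32, learning rate 3e-5, 5 epochs, best checkpoint selected on the development set) could be reproduced with the `transformers` `Trainer` roughly as in this sketch. The dataset objects, number of labels and metric function are assumptions to be supplied by the caller, not part of this card:

```python
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          Trainer, TrainingArguments)


def finetune_capitel_ner(train_dataset, dev_dataset, num_labels, compute_metrics):
    """Sketch of fine-tuning roberta-large-bne with the hyperparameters from this card."""
    base = "PlanTL-GOB-ES/roberta-large-bne"
    tokenizer = AutoTokenizer.from_pretrained(base, add_prefix_space=True)
    model = AutoModelForTokenClassification.from_pretrained(base, num_labels=num_labels)

    args = TrainingArguments(
        output_dir="roberta-large-bne-capitel-ner",
        per_device_train_batch_size=32,   # batch size stated in the card
        learning_rate=3e-5,               # learning rate stated in the card
        num_train_epochs=5,               # epochs stated in the card
        evaluation_strategy="epoch",
        save_strategy="epoch",
        load_best_model_at_end=True,      # keep the checkpoint that is best on the dev set
        metric_for_best_model="f1",
    )

    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,      # tokenized CAPITEL sub-task 1 train split (assumed prepared)
        eval_dataset=dev_dataset,         # development split used for checkpoint selection
        tokenizer=tokenizer,
        compute_metrics=compute_metrics,  # e.g. a seqeval F1 metric, sketched below
    )
    trainer.train()
    return trainer
```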
## Evaluation
### Variable and metrics
This model was finetuned maximizing F1 score.
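CAPITEL-NERC is scored with entity-level F1 over the BIO label sequences; a common way to compute it during fine-tuning is with `seqeval`, as in the sketch below (the label list is illustrative, not the official CAPITEL tag set):

```python
import numpy as np
from seqeval.metrics import f1_score

# Illustrative BIO tags; the real label list comes from the CAPITEL annotation scheme.
label_list = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG", "B-OTH", "I-OTH"]


def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Positions labelled -100 (special tokens and non-first subwords) are ignored.
    true_labels = [[label_list[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [label_list[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    return {"f1": f1_score(true_labels, true_preds)}
```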
## Evaluation results
We evaluated the **roberta-large-bne-capitel-ner** on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:
| Model | CAPITEL-NERC (F1) |
| ------------|:----|
| roberta-large-bne-capitel-ner | **90.51** |
| roberta-base-bne-capitel-ner | 89.60|
| BETO | 87.72 |
| mBERT | 88.10 |
| BERTIN | 88.56 |
| ELECTRA | 80.35 |
For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish).
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a
Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial
Intelligence (SEDIA) within the framework of the Plan-TL.},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
title = {MarIA: Spanish Language Models},
volume = {68},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
year = {2022},
}
```
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "capitel", "ner"], "datasets": ["bne", "capitel"], "metrics": ["f1"], "inference": {"parameters": {"aggregation_strategy": "first"}}, "widget": ["Me llamo Francisco Javier y vivo en Madrid.", "Mi hermano Ram\u00f3n y su mejor amigo Luis trabajan en el BSC."], "model-index": [{"name": "roberta-large-bne-capiter-ner", "results": [{"task": {"type": "token-classification"}, "dataset": {"name": "CAPITEL-NERC", "type": "ner"}, "metrics": [{"type": "f1", "value": 0.9051, "name": "F1"}]}]}]} | token-classification | PlanTL-GOB-ES/roberta-large-bne-capitel-ner | [
"transformers",
"pytorch",
"roberta",
"token-classification",
"national library of spain",
"spanish",
"bne",
"capitel",
"ner",
"es",
"dataset:bne",
"dataset:capitel",
"arxiv:1907.11692",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| Spanish RoBERTa-large trained on BNE finetuned for CAPITEL Named Entity Recognition (NER) dataset.
==================================================================================================
Table of contents
-----------------
Click to expand
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
+ Training data
+ Training procedure
* Evaluation
+ Variable and metrics
+ Evaluation results
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citing information
+ Disclaimer
Model description
-----------------
The roberta-large-bne-capitel-ner is a Named Entity Recognition (NER) model for the Spanish language fine-tuned from the roberta-large-bne model, a RoBERTa large model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
roberta-large-bne-capitel-ner model can be used to recognize Named Entities (NE). The model is limited by its training dataset and may not generalize well for all use cases.
How to use
----------
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
Training
--------
The dataset used is the one from the CAPITEL competition at IberLEF 2020 (sub-task 1).
### Training procedure
The model was trained with a batch size of 32 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
Evaluation
----------
### Variable and metrics
This model was finetuned maximizing F1 score.
Evaluation results
------------------
We evaluated the roberta-large-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:
For more details, check the fine-tuning and evaluation scripts in the official GitHub repository.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
Citing information
------------------
If you use this model, please cite our paper:
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
| [
"### Training procedure\n\n\nThe model was trained with a batch size of 32 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------",
"### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-large-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\nCiting information\n------------------\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
"TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training procedure\n\n\nThe model was trained with a batch size of 32 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------",
"### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-large-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nApache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\nCiting information\n------------------\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
90,
65,
94,
28,
37,
22,
12,
50,
363
] | [
"passage: TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training procedure\n\n\nThe model was trained with a batch size of 32 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-large-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\nCiting information\n------------------\n\n\nIf you use this model, please cite our paper:"
] | [
-0.08379781991243362,
0.21478645503520966,
-0.007256380282342434,
0.06397680193185806,
0.11667671799659729,
-0.0264272503554821,
0.04127122089266777,
0.10152267664670944,
-0.03466814383864403,
0.12049250304698944,
-0.0038757636211812496,
0.0441327802836895,
0.10022305697202682,
0.11260434985160828,
0.045798979699611664,
-0.1798790991306305,
-0.010611284524202347,
-0.08511862903833389,
0.00901732873171568,
0.10038700699806213,
0.10086078941822052,
-0.052430856972932816,
0.05135425552725792,
-0.02697448618710041,
-0.019223671406507492,
0.07717937976121902,
-0.05613945424556732,
-0.0767878070473671,
0.06333515048027039,
0.05936616659164429,
0.05357305333018303,
0.01924976333975792,
0.04398352652788162,
-0.21642987430095673,
0.015277283266186714,
0.042067576199769974,
0.002809230936691165,
0.05188998952507973,
0.09416863322257996,
-0.04764454811811447,
0.20327679812908173,
-0.10630226135253906,
0.020687170326709747,
0.049265991896390915,
-0.10352376848459244,
-0.11049919575452805,
-0.1247875913977623,
0.06480927765369415,
0.08000481873750687,
0.03668840602040291,
-0.029004648327827454,
0.05767719820141792,
-0.05391734465956688,
0.016465185210108757,
0.08067122846841812,
-0.187440425157547,
-0.043616700917482376,
0.04089491814374924,
0.02347624860703945,
0.11395945399999619,
-0.07540358603000641,
-0.0033594470005482435,
0.050024718046188354,
-0.005895110312849283,
-0.002630513161420822,
-0.03210685774683952,
-0.056664157658815384,
0.017924919724464417,
-0.11923069506883621,
-0.10382840037345886,
0.13354580104351044,
0.0033762992825359106,
-0.08993898332118988,
-0.10903814435005188,
-0.010343354195356369,
0.00978673342615366,
0.030222268775105476,
-0.03862166032195091,
0.0367366224527359,
-0.018879029899835587,
0.06883779913187027,
-0.05169304087758064,
-0.0895562544465065,
-0.03739217296242714,
-0.04625089094042778,
0.06781657040119171,
0.012249604798853397,
-0.019830923527479172,
0.020378541201353073,
0.1302412450313568,
0.006716958712786436,
-0.1264864057302475,
-0.027843112125992775,
-0.002770893508568406,
-0.03990405797958374,
-0.04401085898280144,
0.025929125025868416,
-0.04711788520216942,
0.09712528437376022,
0.1915348321199417,
-0.0651039183139801,
0.01358813513070345,
-0.030863501131534576,
0.006772827822715044,
0.08605610579252243,
0.16966456174850464,
-0.07332304120063782,
-0.08742072433233261,
0.009665136225521564,
-0.0034943060018122196,
0.030563127249479294,
0.021475618705153465,
-0.033218882977962494,
0.014130443334579468,
0.04577399417757988,
0.1335119903087616,
0.07868345081806183,
-0.04491818696260452,
-0.07701422274112701,
-0.019034769386053085,
0.14795686304569244,
-0.15187717974185944,
0.031848035752773285,
0.013829614035785198,
-0.07028353959321976,
0.018163206055760384,
-0.046559546142816544,
-0.04065214470028877,
-0.11005813628435135,
0.04494043439626694,
-0.048634860664606094,
-0.03151252493262291,
-0.07006833702325821,
-0.04343749210238457,
0.07438315451145172,
-0.06015849858522415,
-0.0017716658767312765,
-0.09239528328180313,
-0.09311814606189728,
-0.08284393697977066,
0.046577952802181244,
-0.12503540515899658,
0.003223585430532694,
-0.02843206562101841,
-0.004743062891066074,
0.0018854840891435742,
-0.04470543563365936,
0.03600790351629257,
-0.058682363480329514,
0.059385996311903,
0.03758717700839043,
0.02418016828596592,
0.06009746342897415,
0.0006828616023994982,
-0.11668886244297028,
0.014719790779054165,
-0.1438879668712616,
0.10317067056894302,
-0.10076875239610672,
0.016340279951691628,
-0.1906743049621582,
-0.044373467564582825,
-0.005756731610745192,
0.012111334130167961,
0.057362496852874756,
0.18343918025493622,
-0.10876704007387161,
-0.04444880411028862,
0.1380884051322937,
-0.043865371495485306,
-0.05129753425717354,
0.1144823208451271,
0.006288380362093449,
0.06519436836242676,
0.07097697257995605,
0.09323491901159286,
0.12601728737354279,
-0.16674166917800903,
-0.050878699868917465,
-0.005427815485745668,
-0.01979766972362995,
0.06177472695708275,
0.11704623699188232,
-0.0953654870390892,
0.06388629227876663,
0.03926171362400055,
-0.12913517653942108,
-0.021597590297460556,
-0.01288830116391182,
-0.04321320354938507,
0.03426951542496681,
0.003723945003002882,
-0.033465202897787094,
-0.002138713141903281,
-0.026627082377672195,
-0.03271542116999626,
-0.10118145495653152,
-0.003635873319581151,
0.04117467254400253,
0.01770244538784027,
0.01150426920503378,
-0.09435490518808365,
0.09210244566202164,
-0.05085818096995354,
0.006324280519038439,
-0.18664304912090302,
-0.0060774050652980804,
0.044558085501194,
-0.06786337494850159,
0.11909353733062744,
-0.07351674884557724,
0.011820976622402668,
0.015769731253385544,
-0.03359626606106758,
-0.00770450197160244,
-0.028857121244072914,
-0.05370202660560608,
0.018762195482850075,
-0.11926262080669403,
0.005115638021379709,
-0.03178159147500992,
0.06658526510000229,
-0.09474891424179077,
0.009063760749995708,
0.11569223552942276,
0.08653449267148972,
-0.018447335809469223,
-0.039996884763240814,
0.01743406243622303,
0.023358270525932312,
-0.027456384152173996,
-0.06940852850675583,
0.031673286110162735,
0.010739422403275967,
-0.029719440266489983,
-0.031305521726608276,
-0.03787708654999733,
-0.05060433968901634,
0.06350873410701752,
0.11244465410709381,
-0.05887802690267563,
-0.036611445248126984,
-0.03539947420358658,
0.00881865806877613,
-0.053613584488630295,
-0.014344044029712677,
0.2117571234703064,
0.03160648047924042,
0.06888990849256516,
-0.13809221982955933,
-0.07531740516424179,
-0.0013142804382368922,
-0.04847858473658562,
-0.08169045299291611,
0.14609605073928833,
0.04620109871029854,
-0.07666920870542526,
0.0705212876200676,
-0.00827474519610405,
0.02986142598092556,
0.18335801362991333,
0.00006179586489452049,
-0.10486079752445221,
-0.039521001279354095,
0.08306125551462173,
0.037459325045347214,
0.0713297501206398,
-0.04000173136591911,
0.009596476331353188,
0.04422086477279663,
0.009841110557317734,
0.06947460770606995,
-0.1302659958600998,
0.031325895339250565,
0.0014460984384641051,
-0.06126323714852333,
0.0033167055808007717,
0.018280556425452232,
-0.02851930446922779,
0.08413273096084595,
0.04894288256764412,
0.048829857259988785,
-0.05342818424105644,
-0.033714231103658676,
-0.10503324121236801,
0.1523592174053192,
-0.07582655549049377,
-0.26046255230903625,
-0.23570501804351807,
0.02896891161799431,
-0.04443507641553879,
0.028741803020238876,
0.03892134502530098,
-0.09444394707679749,
-0.06431029736995697,
-0.0658838301897049,
0.02096833847463131,
0.07559517025947571,
-0.08247267454862595,
-0.004018290434032679,
0.05916329100728035,
0.002256319159641862,
-0.10570766776800156,
0.0009578604949638247,
0.046575091779232025,
-0.04823099076747894,
-0.03755657747387886,
0.010014270432293415,
0.1326511651277542,
0.09942717850208282,
0.019707072526216507,
-0.008200537413358688,
-0.005673017352819443,
0.1830018013715744,
-0.13935676217079163,
0.04495551809668541,
0.21751002967357635,
0.05694495141506195,
0.026107056066393852,
0.16400839388370514,
0.018870355561375618,
-0.05621841549873352,
0.021318845450878143,
0.019696401432156563,
-0.045864563435316086,
-0.266483336687088,
-0.06470039486885071,
-0.03620227053761482,
-0.04061129689216614,
0.06118404492735863,
0.08918396383523941,
-0.026725655421614647,
0.017478803172707558,
-0.042635347694158554,
-0.07160473614931107,
0.05590032413601875,
0.10263969749212265,
0.05516936630010605,
0.0185357928276062,
0.017328733578324318,
-0.06247296556830406,
-0.04752930626273155,
0.13559485971927643,
0.09356345236301422,
0.1133708581328392,
0.004423258360475302,
0.1544886827468872,
0.05554944649338722,
0.058037687093019485,
-0.04920487850904465,
0.024198226630687714,
0.029648790135979652,
0.014297788962721825,
-0.02947693131864071,
-0.07089633494615555,
-0.03666220232844353,
-0.0031570829451084137,
0.014639423228800297,
-0.028285592794418335,
-0.02808634378015995,
-0.10892341285943985,
0.08702007681131363,
0.10601934045553207,
-0.00499053904786706,
-0.16063888370990753,
-0.056487295776605606,
0.039344530552625656,
-0.07014855742454529,
-0.07381629943847656,
-0.024681488052010536,
0.027172837406396866,
-0.15489454567432404,
0.03595125302672386,
-0.009080027230083942,
0.10604643821716309,
-0.05624127760529518,
-0.028928376734256744,
-0.014758212491869926,
0.06044876202940941,
0.0028661140240728855,
0.1127808690071106,
-0.12320591509342194,
0.14582747220993042,
0.004138762131333351,
0.1031215712428093,
-0.035400133579969406,
0.05499708652496338,
-0.032244034111499786,
-0.004566703923046589,
0.15699924528598785,
-0.008671516552567482,
-0.013548646122217178,
-0.05292557552456856,
-0.056249771267175674,
0.0074478620663285255,
0.06369522213935852,
-0.12475081533193588,
0.10904281586408615,
-0.007814823649823666,
-0.014496929943561554,
-0.11071698367595673,
-0.11619789153337479,
-0.08240896463394165,
-0.16510416567325592,
0.031004076823592186,
-0.12362812459468842,
0.06827455759048462,
-0.045133743435144424,
-0.0458093024790287,
-0.00849772896617651,
0.1978093534708023,
-0.22144436836242676,
-0.08565331250429153,
-0.12864534556865692,
0.03318517655134201,
0.13225695490837097,
-0.08164165169000626,
0.0304021667689085,
-0.03674842789769173,
0.09457950294017792,
0.01642918400466442,
-0.025479432195425034,
0.017828302457928658,
-0.06276876479387283,
-0.1204257383942604,
-0.020216738805174828,
0.16162104904651642,
0.05969846993684769,
0.03750820830464363,
0.005045941099524498,
-0.012099314481019974,
0.01690051145851612,
-0.10454311966896057,
-0.04524010792374611,
0.07922104746103287,
0.12438427656888962,
0.07675640285015106,
-0.01454127300530672,
-0.150743767619133,
-0.122256800532341,
-0.07069827616214752,
0.06880787014961243,
0.2010871022939682,
-0.013833579607307911,
0.1120203360915184,
0.18099547922611237,
-0.12528325617313385,
-0.15410694479942322,
-0.0780915766954422,
0.05918313190340996,
0.020247116684913635,
0.02637374773621559,
-0.1789189875125885,
0.001993203070014715,
0.09748096019029617,
-0.007675994653254747,
-0.009975088760256767,
-0.27796822786331177,
-0.11803113669157028,
0.010502207092940807,
0.03529161959886551,
-0.088585264980793,
-0.12185333669185638,
-0.10398918390274048,
-0.055998627096414566,
-0.12334948778152466,
0.06626632809638977,
0.0028405007906258106,
0.046925608068704605,
-0.0012006033211946487,
0.012225543148815632,
0.04152623564004898,
-0.026114005595445633,
0.1921960860490799,
-0.0412948839366436,
0.020864443853497505,
-0.03746434673666954,
-0.0024691021535545588,
0.07864028960466385,
-0.013799325563013554,
0.11473607271909714,
0.0024623991921544075,
0.022419149056077003,
-0.09454662352800369,
-0.054281990975141525,
-0.04674908518791199,
0.04073549434542656,
-0.047536980360746384,
-0.009384074248373508,
-0.09769195318222046,
0.09746810793876648,
0.04098765179514885,
-0.013012435287237167,
0.031140822917222977,
-0.0800560712814331,
-0.0021523742470890284,
0.1714705228805542,
0.15121081471443176,
0.06291405111551285,
-0.03041486255824566,
-0.0019507139222696424,
-0.0009989264653995633,
0.025460585951805115,
-0.13014301657676697,
0.013059047982096672,
0.12602896988391876,
0.01619923673570156,
0.08081571757793427,
-0.032312218099832535,
-0.14237470924854279,
0.010814977809786797,
0.14496587216854095,
-0.04659327119588852,
-0.1401088833808899,
-0.01729217730462551,
-0.018286574631929398,
-0.1205434799194336,
-0.005225906614214182,
0.10858932882547379,
0.018768755719065666,
-0.06804399192333221,
0.021012907847762108,
0.06538867205381393,
-0.018977977335453033,
0.13683369755744934,
0.035891417413949966,
0.03891300410032272,
-0.05541902780532837,
0.1205686703324318,
0.11785320192575455,
-0.11026235669851303,
-0.019626952707767487,
0.0921664610505104,
-0.056465793401002884,
-0.02917429618537426,
0.04813098907470703,
0.0008593388483859599,
-0.11287130415439606,
-0.06736143678426743,
-0.07741740345954895,
-0.032123953104019165,
0.008026588708162308,
0.08026514947414398,
0.03109731897711754,
0.02169317752122879,
0.003964377101510763,
0.02996283769607544,
-0.04205597937107086,
0.07466931641101837,
0.09257476776838303,
-0.005360255483537912,
-0.0859508290886879,
0.003889822866767645,
0.0035629330668598413,
-0.014186940155923367,
-0.019274557009339333,
-0.019197605550289154,
-0.10966281592845917,
0.010519846342504025,
-0.051254332065582275,
0.03752264752984047,
-0.07904867827892303,
-0.004536288324743509,
-0.013860209845006466,
-0.03956585377454758,
-0.04913872480392456,
0.002888096496462822,
-0.02861836552619934,
-0.04215788096189499,
-0.052771154791116714,
0.1328088641166687,
-0.14393557608127594,
0.04943366348743439,
0.09227205812931061,
-0.0693369135260582,
0.07345262914896011,
-0.021090839058160782,
0.011734077706933022,
0.09379298239946365,
-0.19099272787570953,
0.04417887702584267,
0.004702477250248194,
0.048143479973077774,
0.029189053922891617,
-0.12228314578533173,
0.04386527091264725,
0.03758195787668228,
-0.047066330909729004,
0.022048506885766983,
0.02103746496140957,
-0.1069280132651329,
-0.00333955604583025,
0.013306017965078354,
-0.0775638222694397,
-0.06359649449586868,
0.10777644068002701,
0.12493978440761566,
0.02618483081459999,
0.10848197340965271,
-0.07056543231010437,
0.0013100863434374332,
-0.1380193829536438,
-0.013193865306675434,
-0.000882956141140312,
0.01815885491669178,
-0.02470361813902855,
-0.0492982380092144,
0.051123686134815216,
0.030431188642978668,
0.15577064454555511,
0.059505071491003036,
0.11464770138263702,
0.031585827469825745,
0.018823089078068733,
0.0490429513156414,
0.028899364173412323,
0.06205708906054497,
0.012069047428667545,
0.019078342244029045,
-0.030164070427417755,
-0.0153857646510005,
-0.04998401552438736,
-0.09101758152246475,
0.06195046380162239,
0.12779109179973602,
0.0973745584487915,
0.044468291103839874,
0.004071617964655161,
-0.04402764514088631,
-0.025148531422019005,
0.029532192274928093,
0.0017726159421727061,
0.003559374250471592,
-0.04356519505381584,
0.0929085835814476,
0.20757906138896942,
-0.19587773084640503,
0.10262245684862137,
-0.007351219188421965,
-0.0645427480340004,
-0.0563158355653286,
-0.2159198373556137,
-0.028879661113023758,
-0.08292122185230255,
0.03645087778568268,
-0.1040845736861229,
0.08102605491876602,
0.002653800416737795,
-0.0034983488731086254,
-0.07620734721422195,
0.07891049236059189,
-0.0467398464679718,
-0.1297864019870758,
0.06250786781311035,
0.014070918783545494,
0.09378369897603989,
0.010795444250106812,
0.09633044898509979,
0.0004957111086696386,
0.0813247486948967,
0.09395620226860046,
0.10750983655452728,
0.044310592114925385,
0.006947788409888744,
-0.06777621060609818,
-0.027384890243411064,
0.02474376931786537,
-0.0015406633028760552,
0.010640984401106834,
0.19415374100208282,
0.030390942469239235,
-0.028806360438466072,
0.018300341442227364,
0.25533005595207214,
-0.01446663960814476,
-0.05842715501785278,
-0.1367972195148468,
0.13883076608181,
0.03352122753858566,
0.061117056757211685,
0.02446872927248478,
-0.13703837990760803,
-0.03195249289274216,
0.12467306107282639,
0.09025038033723831,
0.008109286427497864,
-0.0345923975110054,
-0.010663251392543316,
0.01824643835425377,
0.010534266009926796,
0.05217259004712105,
0.040246762335300446,
0.2548605799674988,
-0.05694187059998512,
0.05395863205194473,
-0.037566378712654114,
0.01140133198350668,
-0.05375467240810394,
0.11639157682657242,
-0.040196742862463,
-0.011126195080578327,
-0.06478378176689148,
0.16069521009922028,
-0.07651091367006302,
-0.2896938621997833,
0.05338189750909805,
-0.03445180132985115,
-0.14578987658023834,
-0.004530387930572033,
0.034065380692481995,
-0.004086478613317013,
0.04432203620672226,
0.046446848660707474,
-0.029623720794916153,
0.06705188006162643,
0.03649868443608284,
-0.029798753559589386,
-0.0632467195391655,
0.04469066113233566,
-0.0665140450000763,
0.24008744955062866,
-0.009532214142382145,
0.08710453659296036,
0.10697592049837112,
-0.039223525673151016,
-0.1542803794145584,
0.045272741466760635,
0.06340276449918747,
-0.02825351431965828,
0.12097158282995224,
0.06686459481716156,
0.026313405483961105,
0.011132658459246159,
0.07399974018335342,
0.04626160115003586,
0.03591221570968628,
0.02203580178320408,
0.06422305852174759,
-0.16274115443229675,
0.11936473101377487,
-0.14142702519893646,
0.08129104226827621,
0.09366650879383087,
-0.04017515480518341,
0.06780295819044113,
-0.0730859711766243,
0.08663706481456757,
0.002555577550083399,
0.18847569823265076,
0.03428768366575241,
-0.17075975239276886,
0.02206636592745781,
-0.008894037455320358,
0.03903871029615402,
-0.20893201231956482,
-0.025441892445087433,
0.04561232402920723,
0.0027649537660181522,
-0.05518237128853798,
0.13372112810611725,
0.011942090466618538,
0.011998442932963371,
-0.00872504897415638,
-0.16870924830436707,
-0.01034911721944809,
0.07803214341402054,
-0.11449863761663437,
-0.006706498563289642
] |
null | null | transformers |
# Spanish RoBERTa-large trained on BNE finetuned for CAPITEL Part of Speech (POS) dataset
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Training data](#training-data)
- [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
- [Variable and metrics](#variable-and-metrics)
- [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citing information](#citing-information)
- [Disclaimer](#disclaimer)
</details>
## Model description
The **roberta-large-bne-capitel-pos** is a Part-of-speech-tagging (POS) model for the Spanish language fine-tuned from the [roberta-large-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-large-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) large model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-large-bne-capitel-pos** model can be used for Part-of-Speech (POS) tagging of a text. The model is limited by its training dataset and may not generalize well for all use cases.
## How to use
Here is how to use this model:
```python
from transformers import pipeline
from pprint import pprint
nlp = pipeline("token-classification", model="PlanTL-GOB-ES/roberta-large-bne-capitel-pos")
example = "El alcalde de Vigo, Abel Caballero, ha comenzado a colocar las luces de Navidad en agosto."
pos_results = nlp(example)
pprint(pos_results)
```
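This card's inference metadata sets `aggregation_strategy: "first"`, so the pipeline can return one tag per word rather than one per subword piece. A small sketch printing word/tag pairs for the example above:

```python
from transformers import pipeline

# "first" aggregation assigns each word the tag predicted for its first subword piece.
tagger = pipeline(
    "token-classification",
    model="PlanTL-GOB-ES/roberta-large-bne-capitel-pos",
    aggregation_strategy="first",
)

sentence = "El alcalde de Vigo, Abel Caballero, ha comenzado a colocar las luces de Navidad en agosto."
for token in tagger(sentence):
    print(f"{token['word']}\t{token['entity_group']}")
```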
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
The dataset used is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 2).
### Training procedure
The model was trained with a batch size of 16 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
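The official scripts are in the repository linked below; as background only, fine-tuning a token classifier on CAPITEL requires aligning the word-level POS labels with the subword pieces produced by the BNE tokenizer. A standard (assumed, not necessarily the authors' exact) preprocessing sketch labels the first subword of each word and masks the rest with `-100`; the resulting features would then be trained with the hyperparameters stated above (batch size 16, learning rate 3e-5, 5 epochs):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "PlanTL-GOB-ES/roberta-large-bne", add_prefix_space=True
)


def tokenize_and_align_labels(words, word_labels, label2id):
    """Tokenize a pre-split sentence and align word-level POS labels to subwords."""
    encoding = tokenizer(words, is_split_into_words=True, truncation=True)
    aligned = []
    previous_word = None
    for word_id in encoding.word_ids():
        if word_id is None:              # special tokens get no label
            aligned.append(-100)
        elif word_id != previous_word:   # first subword keeps the word's label
            aligned.append(label2id[word_labels[word_id]])
        else:                            # remaining subwords are ignored by the loss
            aligned.append(-100)
        previous_word = word_id
    encoding["labels"] = aligned
    return encoding
```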
## Evaluation
### Variable and metrics
This model was finetuned maximizing F1 score.
## Evaluation results
We evaluated the **roberta-large-bne-capitel-pos** on the CAPITEL-POS test set against standard multilingual and monolingual baselines:
| Model | CAPITEL-POS (F1) |
| ------------|:----|
| roberta-large-bne-capitel-pos | **98.56** |
| roberta-base-bne-capitel-pos | 98.46 |
| BETO | 98.36 |
| mBERT | 98.39 |
| BERTIN | 98.47 |
| ELECTRA | 98.16 |
For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish).
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a
Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial
Intelligence (SEDIA) within the framework of the Plan-TL.},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
title = {MarIA: Spanish Language Models},
volume = {68},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
year = {2022},
}
```
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "capitel", "pos"], "datasets": ["bne", "capitel"], "metrics": ["f1"], "inference": {"parameters": {"aggregation_strategy": "first"}}, "widget": [{"text": "Festival de San Sebasti\u00e1n: Johnny Depp recibir\u00e1 el premio Donostia en pleno rifirrafe judicial con Amber Heard"}, {"text": "El alcalde de Vigo, Abel Caballero, ha comenzado a colocar las luces de Navidad en agosto."}, {"text": "Gracias a los datos de la BNE, se ha podido lograr este modelo del lenguaje."}], "model-index": [{"name": "roberta-large-bne-capiter-pos", "results": [{"task": {"type": "token-classification"}, "dataset": {"name": "CAPITEL-POS", "type": "pos"}, "metrics": [{"type": "f1", "value": 0.986, "name": "F1"}]}]}]} | token-classification | PlanTL-GOB-ES/roberta-large-bne-capitel-pos | [
"transformers",
"pytorch",
"roberta",
"token-classification",
"national library of spain",
"spanish",
"bne",
"capitel",
"pos",
"es",
"dataset:bne",
"dataset:capitel",
"arxiv:1907.11692",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #pos #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| Spanish RoBERTa-large trained on BNE finetuned for CAPITEL Part of Speech (POS) dataset
=======================================================================================
Table of contents
-----------------
Click to expand
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
+ Training data
+ Training procedure
* Evaluation
+ Variable and metrics
+ Evaluation results
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citing information
+ Disclaimer
Model description
-----------------
The roberta-large-bne-capitel-pos is a Part-of-speech-tagging (POS) model for the Spanish language fine-tuned from the roberta-large-bne model, a RoBERTa large model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
The roberta-large-bne-capitel-pos model can be used for Part-of-Speech (POS) tagging of a text. The model is limited by its training dataset and may not generalize well for all use cases.
How to use
----------
Here is how to use this model:
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
Training
--------
The dataset used is the one from the CAPITEL competition at IberLEF 2020 (sub-task 2).
### Training procedure
The model was trained with a batch size of 16 and a learning rate of 3e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
Evaluation
----------
### Variable and metrics
This model was finetuned maximizing F1 score.
Evaluation results
------------------
We evaluated the roberta-large-bne-capitel-pos on the CAPITEL-POS test set against standard multilingual and monolingual baselines:
For more details, check the fine-tuning and evaluation scripts in the official GitHub repository.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our paper:
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
null | null | transformers |
# Spanish RoBERTa-large trained on BNE finetuned for Spanish Question Answering Corpus (SQAC) dataset.
## Table of contents
<details>
<summary>Click to expand</summary>
- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
  - [Variable and metrics](#variable-and-metrics)
  - [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citing information](#citing-information)
  - [Disclaimer](#disclaimer)
</details>
## Model description
The **roberta-large-bne-sqac** is a Question Answering (QA) model for the Spanish language fine-tuned from the [roberta-large-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-large-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) large model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-large-bne-sqac** model can be used for extractive question answering in Spanish. It is limited by its training dataset and may not generalize well to all use cases.
## How to use
```python
from transformers import pipeline

# Load the fine-tuned checkpoint through the extractive QA pipeline
nlp = pipeline("question-answering", model="PlanTL-GOB-ES/roberta-large-bne-sqac")

text = "¿Dónde vivo?"
context = "Me llamo Wolfgang y vivo en Berlin"

# The answer is extracted as a span of the given context
qa_results = nlp(question=text, context=context)
print(qa_results)
```
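The pipeline returns a dictionary with `score`, `start`, `end` and `answer` keys, where `start` and `end` are character offsets of the extracted answer span within the context.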
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
### Training data
We used the QA dataset in Spanish called [SQAC corpus](https://huggingface.co/datasets/PlanTL-GOB-ES/SQAC) for training and evaluation.
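For reference, the corpus can be loaded directly from the Hugging Face Hub. The split name and the SQuAD-style field layout (`question`, `context`, `answers`) shown below come from the dataset card rather than from this model card, so treat them as assumptions.

```python
from datasets import load_dataset

# Depending on the installed version of `datasets`, loading a script-based
# dataset such as this one may additionally require trust_remote_code=True.
sqac = load_dataset("PlanTL-GOB-ES/SQAC")

example = sqac["train"][0]
print(example["question"])
print(example["answers"])
```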
### Training procedure
The model was trained with a batch size of 16 and a learning rate of 1e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
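The card does not show how (question, context, answer) triples are turned into model inputs; below is a minimal sketch of the usual sliding-window preprocessing for extractive QA. The `max_length` and `doc_stride` values are illustrative assumptions, not the settings used to train this model.

```python
from transformers import AutoTokenizer

# Tokenizer of the pre-trained checkpoint this model was fine-tuned from
tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/roberta-large-bne")

def preprocess(question, context, max_length=384, doc_stride=128):
    # Question first, context second; long contexts are split into overlapping
    # windows so that no part of the context is discarded.
    return tokenizer(
        question,
        context,
        truncation="only_second",
        max_length=max_length,
        stride=doc_stride,
        return_overflowing_tokens=True,
        return_offsets_mapping=True,  # offsets are later mapped to answer start/end tokens
        padding="max_length",
    )
```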
## Evaluation results
We evaluated the **roberta-large-bne-sqac** on the SQAC test set against standard multilingual and monolingual baselines:
| Model | SQAC (F1) |
| ------------|:----|
| roberta-large-bne-sqac | **82.02** |
| roberta-base-bne-sqac | 79.23|
| BETO | 79.23 |
| mBERT | 75.62 |
| BERTIN | 76.78 |
| ELECTRA | 73.83 |
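The F1 figure above can be reproduced with the standard SQuAD metric. The snippet below is not the official evaluation script: the use of the `evaluate` library, the `test` split name and the SQuAD-style `answers` field are assumptions.

```python
import evaluate
from datasets import load_dataset
from transformers import pipeline

qa = pipeline("question-answering", model="PlanTL-GOB-ES/roberta-large-bne-sqac")
squad = evaluate.load("squad")  # reports exact_match and f1

test_set = load_dataset("PlanTL-GOB-ES/SQAC", split="test")

predictions, references = [], []
for ex in test_set:
    pred = qa(question=ex["question"], context=ex["context"])
    predictions.append({"id": ex["id"], "prediction_text": pred["answer"]})
    references.append({"id": ex["id"], "answers": ex["answers"]})

print(squad.compute(predictions=predictions, references=references))
```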
For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish).
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
[Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{maria2022,
abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a
Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial
Intelligence (SEDIA) within the framework of the Plan-TL.},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
title = {MarIA: Spanish Language Models},
volume = {68},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
year = {2022},
}
```
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) or the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "qa", "question answering"], "datasets": ["PlanTL-GOB-ES/SQAC"], "metrics": ["f1", "exact match"], "model-index": [{"name": "roberta-large-bne-sqac", "results": [{"task": {"type": "question-answering"}, "dataset": {"name": "SQAC", "type": "PlanTL-GOB-ES/SQAC"}, "metrics": [{"type": "f1", "value": 0.8202, "name": "F1"}]}]}]} | question-answering | PlanTL-GOB-ES/roberta-large-bne-sqac | [
"transformers",
"pytorch",
"roberta",
"question-answering",
"national library of spain",
"spanish",
"bne",
"qa",
"question answering",
"es",
"dataset:PlanTL-GOB-ES/SQAC",
"arxiv:1907.11692",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #question-answering #national library of spain #spanish #bne #qa #question answering #es #dataset-PlanTL-GOB-ES/SQAC #arxiv-1907.11692 #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
| Spanish RoBERTa-large trained on BNE finetuned for Spanish Question Answering Corpus (SQAC) dataset.
====================================================================================================
Table of contents
-----------------
Click to expand
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
* Training
+ Training data
+ Training procedure
* Evaluation
* Evaluation
+ Variable and metrics
+ Evaluation results
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citing information
+ Disclaimer
Model description
-----------------
The roberta-large-bne-sqac is a Question Answering (QA) model for the Spanish language fine-tuned from the roberta-large-bne model, a RoBERTa large model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
roberta-large-bne-sqac model can be used for extractive question answering. The model is limited by its training dataset and may not generalize well for all use cases.
How to use
----------
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
Training
--------
### Training data
We used the QA dataset in Spanish called SQAC corpus for training and evaluation.
### Training procedure
The model was trained with a batch size of 16 and a learning rate of 1e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.
Evaluation results
------------------
We evaluated the roberta-large-bne-sqac on the SQAC test set against standard multilingual and monolingual baselines:
For more details, check the fine-tuning and evaluation scripts in the official GitHub repository.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
### Citing information
If you use this model, please cite our paper:
### Disclaimer
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) or the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
0.0005659126327373087,
0.03339425474405289,
0.005418465938419104,
0.048764947801828384,
0.04479280114173889,
-0.03997765854001045,
0.0918283462524414,
0.04360337182879448,
-0.021882152184844017,
-0.04318848252296448,
0.05893206223845482,
-0.09324302524328232,
0.21126383543014526,
-0.007991684600710869,
0.0958215743303299,
0.0960816815495491,
-0.06659089028835297,
-0.15989410877227783,
0.028563547879457474,
0.05313119664788246,
-0.04273563623428345,
0.11058057844638824,
0.08778968453407288,
0.02844858169555664,
-0.01602998748421669,
0.07766705006361008,
0.05504337698221207,
0.037428081035614014,
0.038866639137268066,
0.08091434836387634,
-0.16555118560791016,
0.10914255678653717,
-0.1297650784254074,
0.07448481023311615,
0.08184973150491714,
-0.031110772863030434,
0.08561094105243683,
-0.05350618064403534,
0.0935383141040802,
0.014149731956422329,
0.16747266054153442,
0.030296582728624344,
-0.17387327551841736,
0.0275461096316576,
-0.03185417130589485,
0.04185352474451065,
-0.18330629169940948,
-0.019794147461652756,
0.041157472878694534,
0.0035768060479313135,
-0.05514990910887718,
0.1346326470375061,
0.02786077931523323,
0.024721454828977585,
-0.005178872495889664,
-0.19351859390735626,
-0.015034646727144718,
0.07487469911575317,
-0.12135998159646988,
0.004579813685268164
] |
null | null | transformers | # RoBERTa large trained with data from the National Library of Spain (BNE)
## Table of Contents
<details>
<summary>Click to expand</summary>
- [Overview](#overview)
- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
- [Training data](#training-data)
- [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
- [Author](#author)
- [Contact information](#contact-information)
- [Copyright](#copyright)
- [Licensing information](#licensing-information)
- [Funding](#funding)
- [Citation Information](#citation-information)
- [Disclaimer](#disclaimer)
</details>
## Overview
- **Architecture:** roberta-large
- **Language:** Spanish
- **Task:** fill-mask
- **Data:** BNE
## Model description
The **roberta-large-bne** is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) large model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
## Intended uses and limitations
The **roberta-large-bne** model is ready-to-use only for masked language modeling to perform the Fill Mask task (try the inference API or read the next section).
However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition.
You can use the raw model for fill-mask prediction or fine-tune it on a downstream task.
## How to use
Here is how to use this model:
```python
>>> from transformers import pipeline
>>> from pprint import pprint
>>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/roberta-large-bne')
>>> pprint(unmasker("Gracias a los datos de la BNE se ha podido <mask> este modelo del lenguaje."))
[{'score': 0.0664491355419159,
'sequence': ' Gracias a los datos de la BNE se ha podido conocer este modelo del lenguaje.',
'token': 1910,
'token_str': ' conocer'},
{'score': 0.0492338091135025,
'sequence': ' Gracias a los datos de la BNE se ha podido realizar este modelo del lenguaje.',
'token': 2178,
'token_str': ' realizar'},
{'score': 0.03890657424926758,
'sequence': ' Gracias a los datos de la BNE se ha podido reconstruir este modelo del lenguaje.',
'token': 23368,
'token_str': ' reconstruir'},
{'score': 0.03662774711847305,
'sequence': ' Gracias a los datos de la BNE se ha podido desarrollar este modelo del lenguaje.',
'token': 3815,
'token_str': ' desarrollar'},
{'score': 0.030557377263903618,
'sequence': ' Gracias a los datos de la BNE se ha podido estudiar este modelo del lenguaje.',
'token': 6361,
'token_str': ' estudiar'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
>>> from transformers import RobertaTokenizer, RobertaModel
>>> tokenizer = RobertaTokenizer.from_pretrained('PlanTL-GOB-ES/roberta-large-bne')
>>> model = RobertaModel.from_pretrained('PlanTL-GOB-ES/roberta-large-bne')
>>> text = "Gracias a los datos de la BNE se ha podido desarrollar este modelo del lenguaje."
>>> encoded_input = tokenizer(text, return_tensors='pt')
>>> output = model(**encoded_input)
>>> print(output.last_hidden_state.shape)
torch.Size([1, 19, 1024])
```
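The model card above notes that **roberta-large-bne** is intended to be fine-tuned on downstream tasks such as text classification. As an orientation only, here is a minimal fine-tuning sketch; the example sentences, labels, output directory, and hyperparameters are illustrative placeholders and not part of the original training setup:
```python
# Minimal fine-tuning sketch (illustrative only): adapts roberta-large-bne to a
# toy binary text-classification task. The data and hyperparameters are made up.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

model_name = "PlanTL-GOB-ES/roberta-large-bne"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder training examples; replace with a real labelled corpus.
texts = ["Me encantó la película.", "El servicio fue terrible."]
labels = [1, 0]
encodings = tokenizer(texts, truncation=True, padding=True)

class ToyDataset(torch.utils.data.Dataset):
    def __init__(self, encodings, labels):
        self.encodings = encodings
        self.labels = labels
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item
    def __len__(self):
        return len(self.labels)

training_args = TrainingArguments(
    output_dir="./roberta-large-bne-toy",   # placeholder path
    num_train_epochs=1,
    per_device_train_batch_size=2,
    logging_steps=1,
)
trainer = Trainer(model=model, args=training_args,
                  train_dataset=ToyDataset(encodings, labels))
trainer.train()
```
In practice, the toy in-memory dataset would be replaced with a real labelled corpus (for instance loaded with the `datasets` library) and the hyperparameters tuned on a development set, as was done for the evaluation results reported below.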
## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
## Training
### Training data
The [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.
To obtain a high-quality training corpus, the corpus was preprocessed with a pipeline of operations including, among others, sentence splitting, language detection, filtering of badly-formed sentences, and deduplication of repetitive content. Document boundaries were kept during the process. This resulted in 2TB of clean Spanish corpus. A further global deduplication pass was then applied, resulting in 570GB of text.
Some of the statistics of the corpus:
| Corpora | Number of documents | Number of tokens | Size (GB) |
|---------|---------------------|------------------|-----------|
| BNE | 201,080,084 | 135,733,450,668 | 570 |
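The exact preprocessing pipeline is not reproduced here, but as a rough illustration of the steps listed above (sentence splitting, filtering of badly-formed sentences, and deduplication), a simplified sketch could look like the following; the helper functions and thresholds are arbitrary choices made for the example and are not the actual code used to build the corpus:
```python
# Illustrative sketch only: shows the kinds of cleaning steps described above.
import hashlib
import re

def split_sentences(document: str) -> list[str]:
    # Naive sentence splitting on end-of-sentence punctuation.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", document) if s.strip()]

def looks_well_formed(sentence: str) -> bool:
    # Toy filter for badly-formed sentences: length and alphabetic-ratio checks.
    letters = sum(ch.isalpha() for ch in sentence)
    return len(sentence) > 20 and letters / max(len(sentence), 1) > 0.7

def deduplicate(sentences: list[str]) -> list[str]:
    # Hash-based exact deduplication of repeated sentences.
    seen, kept = set(), []
    for s in sentences:
        h = hashlib.md5(s.lower().encode("utf-8")).hexdigest()
        if h not in seen:
            seen.add(h)
            kept.append(s)
    return kept

doc = ("La Biblioteca Nacional rastrea los dominios .es. "
       "La Biblioteca Nacional rastrea los dominios .es.")
clean = deduplicate([s for s in split_sentences(doc) if looks_well_formed(s)])
print(clean)  # the repeated sentence is kept only once
```
The real pipeline additionally performs language detection, and a global, corpus-level deduplication at the 2TB scale would rely on approximate techniques (e.g. MinHash/LSH) rather than exact hashing.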
### Training procedure
The training corpus has been tokenized using a byte-level version of Byte-Pair Encoding (BPE), as used in the original [RoBERTa](https://arxiv.org/abs/1907.11692) model, with a vocabulary size of 50,262 tokens.
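To see this byte-level BPE vocabulary in action, the released tokenizer can be loaded and inspected directly; the example sentence is arbitrary:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/roberta-large-bne")

# Vocabulary size should correspond to the 50,262 BPE tokens described above.
print(tokenizer.vocab_size)

# Frequent Spanish words typically map to a single token, while rarer strings
# are split into byte-level subword pieces.
print(tokenizer.tokenize("Gracias a los datos de la BNE."))
```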
The **roberta-large-bne** pre-training consists of masked language model training, following the approach employed for RoBERTa large. Training lasted a total of 96 hours on 32 computing nodes, each with 4 NVIDIA V100 GPUs of 16GB VRAM.
## Evaluation
When fine-tuned on downstream tasks, this model achieves the following results:
| Dataset | Metric | [**RoBERTa-large**](https://huggingface.co/PlanTL-GOB-ES/roberta-large-bne) |
|--------------|----------|------------|
| MLDoc | F1 | 0.9702 |
| CoNLL-NERC | F1 | 0.8823 |
| CAPITEL-NERC | F1 | 0.9051 |
| PAWS-X | F1 | 0.9150 |
| UD-POS | F1 | 0.9904 |
| CAPITEL-POS | F1 | 0.9856 |
| SQAC | F1 | 0.8202 |
| STS | Combined | 0.8411 |
| XNLI | Accuracy | 0.8263 |
For more evaluation details visit our [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish) or [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405).
## Additional information
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])
### Contact information
For further information, send an email to <[email protected]>
### Copyright
Copyright by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://portal.mineco.gob.es/en-us/digitalizacionIA/Pages/sedia.aspx) (2022)
### Licensing information
This work is licensed under the [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0).
### Funding
This work was funded by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://portal.mineco.gob.es/en-us/digitalizacionIA/Pages/sedia.aspx) within the framework of the Plan-TL.
### Citation information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{gutierrez-fandino-2022-maria,
abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a
Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial
Intelligence (SEDIA) within the framework of the Plan-TL.},
author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
doi = {10.26342/2022-68-3},
issn = {1135-5948},
journal = {Procesamiento del Lenguaje Natural},
keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
title = {MarIA: Spanish Language Models},
volume = {68},
url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
year = {2022},
}
```
### Disclaimer
<details>
<summary>Click to expand</summary>
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
</details> | {"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "roberta-large-bne"], "datasets": ["bne"], "metrics": ["ppl"], "widget": [{"text": "Por la ventanilla del coche vi la Giralda y pens\u00e9 que bonita que es la ciudad de <mask>."}, {"text": "M\u00e1s vale <mask> que lamentar."}, {"text": "Caminante no hay camino, se hace camino al <mask>."}, {"text": "Tengo una pelota roja y otra amarilla. Si le doy la roja a Jose, s\u00f3lo me queda la <mask>."}, {"text": "Tengo una pelota roja y otra amarilla. Si le doy la amarilla a Jose, s\u00f3lo me queda la <mask>."}, {"text": "El <mask> es el pico m\u00e1s alto de Espa\u00f1a."}]} | fill-mask | PlanTL-GOB-ES/roberta-large-bne | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"national library of spain",
"spanish",
"bne",
"roberta-large-bne",
"es",
"dataset:bne",
"arxiv:1907.11692",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1907.11692"
] | [
"es"
] | TAGS
#transformers #pytorch #roberta #fill-mask #national library of spain #spanish #bne #roberta-large-bne #es #dataset-bne #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| RoBERTa large trained with data from the National Library of Spain (BNE)
========================================================================
Table of Contents
-----------------
Click to expand
* Overview
* Model description
* Intended uses and limitations
* How to use
* Limitations and bias
* Training
+ Training data
+ Training procedure
* Evaluation
* Additional information
+ Author
+ Contact information
+ Copyright
+ Licensing information
+ Funding
+ Citation Information
+ Disclaimer
Overview
--------
* Architecture: roberta-large
* Language: Spanish
* Task: fill-mask
* Data: BNE
Model description
-----------------
The roberta-large-bne is a transformer-based masked language model for the Spanish language. It is based on the RoBERTa large model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019.
Intended uses and limitations
-----------------------------
The roberta-large-bne model is ready-to-use only for masked language modeling to perform the Fill Mask task (try the inference API or read the next section).
However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition.
You can use the raw model for fill mask or fine-tune it to a downstream task.
How to use
----------
Here is how to use this model:
Here is how to use this model to get the features of a given text in PyTorch:
Limitations and bias
--------------------
At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.
Training
--------
### Training data
The National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.
To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.
Some of the statistics of the corpus:
### Training procedure
The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.
The roberta-large-bne pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa large. The training lasted a total of 96 hours with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.
Evaluation
----------
When fine-tuned on downstream tasks, this model achieves the following results:
Dataset: MLDoc, Metric: F1, RoBERTa-large: 0.9702
Dataset: CoNLL-NERC, Metric: F1, RoBERTa-large: 0.8823
Dataset: CAPITEL-NERC, Metric: F1, RoBERTa-large: 0.9051
Dataset: PAWS-X, Metric: F1, RoBERTa-large: 0.9150
Dataset: UD-POS, Metric: F1, RoBERTa-large: 0.9904
Dataset: CAPITEL-POS, Metric: F1, RoBERTa-large: 0.9856
Dataset: SQAC, Metric: F1, RoBERTa-large: 0.8202
Dataset: STS, Metric: Combined, RoBERTa-large: 0.8411
Dataset: XNLI, Metric: Accuracy, RoBERTa-large: 0.8263
For more evaluation details visit our GitHub repository or paper.
Additional information
----------------------
### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)
### Contact information
For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)
### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Licensing information
This work is licensed under a Apache License, Version 2.0
### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.
information
If you use this model, please cite our paper:
### Disclaimer
Click to expand
The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.
Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
| [
"### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:",
"### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe roberta-large-bne pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa large. The training lasted a total of 96 hours with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nDataset: MLDoc, Metric: F1, RoBERTa-large: 0.9702\nDataset: CoNLL-NERC, Metric: F1, RoBERTa-large: 0.8823\nDataset: CAPITEL-NERC, Metric: F1, RoBERTa-large: 0.9051\nDataset: PAWS-X, Metric: F1, RoBERTa-large: 0.9150\nDataset: UD-POS, Metric: F1, RoBERTa-large: 0.9904\nDataset: CAPITEL-POS, Metric: F1, RoBERTa-large: 0.9856\nDataset: SQAC, Metric: F1, RoBERTa-large: 0.8202\nDataset: STS, Metric: Combined, RoBERTa-large: 0.8411\nDataset: XNLI, Metric: Accuracy, RoBERTa-large: 0.8263\n\n\nFor more evaluation details visit our GitHub repository or paper.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
"TAGS\n#transformers #pytorch #roberta #fill-mask #national library of spain #spanish #bne #roberta-large-bne #es #dataset-bne #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:",
"### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe roberta-large-bne pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa large. The training lasted a total of 96 hours with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nDataset: MLDoc, Metric: F1, RoBERTa-large: 0.9702\nDataset: CoNLL-NERC, Metric: F1, RoBERTa-large: 0.8823\nDataset: CAPITEL-NERC, Metric: F1, RoBERTa-large: 0.9051\nDataset: PAWS-X, Metric: F1, RoBERTa-large: 0.9150\nDataset: UD-POS, Metric: F1, RoBERTa-large: 0.9904\nDataset: CAPITEL-POS, Metric: F1, RoBERTa-large: 0.9856\nDataset: SQAC, Metric: F1, RoBERTa-large: 0.8202\nDataset: STS, Metric: Combined, RoBERTa-large: 0.8411\nDataset: XNLI, Metric: Accuracy, RoBERTa-large: 0.8263\n\n\nFor more evaluation details visit our GitHub repository or paper.\n\n\nAdditional information\n----------------------",
"### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)",
"### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)",
"### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)",
"### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0",
"### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\n\n\nIf you use this model, please cite our paper:",
"### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos."
] | [
83,
160,
383,
28,
37,
22,
19,
46,
364
] | [
"passage: TAGS\n#transformers #pytorch #roberta #fill-mask #national library of spain #spanish #bne #roberta-large-bne #es #dataset-bne #arxiv-1907.11692 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:",
"passage: ### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe roberta-large-bne pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa large. The training lasted a total of 96 hours with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nDataset: MLDoc, Metric: F1, RoBERTa-large: 0.9702\nDataset: CoNLL-NERC, Metric: F1, RoBERTa-large: 0.8823\nDataset: CAPITEL-NERC, Metric: F1, RoBERTa-large: 0.9051\nDataset: PAWS-X, Metric: F1, RoBERTa-large: 0.9150\nDataset: UD-POS, Metric: F1, RoBERTa-large: 0.9904\nDataset: CAPITEL-POS, Metric: F1, RoBERTa-large: 0.9856\nDataset: SQAC, Metric: F1, RoBERTa-large: 0.8202\nDataset: STS, Metric: Combined, RoBERTa-large: 0.8411\nDataset: XNLI, Metric: Accuracy, RoBERTa-large: 0.8263\n\n\nFor more evaluation details visit our GitHub repository or paper.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\n\n\nIf you use this model, please cite our paper:"
] | [
-0.049707505851984024,
0.14850088953971863,
-0.0060951923951506615,
0.041039470583200455,
0.09856148064136505,
0.026904316619038582,
0.04208232834935188,
0.0949828028678894,
-0.09204181283712387,
0.057494234293699265,
0.036024078726768494,
0.09942222386598587,
0.06347296386957169,
0.019261378794908524,
0.022196553647518158,
-0.19059814512729645,
0.0321689248085022,
-0.08019870519638062,
-0.11664091050624847,
0.029294589534401894,
0.09344207495450974,
-0.04072609543800354,
0.039390116930007935,
-0.0670018196105957,
-0.020130256190896034,
0.03851853683590889,
-0.02768240123987198,
-0.1146654486656189,
0.08668411523103714,
0.07225455343723297,
0.06438881158828735,
0.012173384428024292,
0.05180448666214943,
-0.12427737563848495,
0.0028135483153164387,
0.049561843276023865,
-0.027512816712260246,
0.04637029021978378,
0.04086323827505112,
-0.019040130078792572,
0.16325734555721283,
-0.11255750060081482,
0.03790140151977539,
0.0010813740082085133,
-0.14021456241607666,
-0.2079433798789978,
-0.08715642988681793,
-0.04467444121837616,
0.04758991301059723,
0.037929706275463104,
-0.0008509587496519089,
0.027470940724015236,
-0.07293430715799332,
0.008782701566815376,
0.036250919103622437,
-0.1911497712135315,
-0.037110645323991776,
0.0768825113773346,
0.051872916519641876,
0.12324030697345734,
-0.047053176909685135,
0.005570121109485626,
0.01653122901916504,
0.03296632319688797,
-0.009894616901874542,
-0.0727134495973587,
-0.06650283932685852,
0.003832782618701458,
-0.07301897555589676,
-0.10282570123672485,
0.18562088906764984,
-0.0580630861222744,
-0.07785563170909882,
-0.05424875020980835,
-0.011173872277140617,
0.01752377673983574,
0.00794508308172226,
-0.017633629962801933,
0.04574337974190712,
-0.0005167145282030106,
0.04875416308641434,
-0.03661473095417023,
-0.08263881504535675,
-0.030492380261421204,
-0.12014657258987427,
0.045959945768117905,
0.021283896639943123,
0.010329922661185265,
0.014913264662027359,
0.06538531184196472,
0.029645299538969994,
-0.06162500008940697,
0.007040003314614296,
-0.009416776709258556,
-0.0449376218020916,
0.017727335914969444,
-0.03770258650183678,
-0.10009735822677612,
0.02739124558866024,
0.09533780813217163,
-0.08499401807785034,
0.01074889861047268,
-0.034761350601911545,
0.029015440493822098,
0.06930506229400635,
0.06024497002363205,
-0.050521381199359894,
-0.03950913995504379,
-0.008369185030460358,
-0.04257822036743164,
0.00048261601477861404,
-0.0016571669839322567,
-0.07193440198898315,
-0.018617188557982445,
-0.014765365049242973,
0.08141570538282394,
0.011281379498541355,
-0.011136951856315136,
-0.014672607183456421,
-0.03374674916267395,
0.07056711614131927,
-0.10299722105264664,
-0.013703063130378723,
0.00524875707924366,
-0.05960968881845474,
-0.053305380046367645,
-0.008062999695539474,
-0.03171171620488167,
-0.06590297818183899,
0.026058878749608994,
-0.06291550397872925,
-0.08941873162984848,
-0.06600219011306763,
-0.09936390072107315,
0.05587456747889519,
-0.13050420582294464,
-0.015201580710709095,
-0.10791443288326263,
-0.1255398541688919,
-0.0769268125295639,
0.04713011533021927,
-0.08640366047620773,
0.04888859763741493,
-0.05768745020031929,
-0.00367767084389925,
0.012315934523940086,
-0.03348206728696823,
0.12048786878585815,
-0.042175956070423126,
0.09728748351335526,
-0.03820852190256119,
0.0666583925485611,
-0.05871681123971939,
0.031502820551395416,
-0.08095051348209381,
0.003538830205798149,
-0.09869697690010071,
0.10867540538311005,
-0.06827917695045471,
-0.002424144884571433,
-0.10709574818611145,
-0.02666769176721573,
-0.06030147150158882,
0.06702039390802383,
0.022311044856905937,
0.0996662825345993,
-0.1644497513771057,
-0.0430648997426033,
0.17140065133571625,
-0.05259291082620621,
0.010525505989789963,
0.07647587358951569,
-0.03793296217918396,
0.10862302035093307,
0.07088525593280792,
0.1716184914112091,
0.051845040172338486,
-0.08272313326597214,
-0.008356442674994469,
-0.002227439545094967,
0.003989893943071365,
0.0005806786939501762,
0.08982754498720169,
-0.07724685966968536,
0.050971869379282,
0.03142927587032318,
-0.04874064773321152,
-0.023799896240234375,
-0.009788303636014462,
-0.05355650186538696,
0.07010098546743393,
0.00005389656871557236,
-0.04685777425765991,
0.018759768456220627,
0.024621788412332535,
-0.023405972868204117,
-0.009955979883670807,
0.027786748483777046,
0.03419490158557892,
-0.024091601371765137,
0.010042810812592506,
-0.033957529813051224,
0.031784966588020325,
-0.03514649346470833,
0.031350985169410706,
-0.14209116995334625,
-0.031762249767780304,
0.030705254524946213,
-0.01817850023508072,
0.07641220837831497,
0.041760802268981934,
0.027292955666780472,
0.04703422635793686,
-0.033134959638118744,
0.01486918143928051,
0.006546716205775738,
-0.05658011883497238,
-0.029374169185757637,
-0.12050335109233856,
0.030690615996718407,
-0.04062112420797348,
0.04488440603017807,
-0.08043621480464935,
-0.0013001412153244019,
0.0005643516778945923,
0.025023845955729485,
0.015182843431830406,
-0.020741501823067665,
-0.006032996810972691,
0.06475887447595596,
-0.014604983851313591,
-0.03690502047538757,
0.04803775995969772,
0.008603292517364025,
-0.01035038847476244,
0.05625168979167938,
-0.09785233438014984,
0.030829308554530144,
0.0901724249124527,
0.06977412104606628,
-0.038595303893089294,
-0.054774489253759384,
-0.018973026424646378,
0.008994907140731812,
-0.04361165314912796,
-0.05553552508354187,
0.21272802352905273,
-0.018947003409266472,
0.09363159537315369,
-0.12772612273693085,
-0.020393263548612595,
0.026774020865559578,
-0.049263373017311096,
-0.0720726028084755,
0.12578991055488586,
0.0032856501638889313,
-0.11075586080551147,
0.09627067297697067,
0.012128106318414211,
0.0032243337482213974,
0.20248953998088837,
-0.0016136635094881058,
-0.09531143307685852,
-0.03510425612330437,
0.04629603773355484,
0.057730577886104584,
0.09491637349128723,
-0.07788785547018051,
-0.0007280338322743773,
0.043511006981134415,
0.06012870371341705,
0.0696244016289711,
-0.09245781600475311,
-0.012058739550411701,
-0.014626340940594673,
-0.07478385418653488,
-0.03810901939868927,
0.011271409690380096,
-0.011024676263332367,
0.11795076727867126,
0.04275505989789963,
0.013622167520225048,
-0.011792315170168877,
-0.04087863489985466,
-0.04848848283290863,
0.16333696246147156,
-0.16747254133224487,
-0.261666476726532,
-0.1854567527770996,
0.045172907412052155,
-0.014353211037814617,
0.07441184669733047,
0.02160346508026123,
-0.09427206218242645,
-0.04122449457645416,
-0.023468025028705597,
0.056417807936668396,
0.005156717263162136,
-0.06079847738146782,
-0.06602183729410172,
0.08973756432533264,
-0.020501041784882545,
-0.1380527764558792,
-0.001004043035209179,
-0.011071942746639252,
-0.08674491941928864,
0.01973971165716648,
-0.01628928631544113,
0.11947764456272125,
0.047779254615306854,
0.03750684857368469,
-0.025976058095693588,
-0.021409429609775543,
0.11799049377441406,
-0.10529468953609467,
-0.015464678406715393,
0.09595130383968353,
0.03974670171737671,
-0.00499989278614521,
0.10050664842128754,
-0.005613755434751511,
-0.09445546567440033,
0.00618892814964056,
0.023841099813580513,
-0.07207081466913223,
-0.26213496923446655,
-0.08213144540786743,
-0.042562082409858704,
-0.02340688370168209,
0.027951005846261978,
0.050488755106925964,
-0.008525246754288673,
-0.0030753999017179012,
-0.00803797971457243,
0.050689976662397385,
0.049266964197158813,
0.03323884308338165,
0.09299220144748688,
-0.015067014843225479,
0.04496731609106064,
-0.06039903312921524,
-0.05066347122192383,
0.11885342001914978,
0.09847544133663177,
0.2474932074546814,
-0.02054557576775551,
0.11834865808486938,
0.06207379698753357,
0.04258958622813225,
-0.03234820067882538,
0.043389029800891876,
-0.0010349545627832413,
0.03623943775892258,
-0.08728675544261932,
-0.05022968351840973,
-0.10192998498678207,
0.02316039614379406,
0.039408110082149506,
-0.05469454079866409,
-0.06079574674367905,
-0.08501088619232178,
0.06673993170261383,
0.10919547080993652,
-0.007021491415798664,
-0.1657048463821411,
-0.049782443791627884,
0.058736652135849,
-0.03378739207983017,
-0.09001737087965012,
0.00859500840306282,
0.13623762130737305,
-0.11410880088806152,
-0.004163792356848717,
0.0026872102171182632,
0.062246162444353104,
-0.11412402242422104,
0.01490566972643137,
-0.040399178862571716,
0.009585507214069366,
0.016233166679739952,
0.07158182561397552,
-0.2094515562057495,
0.2003016322851181,
0.03233713656663895,
0.06211431324481964,
-0.06060301139950752,
0.02069375291466713,
-0.03585588559508324,
-0.09257647395133972,
0.15148857235908508,
0.024974770843982697,
-0.05468570441007614,
-0.07940833270549774,
-0.0528072863817215,
0.042129188776016235,
0.08246523886919022,
-0.06079471856355667,
0.0579526424407959,
0.004435725510120392,
0.012480469420552254,
-0.04086650535464287,
-0.028702136129140854,
-0.028554001823067665,
-0.16360706090927124,
0.008244997821748257,
-0.0914730429649353,
-0.0076161716133356094,
-0.0466490238904953,
-0.0060859909281134605,
0.0006106180371716619,
0.10731986910104752,
-0.11713169515132904,
-0.04832661896944046,
-0.08680593222379684,
0.012694965116679668,
0.11969805508852005,
-0.04508105665445328,
0.006066519767045975,
0.003804920706897974,
-0.008133893832564354,
-0.012815234251320362,
-0.057654038071632385,
0.07215388119220734,
-0.05442094802856445,
-0.06702684611082077,
-0.09008932113647461,
0.08412989228963852,
0.0647304356098175,
0.057574816048145294,
0.012177033349871635,
0.010752979665994644,
0.05631042271852493,
-0.06468844413757324,
0.004240900278091431,
0.05229247361421585,
0.1237269714474678,
0.0721907690167427,
-0.1445726752281189,
-0.10120880603790283,
-0.04110855609178543,
-0.05863019824028015,
0.1117207407951355,
0.19231532514095306,
-0.013906765729188919,
0.13132542371749878,
0.21029429137706757,
-0.15534889698028564,
-0.22161127626895905,
-0.01674758642911911,
0.04990608990192413,
0.04419228062033653,
-0.03288068622350693,
-0.22743353247642517,
-0.04071403294801712,
0.17166444659233093,
0.0020560184493660927,
0.03427716717123985,
-0.3466362953186035,
-0.07457751035690308,
0.03909831494092941,
0.04746349900960922,
0.12444072961807251,
-0.14705148339271545,
-0.0824226438999176,
-0.05577389895915985,
-0.015711650252342224,
0.1426285207271576,
-0.0672619566321373,
0.09827253222465515,
0.02249828912317753,
-0.040320828557014465,
0.02419227920472622,
-0.014918077737092972,
0.18031072616577148,
0.021340828388929367,
0.07431298494338989,
-0.02615438960492611,
0.018631823360919952,
0.16271816194057465,
0.009253816679120064,
0.05945191532373428,
0.059792280197143555,
-0.010437434539198875,
-0.13081860542297363,
-0.03721160441637039,
-0.08342248201370239,
0.014176197350025177,
-0.05044814571738243,
-0.055304817855358124,
-0.02348054200410843,
0.08986783027648926,
0.04855787754058838,
-0.019511528313159943,
0.04132980853319168,
-0.0494854673743248,
0.04770532622933388,
0.11673557758331299,
0.09890875220298767,
0.06227055937051773,
0.02034122869372368,
0.02309396304190159,
-0.001013085711747408,
0.05288923159241676,
-0.09026365727186203,
0.01493037212640047,
0.09701667726039886,
-0.038522131741046906,
0.10246887803077698,
0.004787735641002655,
-0.09503655135631561,
0.03008751943707466,
0.13473057746887207,
-0.05523506924510002,
-0.04824633151292801,
0.03797711804509163,
-0.06500355899333954,
-0.02083045057952404,
-0.030812187120318413,
0.12665444612503052,
0.041284073144197464,
-0.03295765817165375,
-0.002218110952526331,
0.03795948624610901,
-0.015325285494327545,
0.1555313766002655,
0.013768868520855904,
0.0034260163083672523,
-0.08031893521547318,
0.15071561932563782,
0.09691131860017776,
-0.15963712334632874,
0.016905011609196663,
0.09397934377193451,
-0.061238616704940796,
-0.04541895538568497,
-0.04722648859024048,
0.05015707015991211,
-0.09041537344455719,
-0.04409266263246536,
-0.06421126425266266,
-0.06380613148212433,
0.025760075077414513,
0.042770057916641235,
-0.000617719255387783,
0.028900055214762688,
-0.015576975420117378,
0.03906160593032837,
-0.028660999611020088,
0.03362146392464638,
0.005972397513687611,
-0.011012408882379532,
-0.000038448721170425415,
0.0923798680305481,
-0.011955938301980495,
-0.03518882393836975,
-0.011296271346509457,
-0.021485310047864914,
-0.141578808426857,
0.018436044454574585,
-0.08239349722862244,
0.004133937414735556,
-0.0510256290435791,
-0.025756757706403732,
-0.022245928645133972,
0.025214504450559616,
-0.01352063287049532,
-0.042872197926044464,
-0.047056037932634354,
-0.02130250632762909,
-0.045037344098091125,
0.08980990946292877,
-0.07376881688833237,
-0.04257379099726677,
0.011777554638683796,
-0.048201099038124084,
0.06702334433794022,
0.034296344965696335,
0.01876484602689743,
0.07330326735973358,
-0.09628649055957794,
0.0484817773103714,
0.06609742343425751,
0.039270542562007904,
0.04239174723625183,
-0.06609369814395905,
0.03559819981455803,
0.032801177352666855,
-0.009275153279304504,
0.04094694182276726,
0.04146675765514374,
-0.09962628781795502,
0.053741779178380966,
-0.0163201242685318,
-0.0869988277554512,
-0.03943478316068649,
0.05121096223592758,
0.12138348072767258,
0.04119713604450226,
0.07505278289318085,
-0.08263996243476868,
-0.008868777193129063,
-0.08114951848983765,
-0.011367443017661572,
-0.011886137537658215,
-0.036113739013671875,
-0.03955955058336258,
-0.016077224165201187,
0.03653879463672638,
0.018823491409420967,
0.17081895470619202,
0.05351032316684723,
0.034046418964862823,
0.007168818265199661,
-0.027012761682271957,
-0.034240443259477615,
0.024621814489364624,
0.053883928805589676,
0.09545458853244781,
-0.01786072924733162,
-0.05698124319314957,
0.0427861362695694,
-0.005622409284114838,
0.042801935225725174,
0.0633426383137703,
0.14151667058467865,
0.22763489186763763,
0.07211180031299591,
0.04708893597126007,
-0.10950452089309692,
-0.033454444259405136,
0.1482388973236084,
0.02349160611629486,
0.04049959033727646,
-0.022615645080804825,
-0.0018375497311353683,
0.12536004185676575,
-0.20315587520599365,
0.10189694166183472,
-0.011499729007482529,
-0.051332347095012665,
-0.07368836551904678,
-0.148550882935524,
-0.0058008963242173195,
0.004381575621664524,
-0.011037090793251991,
-0.11474062502384186,
0.04790320619940758,
0.009521622210741043,
0.0059024435468018055,
-0.02382596582174301,
0.025399506092071533,
-0.0859965831041336,
-0.12157312780618668,
0.03730600327253342,
0.036141570657491684,
0.0926312506198883,
-0.01832101307809353,
0.0052223047241568565,
0.002205078024417162,
0.059119801968336105,
0.03693942353129387,
0.07277413457632065,
0.06609132885932922,
0.02559821307659149,
-0.07479844987392426,
-0.01953350193798542,
-0.011610827408730984,
0.03671986982226372,
-0.009365331381559372,
0.19805467128753662,
0.06961499154567719,
-0.07159218192100525,
0.04373934119939804,
0.20889624953269958,
-0.013130922801792622,
-0.023261893540620804,
-0.11565318703651428,
0.13729622960090637,
0.06489350646734238,
0.08782468736171722,
0.021886814385652542,
-0.08121499419212341,
-0.03818618506193161,
0.1512400209903717,
0.2524843215942383,
-0.05534584820270538,
-0.03933551907539368,
0.05561130866408348,
-0.0002042276319116354,
0.029285211116075516,
0.12312842905521393,
0.059221431612968445,
0.32764026522636414,
-0.04813714325428009,
-0.0006917621940374374,
-0.017968222498893738,
0.04686196148395538,
-0.0541081577539444,
0.1354476809501648,
-0.05101773142814636,
-0.018338754773139954,
0.0006351538468152285,
0.06911323219537735,
-0.10703279078006744,
-0.28121218085289,
0.06332255899906158,
-0.09502986073493958,
-0.11657111346721649,
-0.012870420701801777,
-0.04663832113146782,
0.022520966827869415,
0.07672088593244553,
0.03263486921787262,
-0.00957595556974411,
0.07058581709861755,
0.014744389802217484,
-0.07633516192436218,
-0.10720372945070267,
0.05942528322339058,
0.049476347863674164,
0.17213745415210724,
-0.02017437480390072,
0.049972496926784515,
0.06831157207489014,
0.031349413096904755,
-0.0808272585272789,
0.057421013712882996,
0.0339539460837841,
0.01284441165626049,
0.04159620404243469,
0.01605382189154625,
-0.02039940282702446,
0.1007423922419548,
0.07326966524124146,
-0.00646795891225338,
0.06956237554550171,
0.03189532831311226,
-0.012863242998719215,
-0.07327485084533691,
0.05933648720383644,
-0.113576740026474,
0.08254970610141754,
0.1703607589006424,
0.03634748235344887,
0.036590028554201126,
-0.0535883754491806,
0.03372178599238396,
0.028649527579545975,
0.07953020930290222,
-0.02124563790857792,
-0.13950413465499878,
-0.006523691117763519,
-0.050550997257232666,
0.014591129496693611,
-0.2309463918209076,
-0.024197900667786598,
0.020324349403381348,
-0.02919602021574974,
-0.051012519747018814,
0.07563116401433945,
-0.02192843146622181,
0.023073265329003334,
-0.022170670330524445,
-0.0513291209936142,
-0.0015212483704090118,
0.06385185569524765,
-0.07212284207344055,
-0.05141439661383629
] |
null | null | transformers |
# Homer DialoGPT Model | {"tags": ["conversational"]} | text-generation | Plencers/DialoGPT-small-homer | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
#Homer DialoGPT Model | [] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
51
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null | transformers |
## Model description
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - FR dataset.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative mapping to `TrainingArguments` follows the list):
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 4.0
- mixed_precision_training: Native AMP
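These settings correspond to a standard Hugging Face `Trainer` run (the card carries the `generated_from_trainer` tag). The exact training script is not included here, so the snippet below is only a sketch of how the listed values might map onto `transformers.TrainingArguments`; the output directory and the `fp16` flag standing in for "Native AMP" are assumptions.
```python
from transformers import TrainingArguments

# Hypothetical mapping of the hyperparameters listed above onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="./xls-r-1b-cv8-fr",       # assumed output path, not stated in the card
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,        # 16 * 8 = 128 effective train batch size
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=4.0,
    fp16=True,                            # "Native AMP" mixed-precision training
)
```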
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.9827 | 0.29 | 1000 | inf | 0.2937 |
| 1.0203 | 0.57 | 2000 | inf | 0.2711 |
| 1.0048 | 0.86 | 3000 | inf | 0.2620 |
| 0.9858 | 1.15 | 4000 | inf | 0.2522 |
| 0.9709 | 1.43 | 5000 | inf | 0.2365 |
| 0.9347 | 1.72 | 6000 | inf | 0.2332 |
| 0.9256 | 2.01 | 7000 | inf | 0.2261 |
| 0.8936 | 2.29 | 8000 | inf | 0.2203 |
| 0.877 | 2.58 | 9000 | inf | 0.2096 |
| 0.8393 | 2.87 | 10000 | inf | 0.2017 |
| 0.8156 | 3.15 | 11000 | inf | 0.1936 |
| 0.8015 | 3.44 | 12000 | inf | 0.1880 |
| 0.774 | 3.73 | 13000 | inf | 0.1834 |
It achieves its best result on the validation set at step 13,000:
- Wer: 0.1834
A problem occurred when computing the validation loss, which is why it is reported as `inf` in the table above.
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3.dev0
- Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id Plim/xls-r-1b-cv_8-fr --dataset mozilla-foundation/common_voice_8_0 --config fr --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id Plim/xls-r-1b-cv_8-fr --dataset speech-recognition-community-v2/dev_data --config fr --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
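The commands above rely on the `eval.py` script. For a quick sanity check without that script, transcription can also be run directly with the `transformers` ASR pipeline; the snippet below is only a minimal sketch, and `audio.wav` is a placeholder for any local French recording.
```python
from transformers import pipeline

# Load the fine-tuned checkpoint referenced in the evaluation commands above.
asr = pipeline("automatic-speech-recognition", model="Plim/xls-r-1b-cv_8-fr")

# "audio.wav" is a placeholder path to a local recording (decoding requires ffmpeg).
result = asr("audio.wav")
print(result["text"])
```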
| {"language": ["fr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer"], "model-index": [{"name": "XLS-R-1B - French", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "fr"}, "metrics": [{"type": "wer", "value": 18.33, "name": "Test WER"}, {"type": "cer", "value": 5.6, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "fr"}, "metrics": [{"type": "wer", "value": 60.25, "name": "Test WER"}, {"type": "cer", "value": 15.68, "name": "Test CER"}]}]}]} | automatic-speech-recognition | Plim/test_lm | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"fr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fr #license-apache-2.0 #model-index #endpoints_compatible #region-us
| Model description
-----------------
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - FR dataset.
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 4.0
* mixed\_precision\_training: Native AMP
### Training results
It achieves the best result on the validation set on STEP 13000:
* Wer: 0.1834
Some problem occurs when calculating the validation loss.
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.3.dev0
* Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 4.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nIt achieves the best result on the validation set on STEP 13000:\n\n\n* Wer: 0.1834\n\n\nSome problem occurs when calculating the validation loss.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 4.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nIt achieves the best result on the validation set on STEP 13000:\n\n\n* Wer: 0.1834\n\n\nSome problem occurs when calculating the validation loss.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
74,
159,
37,
39,
57
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 4.0\n* mixed\\_precision\\_training: Native AMP### Training results\n\n\n\nIt achieves the best result on the validation set on STEP 13000:\n\n\n* Wer: 0.1834\n\n\nSome problem occurs when calculating the validation loss.### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
-0.10730715095996857,
0.10671558231115341,
-0.005006418097764254,
0.029479580000042915,
0.11100972443819046,
0.031739987432956696,
0.07926255464553833,
0.15007993578910828,
-0.07044652104377747,
0.12420922517776489,
0.0657072588801384,
0.06102626025676727,
0.09601651132106781,
0.1497226059436798,
-0.0423869863152504,
-0.2024485170841217,
0.05659031495451927,
-0.05851815268397331,
-0.08695393800735474,
0.12018025666475296,
0.11145050823688507,
-0.08538442850112915,
0.023444674909114838,
0.02036667987704277,
-0.09904226660728455,
-0.01235053688287735,
-0.019017595797777176,
-0.027420610189437866,
0.07322677224874496,
0.06714267283678055,
0.05172105133533478,
0.04681481048464775,
0.02117334119975567,
-0.28641101717948914,
0.0029880860820412636,
0.09033006429672241,
0.033103954046964645,
0.057609543204307556,
0.11486516147851944,
-0.022804977372288704,
0.07165981084108353,
-0.08106295019388199,
0.017236316576600075,
0.08284816890954971,
-0.09783218055963516,
-0.25167661905288696,
-0.1544361412525177,
0.06943555921316147,
0.14096316695213318,
0.061279624700546265,
-0.048431843519210815,
0.08104808628559113,
-0.07252660393714905,
0.07181541621685028,
0.19894471764564514,
-0.2695615887641907,
-0.07531990110874176,
0.005962343420833349,
0.0604047067463398,
-0.01370774395763874,
-0.09577181190252304,
0.008162050507962704,
0.041979532688856125,
0.007597850635647774,
0.04135628417134285,
0.00023758703900966793,
0.04178089648485184,
-0.00831899605691433,
-0.1346295028924942,
-0.08981455117464066,
0.15512694418430328,
0.08859200775623322,
-0.06717916578054428,
-0.1020887941122055,
-0.017724426463246346,
-0.16471605002880096,
-0.011994250118732452,
-0.018695294857025146,
0.00869639590382576,
-0.007200116291642189,
-0.05441904813051224,
0.006319889333099127,
-0.06191621720790863,
-0.06239617243409157,
0.035533223301172256,
0.143371120095253,
0.036774277687072754,
-0.03554648160934448,
0.04985309764742851,
0.11067372560501099,
0.01300272811204195,
-0.14722894132137299,
-0.05716220289468765,
0.0011726696975529194,
-0.14228790998458862,
-0.036675773561000824,
-0.038729436695575714,
0.0415169931948185,
0.07222098112106323,
0.20117557048797607,
-0.024415219202637672,
0.08175025135278702,
0.019513504579663277,
0.014187156222760677,
-0.06226469948887825,
0.15103085339069366,
-0.07356490939855576,
-0.09210526943206787,
-0.02829359471797943,
0.11485909670591354,
-0.004788781050592661,
-0.005229281727224588,
-0.01116914115846157,
0.02083628997206688,
0.13583070039749146,
0.09384220838546753,
0.016491679474711418,
0.01573163829743862,
-0.09621847420930862,
-0.002822941169142723,
-0.004541787318885326,
-0.13858653604984283,
0.04920753836631775,
0.05128825083374977,
-0.055482205003499985,
-0.0032466724514961243,
-0.0002769769635051489,
0.0030410257168114185,
-0.049582093954086304,
0.08473919332027435,
-0.05149957910180092,
-0.019678248092532158,
-0.07912775874137878,
-0.09459806978702545,
0.0489228256046772,
-0.03891216963529587,
-0.040204714983701706,
-0.06299910694360733,
-0.08327431231737137,
-0.0914849117398262,
0.0451112799346447,
-0.06898562610149384,
-0.048353515565395355,
-0.039898283779621124,
-0.10618657618761063,
0.04624953120946884,
-0.020383235067129135,
0.15424245595932007,
-0.04128048196434975,
0.07781823724508286,
0.04870821163058281,
0.029510732740163803,
0.099408358335495,
0.042912907898426056,
-0.008770505897700787,
0.08731839805841446,
-0.11635241657495499,
0.11648520082235336,
-0.1168595477938652,
0.029129719361662865,
-0.1714402735233307,
-0.07847154140472412,
0.008630295284092426,
-0.01592058688402176,
0.12421809136867523,
0.1571454405784607,
-0.15458828210830688,
-0.057863183319568634,
0.1283990889787674,
-0.07881961017847061,
-0.11008279770612717,
0.137282133102417,
0.0026070501189678907,
-0.006424557417631149,
0.016974061727523804,
0.09774445742368698,
0.14816267788410187,
-0.08537814021110535,
-0.043082669377326965,
-0.07217943668365479,
0.07549338042736053,
0.10865180939435959,
0.09485349804162979,
-0.06434616446495056,
0.020056361332535744,
0.002992545487359166,
-0.07114523649215698,
0.016634203493595123,
-0.08871383965015411,
-0.06960957497358322,
-0.035609181970357895,
-0.04520463943481445,
0.015423928387463093,
0.020610112696886063,
-0.007405593991279602,
-0.08389024436473846,
-0.14716912806034088,
-0.006546611897647381,
0.1161322295665741,
-0.07654070854187012,
0.016224106773734093,
-0.11431356519460678,
0.10958481580018997,
-0.036294374614953995,
0.023423977196216583,
-0.17290276288986206,
-0.044853325933218,
0.04391586408019066,
-0.06085444241762161,
-0.033089570701122284,
-0.04853147268295288,
0.04998593404889107,
0.010408950038254261,
-0.024678664281964302,
-0.05191627889871597,
-0.058825645595788956,
-0.01155154500156641,
-0.03767196834087372,
-0.21141476929187775,
-0.10460434854030609,
-0.019342757761478424,
0.2159617841243744,
-0.19734250009059906,
0.008273291401565075,
0.1281757354736328,
0.1644516885280609,
-0.0018313018372282386,
-0.0399618037045002,
-0.0001688105840003118,
0.054528556764125824,
-0.03699888288974762,
-0.07000415027141571,
0.023703299462795258,
0.01134233083575964,
-0.11145956814289093,
0.020134225487709045,
-0.12235425412654877,
0.00825399812310934,
0.08807221055030823,
0.04871004447340965,
-0.08518610894680023,
-0.05062132328748703,
-0.06076391786336899,
-0.052031099796295166,
-0.03772663697600365,
-0.0528724379837513,
0.10009162873029709,
0.04879233241081238,
0.09555496275424957,
-0.07117031514644623,
-0.05275201052427292,
0.036890532821416855,
0.01925167441368103,
-0.028413813561201096,
0.1320313960313797,
0.023367390036582947,
-0.08566396683454514,
0.08972444385290146,
0.062747061252594,
-0.038055844604969025,
0.08471492677927017,
-0.07414355874061584,
-0.08585154265165329,
-0.06037251278758049,
0.07802464812994003,
0.05258791893720627,
0.08040211349725723,
-0.13506028056144714,
0.0031958867330104113,
0.03592928871512413,
0.024863118305802345,
0.02661437913775444,
-0.17782755196094513,
0.03623773902654648,
0.03223260119557381,
-0.08009202033281326,
0.007855555042624474,
0.01322259847074747,
0.003731371369212866,
0.0674302726984024,
0.027255743741989136,
-0.027753960341215134,
-0.0010196355869993567,
-0.0446818508207798,
-0.10975541919469833,
0.1699373573064804,
-0.10344088822603226,
-0.15033148229122162,
-0.15594077110290527,
0.004133146721869707,
-0.043937038630247116,
-0.027926545590162277,
0.05584210529923439,
-0.06502137333154678,
-0.06845568865537643,
-0.05979619175195694,
-0.0046054404228925705,
0.006184788886457682,
0.0013153296895325184,
0.04717302322387695,
0.0009805540321394801,
0.08129516243934631,
-0.10300081968307495,
0.014181183651089668,
0.02208714373409748,
-0.03387334942817688,
-0.000025861594622256234,
0.005326564889401197,
0.10168646275997162,
0.13530591130256653,
0.03331487253308296,
0.03596815839409828,
-0.027512358501553535,
0.19675478339195251,
-0.13518238067626953,
-0.007077193818986416,
0.13709212839603424,
0.0007567985448986292,
0.0580458901822567,
0.13386845588684082,
0.00585425179451704,
-0.08713578432798386,
0.019506504759192467,
0.04424626752734184,
-0.00385284167714417,
-0.2592013478279114,
-0.01660955138504505,
-0.08364465832710266,
-0.025973964482545853,
0.06891755759716034,
0.03511975705623627,
-0.007878593169152737,
0.027022453024983406,
-0.05607696995139122,
-0.02843891829252243,
0.0534629300236702,
0.075726218521595,
0.1015339121222496,
0.048892512917518616,
0.10373183339834213,
-0.030734023079276085,
0.017495350912213326,
0.04644604027271271,
0.015075809322297573,
0.24037039279937744,
0.00158736202865839,
0.18660804629325867,
0.090140700340271,
0.14387314021587372,
-0.012037989683449268,
0.04672249034047127,
0.007116161752492189,
0.02117326483130455,
0.03738292306661606,
-0.06694988161325455,
-0.051863521337509155,
0.03030962496995926,
0.09550107270479202,
0.03187026083469391,
-0.11184210330247879,
0.013500696048140526,
0.0701279267668724,
0.35488566756248474,
0.08948265016078949,
-0.2591702938079834,
-0.0844559594988823,
0.019806886091828346,
-0.05326918140053749,
-0.04729069769382477,
-0.0277792289853096,
0.12153194844722748,
-0.10870584100484848,
0.08221074193716049,
-0.05950123071670532,
0.09309019148349762,
-0.03650025278329849,
-0.00567674171179533,
0.08999021351337433,
0.10001654922962189,
0.022596482187509537,
0.05595109984278679,
-0.23026731610298157,
0.2244771122932434,
-0.003220459446310997,
0.08964298665523529,
-0.03227855637669563,
0.07273126393556595,
0.030706437304615974,
-0.008948184549808502,
0.07828164100646973,
-0.00991223007440567,
-0.044167179614305496,
-0.11354728043079376,
-0.08746635168790817,
0.01455681212246418,
0.13686718046665192,
-0.07146965712308884,
0.13298970460891724,
-0.05310571938753128,
-0.030262533575296402,
0.015165947377681732,
0.01659291237592697,
-0.13978038728237152,
-0.11078507453203201,
0.07206729054450989,
-0.015605193562805653,
0.1052609458565712,
-0.09603069722652435,
-0.08667788654565811,
-0.07944013178348541,
0.24218325316905975,
-0.13044650852680206,
-0.03307679668068886,
-0.13474026322364807,
0.05389590933918953,
0.17681065201759338,
-0.058211758732795715,
0.04309415444731712,
-0.007156406529247761,
0.1438039243221283,
0.025574704632163048,
0.0029822923243045807,
0.08695542812347412,
-0.06824767589569092,
-0.24342641234397888,
-0.029712243005633354,
0.1798829287290573,
0.001269466825760901,
0.03436122462153435,
0.015066301450133324,
0.01120015513151884,
0.006882163230329752,
-0.09684024006128311,
0.045199982821941376,
0.05497152730822563,
0.014860301278531551,
0.06885778903961182,
-0.016103891655802727,
-0.03768259659409523,
-0.10672472417354584,
-0.06558003276586533,
0.08670072257518768,
0.2980409562587738,
-0.07798312604427338,
0.0077904388308525085,
0.026411451399326324,
-0.058959223330020905,
-0.13893118500709534,
-0.006679026409983635,
0.11839664727449417,
0.04382737725973129,
-0.026298679411411285,
-0.1278972029685974,
-0.01579402945935726,
0.10857462882995605,
-0.022043820470571518,
0.0884719267487526,
-0.3101474344730377,
-0.13260096311569214,
0.026088278740644455,
0.0529407262802124,
-0.03352796286344528,
-0.1856827437877655,
-0.09038867056369781,
-0.034839849919080734,
-0.16533507406711578,
0.04926823452115059,
-0.0357876792550087,
0.11263730376958847,
0.015497101470828056,
-0.02528434805572033,
0.014317798428237438,
-0.04539932683110237,
0.19347581267356873,
0.01048369612544775,
0.03504375368356705,
-0.0029395746532827616,
0.05667319893836975,
0.043678320944309235,
-0.07933133840560913,
0.007272794377058744,
-0.054459188133478165,
0.03465434908866882,
-0.14055012166500092,
-0.014953038655221462,
-0.06666017323732376,
0.030336888507008553,
-0.07462339103221893,
-0.0014583389274775982,
-0.027570446953177452,
0.06208036094903946,
0.07048428803682327,
0.009828752838075161,
0.059914980083703995,
-0.055661194026470184,
0.13688358664512634,
0.1765749305486679,
0.08999276906251907,
0.0101940231397748,
-0.09984903782606125,
0.025678958743810654,
-0.009649301879107952,
0.029131416231393814,
-0.10976838320493698,
0.0645051896572113,
0.13757018744945526,
0.04082343354821205,
0.1637692153453827,
0.03971336781978607,
-0.12096986174583435,
0.019315160810947418,
0.07632589340209961,
-0.07398509979248047,
-0.15438465774059296,
-0.007435058243572712,
0.009533131495118141,
-0.12799768149852753,
-0.000800659297965467,
0.1364358365535736,
-0.010553796775639057,
-0.02147674560546875,
0.0197602529078722,
0.053858764469623566,
-0.04197946563363075,
0.22042182087898254,
-0.01740262098610401,
0.11781198531389236,
-0.08162283152341843,
0.08093057572841644,
0.04686480388045311,
-0.1309029459953308,
0.013512814417481422,
0.07106094807386398,
-0.07038677483797073,
-0.028410419821739197,
-0.03256834298372269,
0.02125181443989277,
0.028296636417508125,
-0.07051743566989899,
-0.10145291686058044,
-0.15838713943958282,
0.06913943588733673,
0.09232311695814133,
0.037156589329242706,
0.07626117020845413,
0.03556463122367859,
0.0028549027629196644,
-0.09145384281873703,
0.09359804540872574,
0.12509343028068542,
0.052781254053115845,
-0.12136033177375793,
0.10667648166418076,
0.008391477167606354,
0.011238062754273415,
0.003679802408441901,
-0.008359776809811592,
-0.1124025359749794,
0.003912723157554865,
-0.15081435441970825,
0.00002047362613666337,
-0.06294126063585281,
-0.0189723651856184,
0.030258765444159508,
-0.03805661201477051,
-0.06502299755811691,
0.043235406279563904,
-0.09863977879285812,
-0.07917086035013199,
-0.026120904833078384,
0.06794396042823792,
-0.10928857326507568,
0.0001756573183229193,
0.05051051452755928,
-0.162632554769516,
0.11088420450687408,
0.030535420402884483,
0.03537604957818985,
-0.0008934213547036052,
-0.07477809488773346,
-0.010840712115168571,
0.016381923109292984,
-0.004145415034145117,
0.02877851016819477,
-0.2403840571641922,
0.026665305718779564,
-0.023837177082896233,
-0.0036914334632456303,
0.0007470856071449816,
0.04602975398302078,
-0.13141754269599915,
-0.03604878485202789,
-0.023694656789302826,
-0.018164874985814095,
-0.04166151583194733,
0.04590921103954315,
0.08873774856328964,
0.04962147772312164,
0.1731560081243515,
-0.06321123242378235,
0.044463902711868286,
-0.19994018971920013,
-0.001079283538274467,
-0.03554001450538635,
-0.04180346429347992,
-0.02933238260447979,
-0.010498649440705776,
0.08809319883584976,
-0.06053986772894859,
0.058434780687093735,
-0.08304500579833984,
0.09031879901885986,
0.041256148368120193,
-0.134811669588089,
-0.026651954278349876,
0.06179403141140938,
0.12872599065303802,
0.05257253721356392,
0.0037258181255310774,
0.040324777364730835,
-0.03224857151508331,
0.028003349900245667,
0.018305234611034393,
0.14839647710323334,
0.1533840298652649,
0.04084981977939606,
0.10352068394422531,
0.07269829511642456,
-0.12315891683101654,
-0.09153936058282852,
0.12000808864831924,
-0.07042381912469864,
0.13999073207378387,
-0.04960335046052933,
0.11366326361894608,
0.12159813940525055,
-0.2023274302482605,
0.07121659070253372,
-0.06593506038188934,
-0.06957202404737473,
-0.09010358899831772,
-0.04009796306490898,
-0.0780642032623291,
-0.16286508738994598,
0.016279274597764015,
-0.10255084931850433,
0.05267085134983063,
0.052581217139959335,
0.043796319514513016,
0.012546209618449211,
0.054764822125434875,
-0.018160834908485413,
-0.030356191098690033,
0.11679337173700333,
-0.00869281031191349,
-0.004828022792935371,
0.0012336891377344728,
-0.09347328543663025,
0.05061401054263115,
-0.008959629572927952,
0.0987311527132988,
0.017930638045072556,
-0.030577868223190308,
0.05045046657323837,
0.0016302940202876925,
-0.10189574211835861,
0.021417688578367233,
-0.0036579081788659096,
0.04359372705221176,
0.10576777160167694,
0.03789646178483963,
0.002173075685277581,
-0.008769150823354721,
0.19737547636032104,
-0.079258993268013,
-0.06244440749287605,
-0.18841031193733215,
0.22453944385051727,
0.013029897585511208,
0.007499015890061855,
0.02693428285419941,
-0.11015184968709946,
-0.007398713380098343,
0.14508023858070374,
0.08595892786979675,
0.002064899541437626,
-0.020075542852282524,
0.028746357187628746,
-0.012207679450511932,
-0.02566160447895527,
0.0614502988755703,
0.10244549065828323,
0.03850893676280975,
-0.03783957660198212,
0.020198538899421692,
-0.0246041938662529,
-0.05818590894341469,
-0.006900193635374308,
0.09121006727218628,
-0.01718074083328247,
-0.00347824115306139,
-0.014841770753264427,
0.11349492520093918,
-0.03593484312295914,
-0.1700250208377838,
0.07834965735673904,
-0.1548892706632614,
-0.18629387021064758,
-0.032376501709222794,
0.03825726732611656,
0.008969111368060112,
0.08018036186695099,
0.0018131359247490764,
-0.048109591007232666,
0.14885830879211426,
0.0036384903360158205,
-0.006795804016292095,
-0.10967683047056198,
0.07783276587724686,
-0.07471439242362976,
0.16291436553001404,
-0.023934099823236465,
0.04009609669446945,
0.14864222705364227,
0.028395207598805428,
-0.13772155344486237,
0.026190388947725296,
0.09034986048936844,
-0.12180304527282715,
0.06741388887166977,
0.1623271405696869,
-0.01799672096967697,
0.1095055416226387,
0.044442251324653625,
-0.0844125747680664,
0.0022767409682273865,
-0.06735745817422867,
0.0035258287098258734,
-0.09368233382701874,
0.009059199132025242,
-0.0675119087100029,
0.12754526734352112,
0.21067936718463898,
-0.0803888663649559,
0.013255592435598373,
-0.054051101207733154,
0.03649736940860748,
0.006833914667367935,
0.15498439967632294,
-0.02813628502190113,
-0.261188805103302,
0.05966871604323387,
0.0134241608902812,
0.03485529497265816,
-0.21129444241523743,
-0.07743632048368454,
0.03608037903904915,
-0.041559264063835144,
-0.03958924114704132,
0.1444622427225113,
0.06312168389558792,
0.04726405814290047,
-0.04860672354698181,
-0.17732763290405273,
-0.026936424896121025,
0.1729097068309784,
-0.13564105331897736,
-0.058514952659606934
] |
null | null | transformers |
## Model description
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - FR dataset.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 6.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.9827 | 0.29 | 1000 | inf | 0.2937 |
| 1.0203 | 0.57 | 2000 | inf | 0.2711 |
| 1.0048 | 0.86 | 3000 | inf | 0.2620 |
| 0.9858 | 1.15 | 4000 | inf | 0.2522 |
| 0.9709 | 1.43 | 5000 | inf | 0.2365 |
| 0.9347 | 1.72 | 6000 | inf | 0.2332 |
| 0.9256 | 2.01 | 7000 | inf | 0.2261 |
| 0.8936 | 2.29 | 8000 | inf | 0.2203 |
| 0.877 | 2.58 | 9000 | inf | 0.2096 |
| 0.8393 | 2.87 | 10000 | inf | 0.2017 |
| 0.8156 | 3.15 | 11000 | inf | 0.1936 |
| 0.8015 | 3.44 | 12000 | inf | 0.1880 |
| 0.774 | 3.73 | 13000 | inf | 0.1834 |
| 0.8372 | 4.01 | 14000 | inf | 0.1934 |
| 0.8075 | 4.3 | 15000 | inf | 0.1923 |
| 0.8069 | 4.59 | 16000 | inf | 0.1877 |
| 0.8064 | 4.87 | 17000 | inf | 0.1955 |
| 0.801 | 5.16 | 18000 | inf | 0.1891 |
| 0.8022 | 5.45 | 19000 | inf | 0.1895 |
| 0.792 | 5.73 | 20000 | inf | 0.1854 |
It achieves its best result on the validation set at step 13,000:
- Wer: 0.1834
A problem occurred when computing the validation loss, which is why it is reported as `inf` in the table above.
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3.dev0
- Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8_0` with split `test`
```bash
python eval.py --model_id Plim/xls-r-1b-cv_8-fr --dataset mozilla-foundation/common_voice_8_0 --config fr --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id Plim/xls-r-1b-cv_8-fr --dataset speech-recognition-community-v2/dev_data --config fr --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
### Evaluation Results
Without LM:
| Dataset | WER | CER |
|:----------:|:-----:|:-----:|
| TEST CV | 18.33 | 5.60 |
| DEV audio | 31.33 | 13.20 |
| TEST audio | / | / |
With LM:
| Dataset | WER | CER |
|:----------:|:-----:|:-----:|
| TEST CV | 15.40 | 5.36 |
| DEV audio | 25.05 | 12.45 |
| TEST audio | / | / |
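The WER/CER figures in these tables come from the evaluation runs above. As an illustration only (this is not the repository's `eval.py`), error rates of this kind can be computed from reference and predicted transcripts with the `jiwer` package; the example strings below are made up.
```python
import jiwer

# Hypothetical reference transcripts and model predictions.
references = ["il fait beau aujourd'hui", "le chat dort sur le canape"]
predictions = ["il fait beau aujourd'hui", "le chat dort sur le canapet"]

wer = jiwer.wer(references, predictions)  # word error rate
cer = jiwer.cer(references, predictions)  # character error rate
print(f"WER: {wer:.2%} | CER: {cer:.2%}")
```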
| {"language": ["fr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "XLS-R-1B - French", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "fr"}, "metrics": [{"type": "wer", "value": 15.4, "name": "Test WER (with LM)"}, {"type": "cer", "value": 5.36, "name": "Test CER (with LM)"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "fr"}, "metrics": [{"type": "wer", "value": 25.05, "name": "Test WER (with LM)"}, {"type": "cer", "value": 12.45, "name": "Test CER (with LM)"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "fr"}, "metrics": [{"type": "wer", "value": 27.1, "name": "Test WER"}]}]}]} | automatic-speech-recognition | Plim/xls-r-1b-cv_8-fr | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"fr",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fr #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
| Model description
-----------------
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - FR dataset.
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 6.0
* mixed\_precision\_training: Native AMP
### Training results
It achieves the best result on the validation set on STEP 13000:
* Wer: 0.1834
Some problem occurs when calculating the validation loss.
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.3.dev0
* Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
### Evaluation Results
Without LM:
With LM:
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 6.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nIt achieves the best result on the validation set on STEP 13000:\n\n\n* Wer: 0.1834\n\n\nSome problem occurs when calculating the validation loss.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Evaluation Results\n\n\nWithout LM:\n\n\n\nWith LM:"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fr #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 6.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nIt achieves the best result on the validation set on STEP 13000:\n\n\n* Wer: 0.1834\n\n\nSome problem occurs when calculating the validation loss.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'",
"### Evaluation Results\n\n\nWithout LM:\n\n\n\nWith LM:"
] | [
111,
160,
37,
39,
57,
13
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fr #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 6.0\n* mixed\\_precision\\_training: Native AMP### Training results\n\n\n\nIt achieves the best result on the validation set on STEP 13000:\n\n\n* Wer: 0.1834\n\n\nSome problem occurs when calculating the validation loss.### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'### Evaluation Results\n\n\nWithout LM:\n\n\n\nWith LM:"
] | [
-0.0850834846496582,
0.10776949673891068,
-0.0061372932977974415,
0.05878908187150955,
0.07864748686552048,
0.030314022675156593,
0.04281068220734596,
0.1719975620508194,
-0.03304135799407959,
0.10169227421283722,
0.0627230703830719,
0.07574771344661713,
0.07984093576669693,
0.08228522539138794,
-0.014399862848222256,
-0.1991146057844162,
0.02547272853553295,
-0.06081121414899826,
-0.07310972362756729,
0.11400710046291351,
0.10328297317028046,
-0.09783047437667847,
0.027709048241376877,
0.014963630586862564,
-0.06156495586037636,
0.007963295094668865,
-0.0339842289686203,
0.011568357236683369,
0.06151803955435753,
0.0294874869287014,
0.045111630111932755,
0.01871524006128311,
0.022266149520874023,
-0.2849352955818176,
-0.007434973958879709,
0.08783819526433945,
0.054538361728191376,
0.04902774095535278,
0.1110745221376419,
-0.07280215620994568,
0.103386290371418,
-0.12376344949007034,
0.010750122368335724,
0.06339405477046967,
-0.1239892989397049,
-0.19533605873584747,
-0.12449751794338226,
0.06281747668981552,
0.12117426097393036,
0.05378199368715286,
-0.05400993674993515,
0.06624297052621841,
-0.11174552142620087,
0.07828707247972488,
0.1808655709028244,
-0.2745509445667267,
-0.04788075387477875,
0.0032534748315811157,
0.04007923603057861,
0.009706308133900166,
-0.12248722463846207,
-0.012220216915011406,
0.043119002133607864,
-0.02342710644006729,
0.0356687568128109,
0.005951129365712404,
0.04299662262201309,
0.0323978066444397,
-0.14185099303722382,
-0.0938742458820343,
0.1567668467760086,
0.0735766738653183,
-0.04828467220067978,
-0.10612542927265167,
0.009190229699015617,
-0.1832866370677948,
0.0033411451149731874,
0.006990144960582256,
0.006271545309573412,
-0.01631288416683674,
-0.0237744003534317,
0.02143673039972782,
-0.06919058412313461,
-0.08797082304954529,
0.07390318065881729,
0.12586358189582825,
0.040484119206666946,
-0.03409954532980919,
0.02756665088236332,
0.13671787083148956,
-0.006132281385362148,
-0.14359864592552185,
-0.0623389333486557,
-0.02811248227953911,
-0.1526070386171341,
-0.026324329897761345,
-0.0078178895637393,
0.05101781710982323,
0.060680244117975235,
0.1826726645231247,
0.0033432620111852884,
0.07268958538770676,
0.05586571991443634,
0.02246018685400486,
-0.03197067230939865,
0.16902749240398407,
-0.04202648252248764,
-0.10204850137233734,
-0.060416460037231445,
0.1235840767621994,
-0.016642754897475243,
0.010947342030704021,
0.0210842527449131,
0.015455212444067001,
0.10413124412298203,
0.0969378799200058,
0.002517526037991047,
0.03568917512893677,
-0.11069813370704651,
-0.007467776536941528,
-0.02567736990749836,
-0.1621495485305786,
0.04958468675613403,
0.058682579547166824,
-0.07521774619817734,
-0.0008383263484574854,
0.013007625006139278,
-0.0269465409219265,
-0.0599699392914772,
0.08595810830593109,
-0.04379771649837494,
-0.016174566000699997,
-0.09646549820899963,
-0.10026942193508148,
0.022241026163101196,
-0.02406258136034012,
-0.04216432571411133,
-0.029257511720061302,
-0.09086541086435318,
-0.08597610145807266,
0.05638823285698891,
-0.08317344635725021,
-0.010999060235917568,
-0.05367634445428848,
-0.08173287659883499,
0.027032941579818726,
-0.007518778555095196,
0.11361504346132278,
-0.05229004845023155,
0.07235196232795715,
0.06634697318077087,
0.04753461480140686,
0.1499418467283249,
0.04667220637202263,
-0.03728780895471573,
0.08964935690164566,
-0.13040949404239655,
0.12525327503681183,
-0.1298953890800476,
0.05063995346426964,
-0.1396789848804474,
-0.07463933527469635,
-0.018557528033852577,
0.0008536988752894104,
0.1129242405295372,
0.1503823697566986,
-0.15483853220939636,
-0.05294313654303551,
0.15664643049240112,
-0.06766141206026077,
-0.11245816946029663,
0.12801769375801086,
-0.01915055699646473,
0.010610116645693779,
0.03121449612081051,
0.13906951248645782,
0.16688022017478943,
-0.10519297420978546,
-0.0295888539403677,
-0.05081329494714737,
0.02958049438893795,
0.12165709584951401,
0.06193862482905388,
-0.06794209778308868,
0.03133056312799454,
0.0022320908028632402,
-0.1017286628484726,
0.004924274515360594,
-0.07407583296298981,
-0.08673429489135742,
-0.03313601016998291,
-0.04769137129187584,
0.006480405572801828,
0.03956377133727074,
-0.013742403127253056,
-0.09427626430988312,
-0.14721055328845978,
-0.023895109072327614,
0.10161056369543076,
-0.08534623682498932,
0.010589312762022018,
-0.12364917993545532,
0.09269055724143982,
-0.03943059965968132,
0.029693622142076492,
-0.15844222903251648,
-0.03639194369316101,
0.037435129284858704,
-0.0707763060927391,
-0.024072423577308655,
-0.012142025865614414,
0.07877915352582932,
0.012131689116358757,
-0.018302928656339645,
-0.05568074434995651,
-0.028047969564795494,
-0.002113571623340249,
-0.03509828820824623,
-0.2218036949634552,
-0.0961800068616867,
-0.02094181813299656,
0.17063935101032257,
-0.23154191672801971,
-0.00828489102423191,
0.09687097370624542,
0.13051122426986694,
0.0032213940285146236,
-0.046822771430015564,
0.02920188568532467,
0.03526816889643669,
-0.025709109380841255,
-0.0657176747918129,
0.02996404841542244,
-0.009853479452431202,
-0.11946005374193192,
0.02679424174129963,
-0.13091102242469788,
-0.03183978796005249,
0.06366509199142456,
0.025414027273654938,
-0.08806461840867996,
-0.030355019494891167,
-0.06295621395111084,
-0.04617113247513771,
-0.03490786999464035,
-0.0610644556581974,
0.08659398555755615,
0.06063392758369446,
0.09316734969615936,
-0.06990464776754379,
-0.07356582581996918,
0.0034257215447723866,
-0.004784959834069014,
-0.0012779843527823687,
0.19102782011032104,
0.009031803347170353,
-0.06579798460006714,
0.046510178595781326,
0.0558563694357872,
-0.04182293266057968,
0.12647169828414917,
-0.0812409296631813,
-0.09866762906312943,
-0.05347534269094467,
0.09102270752191544,
0.04596057906746864,
0.08720747381448746,
-0.17530910670757294,
-0.014909643679857254,
0.031221438199281693,
0.014523735269904137,
0.02778421901166439,
-0.19316981732845306,
0.026529761031270027,
0.05079067125916481,
-0.09043888002634048,
-0.02459467016160488,
0.010875625535845757,
0.010463048703968525,
0.06528270244598389,
-0.0010314590763300657,
-0.038138143718242645,
-0.014945645816624165,
-0.061685606837272644,
-0.09575051069259644,
0.14544691145420074,
-0.07623396813869476,
-0.14534901082515717,
-0.1502867341041565,
-0.0001801552571123466,
-0.04109366238117218,
-0.03351010009646416,
0.03086390532553196,
-0.07494674623012543,
-0.08504189550876617,
-0.0721023753285408,
-0.011924568563699722,
0.02056238055229187,
-0.02618163451552391,
0.04410141333937645,
0.0006286000134423375,
0.10522528737783432,
-0.12550686299800873,
0.0016633947379887104,
0.009277775883674622,
-0.025716109201312065,
0.02246721275150776,
0.01603805087506771,
0.11448083072900772,
0.13602420687675476,
0.03590821102261543,
0.06338484585285187,
0.004861342255026102,
0.19603237509727478,
-0.13327555358409882,
-0.006068322341889143,
0.10635726898908615,
0.005130419507622719,
0.06668861210346222,
0.11575542390346527,
0.014633935876190662,
-0.08924031257629395,
0.028475338593125343,
0.07160661369562149,
-0.012615910731256008,
-0.2547135055065155,
-0.002360581886023283,
-0.09163989871740341,
-0.02602209895849228,
0.07400600612163544,
0.046670135110616684,
-0.04226509854197502,
0.004715129267424345,
-0.009930833242833614,
-0.06976290047168732,
0.05656516179442406,
0.07148389518260956,
0.0589391365647316,
0.04431556537747383,
0.0843651220202446,
-0.06014962121844292,
-0.02745353989303112,
0.026773782446980476,
0.02181280218064785,
0.21148143708705902,
-0.023906337097287178,
0.17278288304805756,
0.1152365505695343,
0.1434803307056427,
-0.01402037963271141,
0.06120546907186508,
-0.019519668072462082,
0.015438443049788475,
0.039787210524082184,
-0.08171932399272919,
-0.04278815910220146,
0.027873745188117027,
0.0922345295548439,
0.0035270084626972675,
-0.08355573564767838,
0.03637559339404106,
0.09171102941036224,
0.3042754530906677,
0.11622054129838943,
-0.2382834106683731,
-0.05003100633621216,
0.029589051380753517,
-0.04016606882214546,
-0.04780442267656326,
-0.01806820183992386,
0.09902774542570114,
-0.09564823657274246,
0.07291319221258163,
-0.048162683844566345,
0.09425576031208038,
-0.06741852313280106,
0.020132694393396378,
0.08204898238182068,
0.1165371835231781,
0.018596509471535683,
0.07374419271945953,
-0.2069534957408905,
0.2071341872215271,
0.0028332907240837812,
0.10352563112974167,
-0.03355712443590164,
0.05601102486252785,
0.028730483725667,
-0.030267413705587387,
0.11220572143793106,
0.008641207590699196,
0.009896031580865383,
-0.1420544981956482,
-0.0610026977956295,
0.0034259543754160404,
0.13958171010017395,
-0.06984679400920868,
0.14373691380023956,
-0.055732954293489456,
-0.04203047603368759,
0.019445249810814857,
-0.0024764188565313816,
-0.12831799685955048,
-0.12096209079027176,
0.08087988942861557,
-0.040880873799324036,
0.09229271113872528,
-0.06762237101793289,
-0.05568236857652664,
-0.09149020910263062,
0.24793662130832672,
-0.11976361274719238,
-0.04558992385864258,
-0.10806412249803543,
0.004573177546262741,
0.18098369240760803,
-0.04313109442591667,
-0.006874388549476862,
0.013672668486833572,
0.11144512891769409,
0.05612439662218094,
0.008242364972829819,
0.06863778829574585,
-0.05995362251996994,
-0.22104820609092712,
-0.019425392150878906,
0.1781378537416458,
0.0017384291859343648,
0.03602489084005356,
0.007163373287767172,
-0.003380168927833438,
-0.00258776918053627,
-0.10743898153305054,
0.07747749239206314,
0.08056803792715073,
0.0010128095746040344,
0.05261429399251938,
-0.031088493764400482,
0.038746099919080734,
-0.11160343885421753,
-0.052218466997146606,
0.06391099095344543,
0.30886027216911316,
-0.07002502679824829,
0.04366209730505943,
0.029935071244835854,
-0.08810169249773026,
-0.13084650039672852,
-0.03499418497085571,
0.07725014537572861,
0.0539001002907753,
-0.025463828817009926,
-0.10371130704879761,
0.02112756110727787,
0.09226340055465698,
-0.001861691358499229,
0.0689658373594284,
-0.33131611347198486,
-0.12891030311584473,
0.031654421240091324,
0.019237976521253586,
-0.05884462222456932,
-0.16136381030082703,
-0.08086629211902618,
-0.01713276095688343,
-0.1616458296775818,
0.03830452635884285,
0.028529398143291473,
0.1329328715801239,
-0.0011190816294401884,
-0.020196707919239998,
0.037629202008247375,
-0.05081845074892044,
0.17960228025913239,
0.03730487450957298,
0.06621307879686356,
0.0013954368187114596,
0.05907132849097252,
0.0504172258079052,
-0.10160981118679047,
0.06266026943922043,
-0.07248495519161224,
0.02229916676878929,
-0.1889709234237671,
-0.016813917085528374,
-0.06672065705060959,
0.0021501241717487574,
-0.06919803470373154,
-0.00284419022500515,
-0.021587444469332695,
0.07468249648809433,
0.09816134721040726,
0.027950484305620193,
0.04369223117828369,
-0.06875631958246231,
0.07945066690444946,
0.1271195113658905,
0.11585110425949097,
0.02216716855764389,
-0.14996613562107086,
0.018281573429703712,
0.028328832238912582,
0.032067663967609406,
-0.11170784384012222,
0.09021943807601929,
0.15525734424591064,
0.04665176570415497,
0.18322555720806122,
0.03545542061328888,
-0.12961125373840332,
-0.010575277730822563,
0.0966692790389061,
-0.10342799872159958,
-0.12037386000156403,
0.004100574646145105,
-0.05637264624238014,
-0.10445009171962738,
-0.0131797194480896,
0.10869064927101135,
0.0011748687829822302,
-0.001596722286194563,
0.016697952523827553,
0.07713153213262558,
-0.043940912932157516,
0.21519111096858978,
0.01746753789484501,
0.11967175453901291,
-0.0829683467745781,
0.09311112761497498,
0.04422376677393913,
-0.09948675334453583,
0.0243007093667984,
0.07107916474342346,
-0.047873370349407196,
-0.024804791435599327,
-0.028076544404029846,
0.06783778965473175,
0.034030526876449585,
-0.059319812804460526,
-0.10913188755512238,
-0.1262323558330536,
0.05024450644850731,
0.059870753437280655,
0.022351516410708427,
0.09785392135381699,
0.006963418330997229,
0.004086819011718035,
-0.10406286269426346,
0.10937699675559998,
0.16531012952327728,
0.03661154955625534,
-0.12291013449430466,
0.08954306691884995,
-0.020622747018933296,
-0.013407391496002674,
0.014367148280143738,
-0.014538675546646118,
-0.12429964542388916,
0.02020406350493431,
-0.12358430027961731,
0.004297448787838221,
-0.046594344079494476,
-0.010713527910411358,
0.032193537801504135,
-0.014784887433052063,
-0.04752464219927788,
0.030095621943473816,
-0.11474187672138214,
-0.07293993979692459,
0.010183781385421753,
0.06491321325302124,
-0.12796510756015778,
-0.008450336754322052,
0.05870208144187927,
-0.16494250297546387,
0.07148591428995132,
0.012702981010079384,
0.020188434049487114,
-0.011046488769352436,
-0.10301171988248825,
-0.007896948605775833,
0.015547722578048706,
0.0003304621495772153,
0.04101075232028961,
-0.19370733201503754,
0.025475623086094856,
-0.03241688385605812,
-0.011626679450273514,
0.010399392805993557,
-0.009794516488909721,
-0.1376321166753769,
-0.0009583334904164076,
-0.012298213317990303,
-0.028405942022800446,
-0.060377705842256546,
0.056645069271326065,
0.08021815121173859,
0.05546046420931816,
0.16640064120292664,
-0.07202723622322083,
0.04382821545004845,
-0.18398594856262207,
-0.011603721417486668,
0.0029790839180350304,
-0.0365978479385376,
-0.007178626023232937,
-0.012086550705134869,
0.09443856030702591,
-0.0774630457162857,
0.09268787503242493,
-0.029030127450823784,
0.026518331840634346,
0.020662732422351837,
-0.13018716871738434,
-0.014419077895581722,
0.038172878324985504,
0.09487275779247284,
0.019326597452163696,
-0.0065015098080039024,
0.02734314650297165,
-0.035740967839956284,
0.027287201955914497,
0.030602367594838142,
0.17072178423404694,
0.18654830753803253,
0.04420916363596916,
0.07408207654953003,
0.06110476329922676,
-0.12943468987941742,
-0.07963314652442932,
0.12184549123048782,
-0.0653143897652626,
0.15439777076244354,
-0.07403925061225891,
0.10919070243835449,
0.10021732747554779,
-0.18119865655899048,
0.09036234766244888,
-0.08405288308858871,
-0.09994392842054367,
-0.0975479781627655,
-0.08790387213230133,
-0.05905205383896828,
-0.13118714094161987,
0.033314965665340424,
-0.1019497662782669,
0.08451434969902039,
0.04151587560772896,
0.051151763647794724,
-0.007353785913437605,
0.08376458287239075,
0.011574173346161842,
-0.015583897940814495,
0.12277498841285706,
-0.012042447924613953,
-0.0018076273845508695,
0.012345259077847004,
-0.04768041521310806,
0.051827240735292435,
-0.009371674619615078,
0.07854007184505463,
0.016887115314602852,
-0.013701035641133785,
0.0258801132440567,
0.006382690276950598,
-0.10312099754810333,
0.005784406326711178,
-0.007635843940079212,
0.05806579813361168,
0.08817898482084274,
0.059224024415016174,
-0.01492601353675127,
-0.029593845829367638,
0.20117798447608948,
-0.08150654286146164,
-0.06260837614536285,
-0.16250845789909363,
0.2421560287475586,
0.027102142572402954,
0.021200120449066162,
0.00891126412898302,
-0.13093556463718414,
0.016325969249010086,
0.17144367098808289,
0.09072259068489075,
0.004006196744740009,
-0.016074294224381447,
-0.013901269994676113,
-0.0025294870138168335,
-0.024365615099668503,
0.058412566781044006,
0.06999953091144562,
0.0392574667930603,
-0.0060134343802928925,
0.061008188873529434,
-0.020147165283560753,
-0.05705641955137253,
-0.008011170662939548,
0.1430908739566803,
0.025601567700505257,
0.007446820382028818,
-0.035387467592954636,
0.10055159777402878,
-0.058054111897945404,
-0.1729062795639038,
0.05830877274274826,
-0.14080476760864258,
-0.17112967371940613,
-0.051809873431921005,
0.058102935552597046,
0.020239166915416718,
0.09089786559343338,
-0.018483292311429977,
-0.053942326456308365,
0.11348017305135727,
0.021749814972281456,
-0.022579627111554146,
-0.09644901007413864,
0.05659922957420349,
-0.06966080516576767,
0.18004873394966125,
0.008218884468078613,
0.03242470324039459,
0.1348978877067566,
-0.018535414710640907,
-0.1453092247247696,
0.05392008274793625,
0.06394654512405396,
-0.13395065069198608,
0.049907829612493515,
0.17992472648620605,
0.010930287651717663,
0.10588052123785019,
0.044746000319719315,
-0.04141935706138611,
-0.009265543892979622,
-0.04119803383946419,
0.009968726895749569,
-0.11671973019838333,
-0.033285219222307205,
-0.05163693055510521,
0.11401358991861343,
0.24629972875118256,
-0.073849618434906,
0.03731440007686615,
-0.0749027356505394,
0.05152431130409241,
-0.0008380081271752715,
0.10485401004552841,
-0.038759730756282806,
-0.2097531110048294,
0.05515053868293762,
0.033241089433431625,
0.0381297692656517,
-0.19794780015945435,
-0.1022404357790947,
0.05023389682173729,
-0.030498081818223,
-0.026592859998345375,
0.1424971967935562,
0.088805191218853,
0.08935151994228363,
-0.03263205289840698,
-0.1396612972021103,
-0.013248066417872906,
0.15858612954616547,
-0.1338287591934204,
-0.054047033190727234
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xls-r-1b-fr
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - FR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2464
- Wer: 0.2220
## Model description
More information needed
## Intended uses & limitations
More information needed
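The card leaves the sections above to be completed. As a hedged placeholder rather than the authors' documented usage, a minimal transcription sketch with the `transformers` ASR pipeline could look as follows; the model id is taken from this repository and the audio file name is hypothetical:

```python
from transformers import pipeline

# Load the fine-tuned French checkpoint (model id taken from this repository).
asr = pipeline("automatic-speech-recognition", model="Plim/xls-r-1b-fr")

# "audio_fr.wav" is a hypothetical 16 kHz mono French recording.
result = asr("audio_fr.wav")
print(result["text"])
```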
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 5.0
- mixed_precision_training: Native AMP
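A hedged sketch of how these values could map onto a `TrainingArguments` configuration is shown below. It is an illustration of the listed hyperparameters rather than the authors' actual training script, and the output directory name is hypothetical.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./xls-r-1b-fr",      # hypothetical output directory
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,   # 16 x 8 = 128 effective train batch size
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=5.0,
    fp16=True,                       # mixed precision (native AMP)
)
```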
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.0326 | 0.32 | 1000 | 0.3092 | 0.2718 |
| 1.0828 | 0.65 | 2000 | 0.2843 | 0.2606 |
| 1.0771 | 0.97 | 3000 | 0.2774 | 0.2488 |
| 1.0306 | 1.3 | 4000 | 0.2588 | 0.2351 |
| 1.0052 | 1.62 | 5000 | 0.2483 | 0.2284 |
| 0.9865 | 1.94 | 6000 | 0.2464 | 0.2220 |
| 0.978 | 2.27 | 7000 | 0.2514 | 0.2172 |
| 1.7438 | 2.59 | 8000 | 0.7983 | 0.5072 |
| 2.3309 | 2.92 | 9000 | 1.8917 | 0.9416 |
| 2.1834 | 3.24 | 10000 | 1.7496 | 0.9030 |
| 2.3047 | 3.56 | 11000 | 1.5377 | 0.8747 |
| 2.1378 | 3.89 | 12000 | 1.3501 | 0.7923 |
| 1.9812 | 4.21 | 13000 | 1.2662 | 0.7697 |
| 2.6855 | 4.54 | 14000 | 2.4120 | 0.9902 |
| 2.7482 | 4.86 | 15000 | 2.5341 | 0.9874 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
| {"language": ["fr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_7_0", "generated_from_trainer"], "model-index": [{"name": "", "results": []}]} | automatic-speech-recognition | Plim/xls-r-1b-fr | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_7_0",
"generated_from_trainer",
"fr",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fr #license-apache-2.0 #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_7\_0 - FR dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2464
* Wer: 0.2220
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 5.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fr #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] | [
70,
159,
4,
39
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fr #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 5.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] | [
-0.11151127517223358,
0.08632023632526398,
-0.0036184675991535187,
0.03985733911395073,
0.1197858676314354,
0.01230425201356411,
0.08409548550844193,
0.14963766932487488,
-0.10289264470338821,
0.08553725481033325,
0.09584895521402359,
0.0856693759560585,
0.07439073920249939,
0.12018322199583054,
-0.025100890547037125,
-0.2899647057056427,
0.025051632896065712,
-0.010965747758746147,
-0.11397240310907364,
0.10768185555934906,
0.09926886111497879,
-0.10541996359825134,
0.026874078437685966,
0.018789418041706085,
-0.10651350021362305,
-0.004897396080195904,
-0.02908121421933174,
-0.05556635931134224,
0.11912531405687332,
0.05874619260430336,
0.0677822157740593,
0.02746649831533432,
0.07563693821430206,
-0.2748730182647705,
0.015370883978903294,
0.06304237246513367,
0.04000541940331459,
0.06533478200435638,
0.10310472548007965,
0.011335469782352448,
0.12971512973308563,
-0.07368996739387512,
0.04776635393500328,
0.04958634078502655,
-0.08754724264144897,
-0.30409908294677734,
-0.09778902679681778,
0.01865716651082039,
0.12301869690418243,
0.0840233862400055,
-0.027518583461642265,
0.06008360534906387,
-0.061435066163539886,
0.09710501879453659,
0.22748300433158875,
-0.23192042112350464,
-0.07463281601667404,
-0.022880082949995995,
0.05503212660551071,
0.033931564539670944,
-0.10527703166007996,
-0.01175954844802618,
0.02797570265829563,
0.031558942049741745,
0.08754798769950867,
0.005608588457107544,
-0.019124681130051613,
0.0024540184531360865,
-0.14187611639499664,
-0.047262582927942276,
0.11663227528333664,
0.08817385882139206,
-0.02225663885474205,
-0.09128785878419876,
-0.031028201803565025,
-0.17827573418617249,
-0.04983938857913017,
0.0014388058334589005,
0.030436968430876732,
-0.023742355406284332,
-0.0752800852060318,
-0.005966322962194681,
-0.06671421974897385,
-0.07878359407186508,
0.01459476351737976,
0.14989979565143585,
0.0440310463309288,
-0.03248971328139305,
0.012023918330669403,
0.08903530985116959,
0.04955444484949112,
-0.1411110907793045,
-0.0023318547755479813,
0.048581961542367935,
-0.09133147448301315,
-0.012719988822937012,
-0.04356776177883148,
-0.029882660135626793,
0.03436528891324997,
0.11489938944578171,
-0.03165140002965927,
0.08823442459106445,
0.008500685915350914,
0.023974047973752022,
-0.09000150859355927,
0.15464089810848236,
-0.06488300114870071,
-0.0730154812335968,
-0.029184842482209206,
0.10023508220911026,
0.0034427389036864042,
-0.012727588415145874,
-0.07780871540307999,
0.01495983824133873,
0.0994531512260437,
0.04722430184483528,
-0.009073167107999325,
0.017687777057290077,
-0.06479354202747345,
-0.024358201771974564,
0.0005137177649885416,
-0.10444224625825882,
0.03804478794336319,
0.03075394593179226,
-0.06478617340326309,
0.008441167883574963,
-0.004304205067455769,
0.023909255862236023,
-0.014142168685793877,
0.10783067345619202,
-0.0450480654835701,
0.0043837823905050755,
-0.08203354477882385,
-0.10509268939495087,
0.03617345914244652,
-0.02270297147333622,
0.00009359385148854926,
-0.07377086579799652,
-0.09996935725212097,
-0.062248412519693375,
0.05680001899600029,
-0.049908027052879333,
-0.06568320095539093,
-0.07900314033031464,
-0.06788606941699982,
0.05476359650492668,
-0.028299571946263313,
0.19412125647068024,
-0.06285016983747482,
0.10704689472913742,
0.017207318916916847,
0.04463000223040581,
0.038125988095998764,
0.0702499970793724,
-0.026636412367224693,
0.039655182510614395,
-0.12639564275741577,
0.08671993762254715,
-0.08748038858175278,
0.03519861772656441,
-0.14615075290203094,
-0.10128175467252731,
-0.011001547798514366,
-0.0012898605782538652,
0.10770425945520401,
0.10683749616146088,
-0.18150591850280762,
-0.09443977475166321,
0.15260563790798187,
-0.07053571194410324,
-0.08146771043539047,
0.1475074291229248,
-0.020861927419900894,
-0.018165651708841324,
0.03814687952399254,
0.1613198220729828,
0.1104830652475357,
-0.08894853293895721,
0.01949233002960682,
-0.0521460585296154,
0.12216903269290924,
0.038159292191267014,
0.09458261728286743,
-0.042479388415813446,
0.014711083844304085,
0.0021335904020816088,
-0.02256016992032528,
0.09186314791440964,
-0.08881530910730362,
-0.07460367679595947,
-0.02415085956454277,
-0.07743195444345474,
0.014851980842649937,
0.045796073973178864,
0.026335181668400764,
-0.09742245078086853,
-0.11149406433105469,
0.010189000517129898,
0.12111236155033112,
-0.10172262042760849,
0.03580266609787941,
-0.07100770622491837,
0.05581265687942505,
-0.021984659135341644,
-0.0028579591307789087,
-0.16432802379131317,
0.021869473159313202,
0.03798997774720192,
-0.051422085613012314,
0.02603333443403244,
-0.0006169839762151241,
0.0725640207529068,
0.042642105370759964,
-0.058238282799720764,
-0.059214454144239426,
-0.049670323729515076,
-0.0028976050671190023,
-0.06708557903766632,
-0.2375771701335907,
-0.062054552137851715,
-0.03488612920045853,
0.15545564889907837,
-0.19667960703372955,
-0.0011389460414648056,
0.02376568131148815,
0.13801485300064087,
0.031046481803059578,
-0.03394902125000954,
0.0024169767275452614,
0.08076384663581848,
-0.01864967867732048,
-0.06731215864419937,
0.03859210014343262,
0.00618323776870966,
-0.12429160624742508,
0.014836887829005718,
-0.10244452953338623,
0.08057888597249985,
0.09820172935724258,
-0.017644677311182022,
-0.07416421175003052,
-0.0648840069770813,
-0.05131235718727112,
-0.05974756181240082,
-0.020769784227013588,
0.004841375630348921,
0.19297675788402557,
0.020163124427199364,
0.11476518213748932,
-0.07381290942430496,
-0.027840005233883858,
0.03605034574866295,
0.022581182420253754,
-0.015261305496096611,
0.13128690421581268,
0.0687255784869194,
-0.07838701456785202,
0.09266964346170425,
0.09669696539640427,
-0.06761941313743591,
0.1509963572025299,
-0.06986122578382492,
-0.09445249289274216,
-0.027950512245297432,
0.029895838350057602,
0.010712413117289543,
0.09128660708665848,
-0.1585453897714615,
-0.011920147575438023,
0.021443752571940422,
0.02921673096716404,
0.024554302915930748,
-0.1971445083618164,
0.007892351597547531,
0.04280818626284599,
-0.075016550719738,
0.011932840570807457,
-0.0020853455644100904,
-0.010177569463849068,
0.08534111827611923,
0.014045598916709423,
-0.08145198225975037,
-0.012597695924341679,
-0.030039899051189423,
-0.0808950811624527,
0.1642061024904251,
-0.11877366155385971,
-0.1451612263917923,
-0.1194605603814125,
-0.034991372376680374,
-0.019935242831707,
-0.014623739756643772,
0.05413004010915756,
-0.10808903723955154,
-0.027938343584537506,
-0.053951144218444824,
0.03965485468506813,
-0.06797812134027481,
0.028908343985676765,
0.00422467989847064,
0.006434588227421045,
0.06153447926044464,
-0.10294947773218155,
0.023925572633743286,
-0.009660501964390278,
-0.015506433323025703,
0.007768082898110151,
0.027897587046027184,
0.0872158408164978,
0.15769469738006592,
0.04172656312584877,
0.016976555809378624,
-0.04995943605899811,
0.16397437453269958,
-0.11153309792280197,
-0.015185654163360596,
0.12521372735500336,
0.012220289558172226,
0.034653954207897186,
0.14344538748264313,
0.04185684770345688,
-0.08631306886672974,
0.019852586090564728,
0.021418392658233643,
-0.00380100985057652,
-0.23278450965881348,
-0.03857438266277313,
-0.07486521452665329,
-0.03078475594520569,
0.09335848689079285,
0.031047040596604347,
-0.020473239943385124,
0.016830261796712875,
-0.031293343752622604,
0.006213087122887373,
0.00957285426557064,
0.055084872990846634,
0.11387476325035095,
0.03247644752264023,
0.11503805220127106,
-0.016745904460549355,
-0.01887965016067028,
0.038655441254377365,
-0.005922569893300533,
0.23629237711429596,
0.011944837868213654,
0.17425060272216797,
0.04153142869472504,
0.15901456773281097,
0.0025945818051695824,
0.035695631057024,
0.019449811428785324,
-0.007113793399184942,
-0.0029637201223522425,
-0.05183445289731026,
-0.050356425344944,
0.03958979249000549,
0.11310725659132004,
0.028112612664699554,
-0.11422562599182129,
-0.015735749155282974,
0.01704929955303669,
0.3642955422401428,
0.06508446484804153,
-0.2615274488925934,
-0.09143123775720596,
0.008342119865119457,
-0.08200465142726898,
-0.04054662585258484,
0.03020305000245571,
0.1270904392004013,
-0.0822317972779274,
0.06591013073921204,
-0.05965212732553482,
0.0935574546456337,
-0.06515481323003769,
0.004639165475964546,
0.06067347526550293,
0.10237728804349899,
0.010739625431597233,
0.062063898891210556,
-0.27095064520835876,
0.2936946153640747,
-0.0035191955976188183,
0.0740102156996727,
-0.042125001549720764,
0.042758919298648834,
0.04252145439386368,
-0.02513921447098255,
0.051124829798936844,
-0.014354288578033447,
-0.1437949389219284,
-0.15031275153160095,
-0.07576287537813187,
0.022130805999040604,
0.12786158919334412,
-0.03983691334724426,
0.10316663235425949,
-0.0392945259809494,
-0.02544178068637848,
0.0499948151409626,
-0.04061482474207878,
-0.11241688579320908,
-0.10480007529258728,
0.02026873268187046,
0.05874992161989212,
0.0773109421133995,
-0.09211156517267227,
-0.10417007654905319,
-0.07968560606241226,
0.16672083735466003,
-0.07181006669998169,
-0.0016774499090388417,
-0.12104188650846481,
0.09231644868850708,
0.15817753970623016,
-0.0669964924454689,
0.054790377616882324,
0.007118970155715942,
0.10828061401844025,
0.016426103189587593,
-0.010755596682429314,
0.11482450366020203,
-0.08336146920919418,
-0.1984284371137619,
-0.06486638635396957,
0.16241995990276337,
0.025539878755807877,
0.07468372583389282,
-0.026954224333167076,
0.026704616844654083,
-0.01668810285627842,
-0.06583953648805618,
0.05671276897192001,
0.048585064709186554,
0.015356735326349735,
0.06619537621736526,
-0.038017578423023224,
-0.022541897371411324,
-0.07889480143785477,
-0.09984496980905533,
0.15243467688560486,
0.29279670119285583,
-0.0833764299750328,
0.04529072344303131,
0.05149612948298454,
-0.051066022366285324,
-0.13472743332386017,
0.004032354801893234,
0.1309128850698471,
0.04461745172739029,
0.00424184137955308,
-0.21303889155387878,
-0.004597003571689129,
0.07721914350986481,
-0.025236640125513077,
0.08728165179491043,
-0.3232084810733795,
-0.13133589923381805,
0.0881827175617218,
0.08849877119064331,
0.0036113818641752005,
-0.15015734732151031,
-0.05752594769001007,
-0.020346924662590027,
-0.08323304355144501,
0.04066005349159241,
-0.013721157796680927,
0.13694697618484497,
0.006480705924332142,
0.03455854207277298,
0.019106874242424965,
-0.04320438578724861,
0.14942023158073425,
-0.0355377271771431,
0.049638863652944565,
-0.010449958965182304,
0.04703962430357933,
-0.03548439219594002,
-0.04886600002646446,
-0.01579573005437851,
-0.07989683747291565,
0.02771259844303131,
-0.11695858836174011,
-0.030919643118977547,
-0.08215200901031494,
0.01875011995434761,
-0.032034631818532944,
-0.0345287062227726,
-0.038645219057798386,
0.04233228415250778,
0.06925816833972931,
0.008861902169883251,
0.12540008127689362,
-0.06001434102654457,
0.14462602138519287,
0.12372562289237976,
0.08709202706813812,
-0.013876518234610558,
-0.06784496456384659,
-0.015173924155533314,
-0.017043408006429672,
0.0521538145840168,
-0.11271687597036362,
0.024476073682308197,
0.1364368498325348,
0.035222575068473816,
0.15177927911281586,
0.04316657781600952,
-0.07766666263341904,
0.024395180866122246,
0.055009305477142334,
-0.07218069583177567,
-0.14464491605758667,
-0.0168638713657856,
0.05296441167593002,
-0.12010475248098373,
-0.0095224529504776,
0.12571586668491364,
-0.045561399310827255,
-0.010237107053399086,
0.012008796446025372,
0.02545406110584736,
-0.048032183200120926,
0.2267814725637436,
0.010102515108883381,
0.0714324489235878,
-0.08968618512153625,
0.07525472342967987,
0.05848728120326996,
-0.16347414255142212,
0.0325394831597805,
0.07234799861907959,
-0.048461515456438065,
-0.023832397535443306,
0.03263745829463005,
0.0798170417547226,
0.0283801406621933,
-0.062038879841566086,
-0.10372412204742432,
-0.13895681500434875,
0.08766424655914307,
0.08307786285877228,
0.034490134567022324,
0.02186114341020584,
-0.026800869032740593,
0.02918289043009281,
-0.08606822788715363,
0.08780878037214279,
0.08132559061050415,
0.05535731092095375,
-0.1226891428232193,
0.1355426162481308,
0.01647052727639675,
0.008324778638780117,
0.0014635054394602776,
-0.005107578821480274,
-0.08138077706098557,
0.023959437385201454,
-0.131024569272995,
-0.019374681636691093,
-0.0535619743168354,
-0.0016572867752984166,
0.017068807035684586,
-0.061410870403051376,
-0.05232685059309006,
0.020831655710935593,
-0.11766406148672104,
-0.044963765889406204,
-0.024646034464240074,
0.07212653011083603,
-0.0823482871055603,
-0.03107067011296749,
0.03273138776421547,
-0.10640254616737366,
0.08578595519065857,
0.05023777857422829,
0.019628474488854408,
0.027543051168322563,
-0.11376683413982391,
-0.009266749955713749,
0.039520397782325745,
0.00514241773635149,
0.015856022015213966,
-0.18653541803359985,
-0.010555864311754704,
-0.014584888704121113,
0.027462441474199295,
-0.016503525897860527,
0.037838164716959,
-0.12050553411245346,
-0.047700658440589905,
-0.042700543999671936,
-0.05972933769226074,
-0.046601418405771255,
0.04072462394833565,
0.0670088678598404,
0.034183070063591,
0.15844176709651947,
-0.09173451364040375,
0.05664259195327759,
-0.21342191100120544,
0.013056010007858276,
-0.04372691735625267,
-0.07529137283563614,
-0.07896862924098969,
-0.04171141982078552,
0.08896394073963165,
-0.05204073712229729,
0.07545126974582672,
-0.058650728315114975,
0.06820051372051239,
0.02957894653081894,
-0.1347181797027588,
0.01105887908488512,
0.038886141031980515,
0.1927693486213684,
0.057475246489048004,
-0.014523036777973175,
0.06849589198827744,
0.005832068622112274,
0.06844796240329742,
0.17546634376049042,
0.12844480574131012,
0.16304907202720642,
0.06716182827949524,
0.10812099277973175,
0.059398289769887924,
-0.12687692046165466,
-0.1097971722483635,
0.13709081709384918,
-0.029109837487339973,
0.14562319219112396,
-0.02280193194746971,
0.21884103119373322,
0.10465654730796814,
-0.19530142843723297,
0.06085865572094917,
-0.03772672638297081,
-0.06670220196247101,
-0.09226652979850769,
-0.0403890535235405,
-0.08433219790458679,
-0.1933707594871521,
0.0037723598070442677,
-0.0948835015296936,
0.05856684595346451,
0.021122336387634277,
0.042511921375989914,
0.02549883909523487,
0.08768520504236221,
0.04146025329828262,
-0.01566937007009983,
0.1184505820274353,
0.009001440368592739,
-0.013540058396756649,
-0.06736970692873001,
-0.12042652815580368,
0.05401347205042839,
-0.04299160838127136,
0.054336048662662506,
-0.03295992687344551,
-0.10088785737752914,
0.05889391899108887,
0.01747378520667553,
-0.10513846576213837,
0.026285404339432716,
-0.008197718299925327,
0.07723600417375565,
0.12099172919988632,
0.03624247759580612,
-0.005113114602863789,
-0.007687689736485481,
0.21942560374736786,
-0.09863681346178055,
-0.06115424633026123,
-0.1284731924533844,
0.21781834959983826,
0.005058362614363432,
-0.0039324043318629265,
0.020241690799593925,
-0.06508470326662064,
-0.019934844225645065,
0.15892727673053741,
0.13933826982975006,
-0.02864399179816246,
-0.02199765294790268,
0.03031962364912033,
-0.013680640608072281,
-0.051409605890512466,
0.06963527947664261,
0.12561367452144623,
0.030141152441501617,
-0.05717376247048378,
-0.03793001547455788,
-0.04650331288576126,
-0.04613352194428444,
-0.018327299505472183,
0.058093976229429245,
0.007029957138001919,
-0.01806446723639965,
-0.0050569805316627026,
0.10588105022907257,
-0.03790580481290817,
-0.15796050429344177,
0.03208637610077858,
-0.18893224000930786,
-0.17985308170318604,
-0.029422933235764503,
0.06674972176551819,
0.05462249740958214,
0.04874042794108391,
-0.01631181687116623,
-0.015621498227119446,
0.11351273953914642,
-0.011831793002784252,
-0.03291048854589462,
-0.1299920678138733,
0.0821131020784378,
-0.09161719679832458,
0.15605993568897247,
-0.03692113235592842,
0.038397908210754395,
0.12041885405778885,
0.08179839700460434,
-0.0723804235458374,
0.0456729494035244,
0.07736323773860931,
-0.12858732044696808,
0.055259037762880325,
0.17976197600364685,
-0.046629663556814194,
0.1577746570110321,
0.04514654725790024,
-0.09262958914041519,
0.03931669890880585,
-0.08391781151294708,
-0.05634380131959915,
-0.04864021763205528,
0.011305281892418861,
-0.051048602908849716,
0.1427978277206421,
0.19668826460838318,
-0.05832110345363617,
-0.0199260376393795,
-0.05267491564154625,
0.012072020210325718,
0.04085317254066467,
0.13303640484809875,
-0.05501165986061096,
-0.26800134778022766,
0.015821389853954315,
0.018560877069830894,
0.021532708778977394,
-0.2104753851890564,
-0.09876614063978195,
0.032399050891399384,
-0.058931734412908554,
-0.06433507800102234,
0.11403194814920425,
0.04481952264904976,
0.03433842957019806,
-0.05359964445233345,
-0.1436581015586853,
-0.026989446952939034,
0.17185556888580322,
-0.17385771870613098,
-0.05654507502913475
] |
null | null | transformers |
## Model description
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - FR dataset.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 5.0 (extended to 7.0 by resuming training from a checkpoint)
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 2.9114 | 0.29 | 1000 | inf | 0.9997 |
| 1.2436 | 0.57 | 2000 | inf | 0.4310 |
| 1.0552 | 0.86 | 3000 | inf | 0.3144 |
| 1.0044 | 1.15 | 4000 | inf | 0.2814 |
| 0.9718 | 1.43 | 5000 | inf | 0.2658 |
| 0.9502 | 1.72 | 6000 | inf | 0.2566 |
| 0.9418 | 2.01 | 7000 | inf | 0.2476 |
| 0.9215 | 2.29 | 8000 | inf | 0.2420 |
| 0.9236 | 2.58 | 9000 | inf | 0.2388 |
| 0.9014 | 2.87 | 10000 | inf | 0.2354 |
| 0.8814 | 3.15 | 11000 | inf | 0.2312 |
| 0.8809 | 3.44 | 12000 | inf | 0.2285 |
| 0.8717 | 3.73 | 13000 | inf | 0.2263 |
| 0.8787 | 4.01 | 14000 | inf | 0.2218 |
| 0.8567 | 4.3 | 15000 | inf | 0.2193 |
| 0.8488 | 4.59 | 16000 | inf | 0.2187 |
| 0.8359 | 4.87 | 17000 | inf | 0.2172 |
Training was then resumed from the checkpoint at step 17000 (see the resume sketch after the results below):
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| / | 5.16 | 18000 | inf | 0.2176 |
| / | 5.45 | 19000 | inf | 0.2181 |
| / | 5.73 | 20000 | inf | 0.2155 |
| / | 6.02 | 21000 | inf | 0.2140 |
| / | 6.31 | 22000 | inf | 0.2124 |
| / | 6.59 | 23000 | inf | 0.2117 |
| / | 6.88 | 24000 | inf | 0.2116 |
It achieves its best result on the validation set at step 24000:
- Wer: 0.2116
The validation loss calculation had an issue, which is why the validation losses above are reported as `inf`.
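Resuming from the intermediate checkpoint, as described above, can be done through the `Trainer` API. The following is a hedged sketch: the trainer objects and the checkpoint path are assumptions for illustration, not the authors' script.

```python
from transformers import Trainer

# `model`, `training_args`, `train_dataset` and `eval_dataset` are assumed to be the
# same objects used for the first 5 epochs, with `num_train_epochs` raised to 7.0.
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)

# Hypothetical path to the step-17000 checkpoint saved during the first run.
trainer.train(resume_from_checkpoint="output/checkpoint-17000")
```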
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3.dev0
- Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_8` with split `test`
```bash
python eval.py --model_id Plim/xls-r-300m-cv_8-fr --dataset mozilla-foundation/common_voice_8_0 --config fr --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id Plim/xls-r-300m-cv_8-fr --dataset speech-recognition-community-v2/dev_data --config fr --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
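The `--chunk_length_s 5.0 --stride_length_s 1.0` options above enable chunked decoding of long recordings; the same behaviour is available directly from the `transformers` ASR pipeline. A hedged sketch with a hypothetical audio file:

```python
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="Plim/xls-r-300m-cv_8-fr")

# Split the (hypothetical) long recording into 5 s windows with 1 s of stride,
# mirroring the eval.py options used above.
output = asr("long_recording_fr.wav", chunk_length_s=5.0, stride_length_s=1.0)
print(output["text"])
```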
| {"language": ["fr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer"], "model-index": [{"name": "XLS-R-300m - French", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "fr"}, "metrics": [{"type": "wer", "value": "to recompute with STEP 24000", "name": "Test WER"}, {"type": "cer", "value": "to recompute with STEP 24000", "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "fr"}, "metrics": [{"type": "wer", "value": 35.29, "name": "Test WER"}, {"type": "cer", "value": 13.94, "name": "Test CER"}]}]}]} | automatic-speech-recognition | Plim/xls-r-300m-cv_8-fr | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"fr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fr #license-apache-2.0 #model-index #endpoints_compatible #region-us
| Model description
-----------------
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - FR dataset.
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 5.0 (extended to 7.0 with training with checkpoint)
* mixed\_precision\_training: Native AMP
### Training results
Training continued with checkpoint from STEP 17000:
It achieves the best result on the validation set on Step 24000:
* Wer: 0.2116
Got some issue with validation loss calculation.
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.3.dev0
* Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_8' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 5.0 (extended to 7.0 with training with checkpoint)\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nTraining continued with checkpoint from STEP 17000:\n\n\n\nIt achieves the best result on the validation set on Step 24000:\n\n\n* Wer: 0.2116\n\n\nGot some issue with validation loss calculation.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 5.0 (extended to 7.0 with training with checkpoint)\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nTraining continued with checkpoint from STEP 17000:\n\n\n\nIt achieves the best result on the validation set on Step 24000:\n\n\n* Wer: 0.2116\n\n\nGot some issue with validation loss calculation.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
74,
172,
45,
39,
57
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 5.0 (extended to 7.0 with training with checkpoint)\n* mixed\\_precision\\_training: Native AMP### Training results\n\n\n\nTraining continued with checkpoint from STEP 17000:\n\n\n\nIt achieves the best result on the validation set on Step 24000:\n\n\n* Wer: 0.2116\n\n\nGot some issue with validation loss calculation.### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3.dev0\n* Tokenizers 0.11.0### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_8' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
-0.09840355068445206,
0.10969115793704987,
-0.004354308824986219,
0.01962009258568287,
0.12468060851097107,
0.027073068544268608,
0.04064808785915375,
0.18719740211963654,
-0.07123677432537079,
0.12692557275295258,
0.09379497170448303,
0.08226274698972702,
0.1125507578253746,
0.12297618389129639,
-0.03846459463238716,
-0.21341685950756073,
0.03625490888953209,
-0.06230312958359718,
-0.07059882581233978,
0.10947105288505554,
0.1021023690700531,
-0.09126840531826019,
0.04905617982149124,
0.009951537474989891,
-0.11872950941324234,
-0.003470900235697627,
-0.016553858295083046,
-0.01858050748705864,
0.07539252936840057,
0.04754238575696945,
0.03834812343120575,
0.01838005892932415,
0.025766529142856598,
-0.2965754568576813,
-0.0006420532590709627,
0.10942358523607254,
0.040377046912908554,
0.05626565217971802,
0.0945531576871872,
-0.014399874024093151,
0.09007712453603745,
-0.0852121114730835,
0.011222800239920616,
0.10411212593317032,
-0.11339615285396576,
-0.2570153772830963,
-0.17109711468219757,
0.08883413672447205,
0.1253734827041626,
0.06131596863269806,
-0.043222010135650635,
0.058776773512363434,
-0.04428575560450554,
0.06982793658971786,
0.18052087724208832,
-0.30686187744140625,
-0.05737635865807533,
0.03543472662568092,
0.09186387062072754,
0.011941751465201378,
-0.11758444458246231,
0.02756112441420555,
0.05419963225722313,
0.007199286948889494,
0.06376657634973526,
-0.013419631868600845,
0.02558775618672371,
-0.00761706568300724,
-0.12915462255477905,
-0.08175289630889893,
0.1529669463634491,
0.09266591817140579,
-0.07906778156757355,
-0.1020246297121048,
-0.005784651730209589,
-0.17378006875514984,
0.013470442965626717,
-0.01663311757147312,
0.011300641112029552,
-0.012845061719417572,
-0.024210700765252113,
-0.07545851916074753,
-0.07249137759208679,
-0.060314882546663284,
0.04595719277858734,
0.1507372111082077,
0.03918391838669777,
-0.012434814125299454,
0.04944951832294464,
0.10578104108572006,
-0.043726224452257156,
-0.1403397023677826,
-0.05945206806063652,
-0.01780553162097931,
-0.10481566190719604,
-0.019033165648579597,
-0.03848004713654518,
0.06242721527814865,
0.06351911276578903,
0.23097315430641174,
-0.05167710408568382,
0.08259659260511398,
-0.009527851827442646,
0.029898103326559067,
-0.06630384922027588,
0.1631864458322525,
-0.05543576180934906,
-0.10072433203458786,
-0.03618037328124046,
0.11381586641073227,
-0.009070923551917076,
-0.010706490837037563,
-0.02492840774357319,
0.011749631725251675,
0.13489970564842224,
0.09911395609378815,
0.010008256882429123,
0.010761585086584091,
-0.08760049194097519,
-0.01728406362235546,
-0.0029065560083836317,
-0.13401372730731964,
0.060125675052404404,
0.035910286009311676,
-0.05656123906373978,
0.013490457087755203,
-0.014680838212370872,
0.010791700333356857,
-0.06223556771874428,
0.08107452839612961,
-0.05749602988362312,
-0.025036398321390152,
-0.10094020515680313,
-0.11256089061498642,
0.03437286242842674,
-0.110585518181324,
-0.03292514383792877,
-0.06437293440103531,
-0.12256917357444763,
-0.08623477071523666,
0.05088460072875023,
-0.06853155791759491,
-0.013376186601817608,
-0.02980872616171837,
-0.11184269934892654,
0.027094054967164993,
-0.019327256828546524,
0.13523398339748383,
-0.03889482468366623,
0.08318088203668594,
0.04286862537264824,
0.04217388853430748,
0.049427375197410583,
0.03886450454592705,
-0.0066362847574055195,
0.07888445258140564,
-0.10654489696025848,
0.08126108348369598,
-0.09376639127731323,
0.02588050626218319,
-0.1616906374692917,
-0.07996263355016708,
-0.03419724106788635,
-0.008078072220087051,
0.12200538069009781,
0.13364510238170624,
-0.14062905311584473,
-0.05600028857588768,
0.1512119472026825,
-0.07294795662164688,
-0.1363886147737503,
0.1261066198348999,
0.01589849404990673,
0.015701599419116974,
0.005282900296151638,
0.1329200565814972,
0.10712488740682602,
-0.09420054405927658,
-0.03621061146259308,
-0.0951317846775055,
0.03940589353442192,
0.09623653441667557,
0.07753422111272812,
-0.10275530815124512,
0.03863747790455818,
-0.0007997153443284333,
-0.05160953477025032,
-0.023965034633874893,
-0.08384566754102707,
-0.059339385479688644,
-0.03215490281581879,
-0.043485213071107864,
0.00783766619861126,
0.011168929748237133,
-0.014028280042111874,
-0.1047176644206047,
-0.1488851010799408,
-0.04363235458731651,
0.10865513980388641,
-0.07231377065181732,
0.03555300459265709,
-0.10352267324924469,
0.11015988141298294,
-0.043667227029800415,
0.03177357092499733,
-0.17808829247951508,
-0.01711398735642433,
0.03475157916545868,
-0.03543322905898094,
-0.02074217051267624,
-0.04792029410600662,
0.04407421126961708,
-0.00536611070856452,
-0.03928587958216667,
-0.060610681772232056,
-0.08400183916091919,
-0.015484930016100407,
-0.049327924847602844,
-0.18584658205509186,
-0.11728329211473465,
-0.024503225460648537,
0.18340671062469482,
-0.22893750667572021,
0.02430650219321251,
0.10799381136894226,
0.14768630266189575,
0.020627832040190697,
-0.0474732331931591,
-0.007728880271315575,
0.06320536136627197,
-0.045239079743623734,
-0.08553563803434372,
0.027598578482866287,
-0.0009059011936187744,
-0.10028337687253952,
0.0023362014908343554,
-0.13315556943416595,
0.023160705342888832,
0.0993351861834526,
0.036207035183906555,
-0.07729386538267136,
-0.06839776039123535,
-0.06412632018327713,
-0.04143311828374863,
-0.02694327011704445,
-0.06082100048661232,
0.04829827696084976,
0.0500543937087059,
0.10487566143274307,
-0.06748741865158081,
-0.030738070607185364,
0.050101496279239655,
-0.007102339528501034,
-0.015054228715598583,
0.10801707953214645,
0.010903969407081604,
-0.11259035766124725,
0.09242450445890427,
0.10765358060598373,
-0.020402055233716965,
0.09569196403026581,
-0.039880696684122086,
-0.09757551550865173,
-0.051243722438812256,
0.061149198561906815,
0.052956581115722656,
0.11600197106599808,
-0.14493577182292938,
-0.01037386804819107,
0.03041704371571541,
0.030597617849707603,
0.032113216817379,
-0.17667198181152344,
0.058644849807024,
0.03454937785863876,
-0.07490130513906479,
0.01417969074100256,
0.011562836356461048,
-0.009670654311776161,
0.06450097262859344,
0.04085248336195946,
0.003914182540029287,
0.002591226249933243,
-0.030300917103886604,
-0.0939716175198555,
0.15655481815338135,
-0.10779601335525513,
-0.1626269519329071,
-0.1543281227350235,
-0.0007972292369231582,
-0.071149542927742,
-0.020540213212370872,
0.060618504881858826,
-0.049584414809942245,
-0.05874745547771454,
-0.06585966050624847,
-0.02775360643863678,
-0.0262896791100502,
0.006874234415590763,
0.026074009016156197,
0.012342765927314758,
0.07652027159929276,
-0.09693607687950134,
-0.003084716619923711,
0.015590333379805088,
-0.035998210310935974,
0.006005581934005022,
0.021221991628408432,
0.10591007769107819,
0.11874113231897354,
0.012257826514542103,
0.05146864429116249,
-0.021016452461481094,
0.16409330070018768,
-0.1295478492975235,
0.012738574296236038,
0.16745643317699432,
0.02004551887512207,
0.07085230946540833,
0.10364191979169846,
0.007874878123402596,
-0.08896925300359726,
0.03082399070262909,
0.07551705837249756,
0.001581078628078103,
-0.2590424716472626,
-0.007568870671093464,
-0.07781617343425751,
-0.012925730086863041,
0.0757569670677185,
0.03846095874905586,
0.031545963138341904,
0.03265562281012535,
-0.05292722210288048,
-0.03974621742963791,
0.014080092310905457,
0.11303944885730743,
0.09082631766796112,
0.02455868199467659,
0.09937427937984467,
-0.03851731866598129,
0.03796398639678955,
0.04378379136323929,
0.04595770314335823,
0.21845412254333496,
0.03125134855508804,
0.14877885580062866,
0.09724347293376923,
0.15944825112819672,
-0.02754461206495762,
0.02938157506287098,
0.01272638514637947,
0.008641785942018032,
0.027815863490104675,
-0.06212814897298813,
-0.055148277431726456,
0.05687008053064346,
0.09230559319257736,
0.026177966967225075,
-0.12306942790746689,
-0.004165018443018198,
0.07975677400827408,
0.3392668664455414,
0.09745432436466217,
-0.26034894585609436,
-0.06942562758922577,
0.021250449120998383,
-0.043234068900346756,
-0.08263136446475983,
-0.024546584114432335,
0.12544602155685425,
-0.11607944965362549,
0.05769921466708183,
-0.05916932597756386,
0.10121136903762817,
-0.008884384296834469,
0.02402435429394245,
0.07630801200866699,
0.13403981924057007,
0.005061781033873558,
0.05081195384263992,
-0.24505296349525452,
0.2153661698102951,
0.040125522762537,
0.09204008430242538,
-0.05711619555950165,
0.05098825693130493,
0.017709314823150635,
-0.027584346011281013,
0.09438958019018173,
-0.011435536667704582,
-0.05381416156888008,
-0.12173335254192352,
-0.057628050446510315,
0.005683042109012604,
0.20351196825504303,
-0.0798400416970253,
0.13632656633853912,
-0.04966017231345177,
-0.012621545232832432,
0.0027025053277611732,
0.0017000003717839718,
-0.1461838334798813,
-0.09827875345945358,
0.06909152120351791,
0.01805444061756134,
0.08414370566606522,
-0.08609632402658463,
-0.05610358342528343,
-0.043941523879766464,
0.2738161087036133,
-0.13348184525966644,
-0.033456526696681976,
-0.13724488019943237,
0.06148410588502884,
0.14899912476539612,
-0.0412675216794014,
0.052752234041690826,
-0.003438582643866539,
0.12211941927671432,
0.027862997725605965,
0.013183188624680042,
0.09129247069358826,
-0.06784988939762115,
-0.23785603046417236,
-0.02620498463511467,
0.172914057970047,
-0.02419835515320301,
0.016659116372466087,
0.008414867334067822,
-0.0065610078163445,
0.0020546684972941875,
-0.09805246442556381,
0.05261022970080376,
0.07827125489711761,
0.055700644850730896,
0.08026350289583206,
-0.04108482226729393,
0.00514317536726594,
-0.10687727481126785,
-0.06586825847625732,
0.06671936810016632,
0.28808173537254333,
-0.07659803330898285,
0.0013879293110221624,
0.050631891936063766,
-0.07055560499429703,
-0.14192184805870056,
0.021647930145263672,
0.07358428835868835,
0.0523986853659153,
-0.013950576074421406,
-0.13139717280864716,
-0.047480642795562744,
0.12501563131809235,
-0.02663250081241131,
0.16494624316692352,
-0.2979636788368225,
-0.11465005576610565,
0.004323148634284735,
0.047321438789367676,
-0.021839002147316933,
-0.1984694004058838,
-0.09426076710224152,
-0.02232392318546772,
-0.1515912115573883,
0.021981053054332733,
-0.05257680267095566,
0.13102568686008453,
-0.002107745036482811,
-0.02362644113600254,
0.02605847641825676,
-0.05312993377447128,
0.18484769761562347,
-0.003611423308029771,
0.04316216707229614,
-0.003195863449946046,
0.07807620614767075,
0.06738589704036713,
-0.08840151876211166,
0.03686797246336937,
-0.07352358847856522,
0.03772713989019394,
-0.16029168665409088,
-0.010823537595570087,
-0.07071404159069061,
0.024353258311748505,
-0.06431875377893448,
-0.02221747487783432,
-0.03436822444200516,
0.07615895569324493,
0.05178254097700119,
0.021813932806253433,
0.0565536767244339,
-0.0625198557972908,
0.14318910241127014,
0.21417118608951569,
0.07983628660440445,
-0.01939312368631363,
-0.09689408540725708,
0.025119230151176453,
-0.020015791058540344,
0.05024239420890808,
-0.13061203062534332,
0.0581049770116806,
0.1398351788520813,
0.045792147517204285,
0.16584716737270355,
0.04437737539410591,
-0.13564977049827576,
0.020016031339764595,
0.07840721309185028,
-0.09401850402355194,
-0.13161025941371918,
-0.005206243135035038,
0.019870959222316742,
-0.12742041051387787,
0.042966797947883606,
0.13846130669116974,
-0.0041034407913684845,
-0.01735319383442402,
0.01075497642159462,
0.07165718823671341,
-0.04566275328397751,
0.18734456598758698,
0.016361841931939125,
0.11700087785720825,
-0.0623757429420948,
0.1287555694580078,
0.027645763009786606,
-0.1505725085735321,
0.04290122538805008,
0.04666111618280411,
-0.05788209289312363,
-0.0389910563826561,
-0.062455806881189346,
-0.018060563132166862,
-0.005070510786026716,
-0.08360843360424042,
-0.07376113533973694,
-0.11847749352455139,
0.03431215137243271,
0.08093664050102234,
0.021057557314634323,
0.07484880089759827,
0.03661962226033211,
0.008372556418180466,
-0.12649746239185333,
0.06283427774906158,
0.09059548377990723,
0.0473313182592392,
-0.13537587225437164,
0.07113084942102432,
0.0014476376818493009,
0.014564123004674911,
0.011675850488245487,
-0.00757858669385314,
-0.12101580202579498,
0.002470254199579358,
-0.09888621419668198,
-0.037381984293460846,
-0.052244313061237335,
-0.04071718454360962,
0.03438428044319153,
-0.031141232699155807,
-0.05493563786149025,
0.03934812545776367,
-0.09710398316383362,
-0.11160188913345337,
-0.014523237012326717,
0.05916967988014221,
-0.1144586130976677,
-0.009772012010216713,
0.044427692890167236,
-0.16044147312641144,
0.08872333914041519,
0.01323186606168747,
0.045564357191324234,
-0.008205755613744259,
-0.03594055399298668,
-0.00047764889313839376,
0.027388056740164757,
-0.004339014645665884,
0.03582990542054176,
-0.18283255398273468,
0.012262863107025623,
-0.028546441346406937,
0.016142189502716064,
0.003997839987277985,
0.049293890595436096,
-0.1370609551668167,
-0.025716213509440422,
-0.04758115112781525,
-0.002945003332570195,
-0.05119563266634941,
0.044781703501939774,
0.07940181344747543,
0.05386995151638985,
0.1433345377445221,
-0.05352497100830078,
0.03378370776772499,
-0.1749359667301178,
0.020044097676873207,
-0.038307320326566696,
-0.022392531856894493,
0.022875383496284485,
0.00898805819451809,
0.08509993553161621,
-0.0660133883357048,
0.06133679300546646,
-0.07335219532251358,
0.12510575354099274,
0.04052995890378952,
-0.16756030917167664,
-0.045110899955034256,
0.03221789002418518,
0.14203204214572906,
0.042077865451574326,
-0.012271515093743801,
0.03231363743543625,
-0.013232172466814518,
0.025995677337050438,
0.01560264267027378,
0.16183888912200928,
0.17309993505477905,
0.04766112565994263,
0.11178531497716904,
0.06471408903598785,
-0.12679290771484375,
-0.11435680091381073,
0.09371216595172882,
-0.026687288656830788,
0.14584320783615112,
-0.06090521067380905,
0.10405918955802917,
0.17471227049827576,
-0.1944739818572998,
0.06418467313051224,
-0.04250901937484741,
-0.07293593138456345,
-0.09484393149614334,
-0.04537511616945267,
-0.06471379101276398,
-0.14026238024234772,
0.017799846827983856,
-0.09501215070486069,
0.08639204502105713,
0.06720095872879028,
0.028034470975399017,
-0.0005421272944658995,
0.06515878438949585,
-0.021042779088020325,
-0.036453135311603546,
0.12438567727804184,
0.0022586171980947256,
0.0011540238047018647,
0.015426085330545902,
-0.09357324242591858,
0.05209769308567047,
-0.01337157841771841,
0.11189872026443481,
0.030562615022063255,
-0.021841417998075485,
0.0562707856297493,
0.010894503444433212,
-0.09220033884048462,
0.012003745883703232,
-0.0011003840481862426,
0.05440569296479225,
0.13589218258857727,
0.04200201854109764,
-0.026956290006637573,
-0.02653118595480919,
0.21509774029254913,
-0.10205776989459991,
-0.018073637038469315,
-0.15718674659729004,
0.2454432249069214,
0.058208249509334564,
0.012059501372277737,
-0.009992875158786774,
-0.13523875176906586,
0.009843241423368454,
0.17141461372375488,
0.08730947226285934,
-0.00938490778207779,
-0.01930948905646801,
0.02341899462044239,
-0.012674601748585701,
-0.05219918116927147,
0.07178927212953568,
0.09840822964906693,
0.002933985088020563,
-0.019857307896018028,
0.01909438520669937,
0.005413094535470009,
-0.056669194251298904,
-0.042628783732652664,
0.07773043215274811,
-0.019831733778119087,
-0.0021588995587080717,
-0.015134366229176521,
0.09697436541318893,
-0.014201429672539234,
-0.17268802225589752,
0.10031628608703613,
-0.12819282710552216,
-0.1719413846731186,
-0.028346268460154533,
0.053329285234212875,
0.004492165055125952,
0.10727346688508987,
-0.012401873245835304,
-0.03151509165763855,
0.1420651078224182,
-0.0006105824722908437,
0.008312258869409561,
-0.12759317457675934,
0.05697441101074219,
-0.0639941617846489,
0.1896301954984665,
-0.020445754751563072,
0.023770678788423538,
0.1508461833000183,
0.013164815492928028,
-0.14860239624977112,
0.04173198714852333,
0.05674317851662636,
-0.10282681882381439,
0.056461915373802185,
0.16560271382331848,
-0.00100870116148144,
0.07842881232500076,
0.042790260165929794,
-0.10175938159227371,
-0.017633438110351562,
-0.036161042749881744,
-0.00448276661336422,
-0.0864839181303978,
-0.004946678411215544,
-0.0581875741481781,
0.13437646627426147,
0.18696673214435577,
-0.07382100820541382,
0.025678521022200584,
-0.05455086752772331,
0.03678732365369797,
0.02772831730544567,
0.12139122933149338,
-0.04490029439330101,
-0.2050289511680603,
0.05962574481964111,
-0.007817521691322327,
-0.014710567891597748,
-0.23656338453292847,
-0.07094547152519226,
0.05624737590551376,
-0.01958896778523922,
-0.03983897715806961,
0.10744515806436539,
0.056930601596832275,
0.04589083790779114,
-0.04298165813088417,
-0.19820404052734375,
-0.0278934296220541,
0.17927028238773346,
-0.15293826162815094,
-0.0678924098610878
] |
null | null | transformers | ---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
## Model description
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - FR dataset.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.495 | 0.16 | 500 | 3.3883 | 1.0 |
| 2.9095 | 0.32 | 1000 | 2.9152 | 1.0000 |
| 1.8434 | 0.49 | 1500 | 1.0473 | 0.7446 |
| 1.4298 | 0.65 | 2000 | 0.5729 | 0.5130 |
| 1.1937 | 0.81 | 2500 | 0.3795 | 0.3450 |
| 1.1248 | 0.97 | 3000 | 0.3321 | 0.3052 |
| 1.0835 | 1.13 | 3500 | 0.3038 | 0.2805 |
| 1.0479 | 1.3 | 4000 | 0.2910 | 0.2689 |
| 1.0413 | 1.46 | 4500 | 0.2798 | 0.2593 |
| 1.014 | 1.62 | 5000 | 0.2727 | 0.2512 |
| 1.004 | 1.78 | 5500 | 0.2646 | 0.2471 |
| 0.9949 | 1.94 | 6000 | 0.2619 | 0.2457 |
It achieves its best result on the validation set at step 6000:
- Loss: 0.2619
- Wer: 0.2457
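The Wer values above are word error rates. As a hedged illustration (not the evaluation script used for this card), WER over a batch of transcriptions can be computed with the `evaluate` library and hypothetical sentences:

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")

# Hypothetical model outputs and gold transcriptions.
predictions = ["le chat dort sur le canapé"]
references = ["le chat dort sur le canapé rouge"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # one missing word out of seven -> about 0.1429
```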
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on `mozilla-foundation/common_voice_7` with split `test`
```bash
python eval.py --model_id Plim/xls-r-300m-fr --dataset mozilla-foundation/common_voice_7_0 --config fr --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id Plim/xls-r-300m-fr --dataset speech-recognition-community-v2/dev_data --config fr --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
| {"language": ["fr"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_7_0", "generated_from_trainer", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_7_0"], "model-index": [{"name": "XLS-R-300M - French", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "fr"}, "metrics": [{"type": "wer", "value": 24.56, "name": "Test WER"}, {"type": "cer", "value": 7.3, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "fr"}, "metrics": [{"type": "wer", "value": 63.62, "name": "Test WER"}, {"type": "cer", "value": 17.2, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "fr"}, "metrics": [{"type": "wer", "value": 66.45, "name": "Test WER"}]}]}]} | automatic-speech-recognition | Plim/xls-r-300m-fr | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_7_0",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"fr",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fr #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
---
Model description
-----------------
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_7\_0 - FR dataset.
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 2.0
* mixed\_precision\_training: Native AMP
### Training results
It achieves its best result at step 6000 on the validation set:
* Loss: 0.2619
* Wer: 0.2457
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
### Evaluation Commands
1. To evaluate on 'mozilla-foundation/common\_voice\_7' with split 'test'
2. To evaluate on 'speech-recognition-community-v2/dev\_data'
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 2.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nIt achieves the best result on STEP 6000 on the validation set:\n\n\n* Loss: 0.2619\n* Wer: 0.2457",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fr #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 2.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results\n\n\n\nIt achieves the best result on STEP 6000 on the validation set:\n\n\n* Loss: 0.2619\n* Wer: 0.2457",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0",
"### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
111,
159,
32,
39,
57
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fr #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 2.0\n* mixed\\_precision\\_training: Native AMP### Training results\n\n\n\nIt achieves the best result on STEP 6000 on the validation set:\n\n\n* Loss: 0.2619\n* Wer: 0.2457### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0### Evaluation Commands\n\n\n1. To evaluate on 'mozilla-foundation/common\\_voice\\_7' with split 'test'\n2. To evaluate on 'speech-recognition-community-v2/dev\\_data'"
] | [
-0.13069859147071838,
0.10591106116771698,
-0.006190532352775335,
0.06644678860902786,
0.08422137051820755,
0.031233545392751694,
0.01707153581082821,
0.17761115729808807,
-0.052214477211236954,
0.10755587369203568,
0.06849461793899536,
0.10647522658109665,
0.08518850058317184,
0.1315607875585556,
-0.02718825824558735,
-0.20512309670448303,
0.025151625275611877,
-0.04913864657282829,
-0.06485677510499954,
0.11455004662275314,
0.09314917773008347,
-0.09514287114143372,
0.02503039501607418,
0.01745316945016384,
-0.05198105052113533,
0.0015528686344623566,
-0.023120582103729248,
-0.012081864289939404,
0.06697491556406021,
0.020326489582657814,
0.044968102127313614,
0.027806639671325684,
0.025524664670228958,
-0.25283223390579224,
-0.0033904362935572863,
0.09254402667284012,
0.03631136193871498,
0.05804792791604996,
0.12017317116260529,
-0.05601070448756218,
0.10187769681215286,
-0.09136112034320831,
0.0037138285115361214,
0.07835910469293594,
-0.11184556037187576,
-0.2352338582277298,
-0.12706221640110016,
0.06328427046537399,
0.13487794995307922,
0.06290242075920105,
-0.04572059586644173,
0.03974718227982521,
-0.09338466823101044,
0.0821637287735939,
0.21688833832740784,
-0.25909677147865295,
-0.049531854689121246,
0.007230700924992561,
0.03197479248046875,
-0.01694006845355034,
-0.1208975687623024,
-0.011409863829612732,
0.03194299712777138,
-0.010884392075240612,
0.05025099590420723,
0.0022293732035905123,
0.06777889281511307,
0.015629854053258896,
-0.1340136080980301,
-0.0904431864619255,
0.1245400533080101,
0.08768247067928314,
-0.056873586028814316,
-0.09649074077606201,
-0.015707572922110558,
-0.22044305503368378,
-0.03375693038105965,
0.025576965883374214,
0.021619394421577454,
-0.013065850362181664,
-0.0414111502468586,
0.04298892617225647,
-0.05799213796854019,
-0.08916418254375458,
0.06794839352369308,
0.10206248611211777,
0.040586572140455246,
-0.03187328577041626,
0.04399169981479645,
0.13121436536312103,
-0.006672238931059837,
-0.1313861608505249,
-0.08983048796653748,
0.002648907480761409,
-0.15573354065418243,
-0.01991884410381317,
-0.0006043507601134479,
0.04935388267040253,
0.04960284009575844,
0.16528324782848358,
0.004605901427567005,
0.0811004638671875,
0.045400381088256836,
0.02957715094089508,
-0.06921187043190002,
0.16283532977104187,
-0.043134406208992004,
-0.11180156469345093,
-0.05090651661157608,
0.12505146861076355,
-0.014689845964312553,
0.013658471405506134,
0.016968758776783943,
0.017453238368034363,
0.11877376586198807,
0.08294704556465149,
0.025700442492961884,
0.03472791984677315,
-0.10334479808807373,
-0.003964679315686226,
0.022089265286922455,
-0.14518922567367554,
0.05325518548488617,
0.05348515510559082,
-0.10757555067539215,
-0.022901054471731186,
0.01212626788765192,
-0.02559509128332138,
-0.0490855947136879,
0.0968121662735939,
-0.047059789299964905,
-0.011443354189395905,
-0.09822109341621399,
-0.1044476181268692,
0.033704642206430435,
-0.04280657693743706,
-0.04364838823676109,
-0.05739998817443848,
-0.1405460685491562,
-0.08187363296747208,
0.05428597331047058,
-0.07606106996536255,
-0.021349085494875908,
-0.06103429198265076,
-0.10909945517778397,
0.037896085530519485,
-0.019647520035505295,
0.12839320302009583,
-0.053257428109645844,
0.08568018674850464,
0.03970552608370781,
0.05024412274360657,
0.13548798859119415,
0.04070014879107475,
-0.02212001197040081,
0.08304659277200699,
-0.14756889641284943,
0.13735371828079224,
-0.12719976902008057,
0.0823310986161232,
-0.15436428785324097,
-0.08287814259529114,
-0.02722659893333912,
-0.002895072568207979,
0.11153247207403183,
0.15398694574832916,
-0.16999278962612152,
-0.06131727248430252,
0.15873834490776062,
-0.06279751658439636,
-0.09678279608488083,
0.11610458791255951,
-0.0034352955408394337,
-0.0070294286124408245,
0.03647322952747345,
0.1314706951379776,
0.16120783984661102,
-0.08294650912284851,
-0.05406719446182251,
-0.046853259205818176,
0.02951587177813053,
0.07681335508823395,
0.06410031020641327,
-0.057347435504198074,
0.03848206251859665,
0.0021425571758300066,
-0.06536193192005157,
0.010726156644523144,
-0.0762968584895134,
-0.07964836061000824,
-0.029172617942094803,
-0.053836580365896225,
0.01836102455854416,
0.04562506824731827,
0.004883162677288055,
-0.08191995322704315,
-0.13794253766536713,
-0.03580159693956375,
0.11525344848632812,
-0.06809260696172714,
0.01411812286823988,
-0.12067466974258423,
0.0919053703546524,
-0.029922451823949814,
0.03638697788119316,
-0.16487263143062592,
-0.015557555481791496,
0.04180465638637543,
-0.06620069593191147,
-0.05403197556734085,
-0.01686341129243374,
0.08119023591279984,
0.010903884656727314,
-0.018955059349536896,
-0.06783504039049149,
-0.023414412513375282,
-0.01628420688211918,
-0.04293794929981232,
-0.22289182245731354,
-0.10300598293542862,
-0.010218203067779541,
0.17966632544994354,
-0.24313059449195862,
0.0031692825723439455,
0.10552257299423218,
0.11850414425134659,
0.0036961755249649286,
-0.0473911352455616,
0.019066326320171356,
0.03610502928495407,
-0.02686861716210842,
-0.07184310257434845,
0.026991818100214005,
-0.0008499834220856428,
-0.11089278757572174,
0.014090456999838352,
-0.10794635117053986,
0.010503449477255344,
0.06571187824010849,
0.032565437257289886,
-0.08452171832323074,
-0.020868713036179543,
-0.06959477066993713,
-0.043952617794275284,
-0.029703805223107338,
-0.06036732718348503,
0.0859544575214386,
0.05178564414381981,
0.1109379455447197,
-0.07437880337238312,
-0.074199378490448,
0.020802447572350502,
0.010360009036958218,
0.007011390291154385,
0.1895623505115509,
-0.0011577127734199166,
-0.07946982234716415,
0.06184813752770424,
0.020488841459155083,
-0.03441472724080086,
0.12451612949371338,
-0.06752625107765198,
-0.09053383767604828,
-0.043280914425849915,
0.08985096216201782,
0.06315082311630249,
0.0920303538441658,
-0.13529115915298462,
-0.005603306461125612,
0.02962115965783596,
0.03895905241370201,
0.015162095427513123,
-0.1861885040998459,
0.018346361815929413,
0.041818853467702866,
-0.09004636108875275,
-0.038562845438718796,
0.007727878168225288,
0.0005454238853417337,
0.07665470242500305,
0.0016137890052050352,
-0.04784445837140083,
-0.024107221513986588,
-0.053187090903520584,
-0.08780010789632797,
0.16089887917041779,
-0.08062457293272018,
-0.1300658881664276,
-0.13663682341575623,
-0.004111361224204302,
-0.014016378670930862,
-0.02790127880871296,
0.03870050981640816,
-0.08948452025651932,
-0.0659172311425209,
-0.07389572262763977,
-0.017290135845541954,
-0.017837945371866226,
-0.0022589608561247587,
0.03924892842769623,
0.007234467659145594,
0.11030185967683792,
-0.10598479211330414,
0.004829676356166601,
-0.0016495330492034554,
-0.04461928457021713,
0.0147336320951581,
0.020742136985063553,
0.09802722185850143,
0.12971921265125275,
0.050760071724653244,
0.05523570254445076,
-0.015322142280638218,
0.1702485829591751,
-0.1268419325351715,
0.001692369463853538,
0.10409339517354965,
0.03362782299518585,
0.06615323573350906,
0.1185929924249649,
0.015509459190070629,
-0.08385404199361801,
0.012120651081204414,
0.059629663825035095,
-0.016125081107020378,
-0.23929041624069214,
0.00286399619653821,
-0.08530310541391373,
0.007822495885193348,
0.09425493329763412,
0.053850289434194565,
0.0002564371097832918,
0.0036229686811566353,
-0.012261811643838882,
-0.02952658385038376,
0.05710773542523384,
0.061715587973594666,
0.025535082444548607,
0.04929457977414131,
0.09049785137176514,
-0.05171721428632736,
0.008371221832931042,
0.03374190628528595,
0.008935400284826756,
0.2409949004650116,
-0.03490861505270004,
0.19807328283786774,
0.10107304900884628,
0.1489148736000061,
-0.016722414642572403,
0.0651298314332962,
-0.010817013680934906,
0.011292262002825737,
0.04625384509563446,
-0.05603030323982239,
-0.04288448765873909,
0.0266436580568552,
0.0818772166967392,
0.03145570680499077,
-0.10025815665721893,
0.025306643918156624,
0.08208786696195602,
0.33552443981170654,
0.11048724502325058,
-0.27500849962234497,
-0.06981702893972397,
0.015927590429782867,
-0.058154135942459106,
-0.03197362646460533,
-0.0277544017881155,
0.10923057049512863,
-0.09254702180624008,
0.08569040149450302,
-0.06145913153886795,
0.09138582646846771,
-0.0813523381948471,
0.0040538557805120945,
0.08678543567657471,
0.10343873500823975,
0.03153315186500549,
0.057694997638463974,
-0.22489620745182037,
0.2384013682603836,
-0.004704366438090801,
0.1055009663105011,
-0.01967736706137657,
0.05434378609061241,
0.044490616768598557,
-0.015216520987451077,
0.10293581336736679,
0.00433281110599637,
-0.03207022324204445,
-0.16559551656246185,
-0.0897006094455719,
-0.008099828846752644,
0.11253632605075836,
-0.08065690845251083,
0.1390162855386734,
-0.052807532250881195,
-0.06226882338523865,
0.009112992323935032,
-0.008252360858023167,
-0.12333518266677856,
-0.09356827288866043,
0.07470772415399551,
-0.015686316415667534,
0.09431187808513641,
-0.08107980340719223,
-0.05501415207982063,
-0.06005028262734413,
0.22062620520591736,
-0.1585981547832489,
-0.02986479178071022,
-0.13239894807338715,
0.05565696954727173,
0.15723825991153717,
-0.06408756226301193,
0.027193637564778328,
0.006608281750231981,
0.11840831488370895,
0.0575137697160244,
0.0070200772024691105,
0.08212129026651382,
-0.07729706913232803,
-0.217055544257164,
-0.03358813375234604,
0.15395887196063995,
0.018913710489869118,
0.04090297967195511,
0.01527435053139925,
-0.0009540411992929876,
-0.0027855178341269493,
-0.09166504442691803,
0.07799205183982849,
0.07309725135564804,
-0.0021220315247774124,
0.042308490723371506,
-0.013454982079565525,
0.03331602364778519,
-0.0900040939450264,
-0.039912279695272446,
0.07118556648492813,
0.28957831859588623,
-0.07994255423545837,
0.02010110206902027,
-0.0044817631132900715,
-0.07519488036632538,
-0.1423456072807312,
-0.019977951422333717,
0.10069169849157333,
0.03997652605175972,
-0.04925704747438431,
-0.1197798103094101,
-0.012085271999239922,
0.09922907501459122,
-0.022880004718899727,
0.08870115876197815,
-0.3288417458534241,
-0.12298239767551422,
0.014662988483905792,
0.038893163204193115,
-0.02672453597187996,
-0.17485423386096954,
-0.08504759520292282,
-0.027167176827788353,
-0.1388619989156723,
0.008171013556420803,
0.018738659098744392,
0.1284046769142151,
-0.004643160849809647,
-0.027830056846141815,
0.014012147672474384,
-0.050159357488155365,
0.1626034677028656,
0.03908969834446907,
0.0523698627948761,
0.004580195061862469,
0.08925116062164307,
0.04041558504104614,
-0.10726236552000046,
0.043899547308683395,
-0.05899644270539284,
0.02239294722676277,
-0.17115795612335205,
-0.013930992223322392,
-0.08056127279996872,
0.014876192435622215,
-0.055613648146390915,
0.01692524552345276,
-0.011851917020976543,
0.06761632114648819,
0.08831007033586502,
0.02996569499373436,
0.042379267513751984,
-0.06428268551826477,
0.09615806490182877,
0.11358699947595596,
0.09428377449512482,
0.030059432610869408,
-0.14083236455917358,
0.012634391896426678,
0.02742951177060604,
0.029234344139695168,
-0.1409270167350769,
0.0685177817940712,
0.14988377690315247,
0.04166599363088608,
0.16186591982841492,
0.04179166629910469,
-0.11363720893859863,
0.007890874519944191,
0.09552629292011261,
-0.06455431133508682,
-0.1395021378993988,
0.019815992563962936,
-0.019939515739679337,
-0.1031365618109703,
0.0029830485582351685,
0.1118335947394371,
-0.013764950446784496,
0.003744778223335743,
0.010586262680590153,
0.06227008253335953,
-0.03703779727220535,
0.22898852825164795,
-0.009503360837697983,
0.12540487945079803,
-0.08838743716478348,
0.08671725541353226,
0.023100517690181732,
-0.08852440118789673,
0.013162223622202873,
0.07022985070943832,
-0.054062265902757645,
-0.03146389126777649,
-0.042881663888692856,
0.048621173948049545,
0.06321840733289719,
-0.057361554354429245,
-0.11365658789873123,
-0.1246236190199852,
0.09019923210144043,
0.06800724565982819,
0.025359194725751877,
0.08341223001480103,
0.013345642015337944,
0.007982979528605938,
-0.090650275349617,
0.0788428783416748,
0.15206411480903625,
0.046928320080041885,
-0.12210512906312943,
0.09438877552747726,
-0.0020922012627124786,
0.00015264465764630586,
0.021854152902960777,
-0.004769726190716028,
-0.10035454481840134,
0.02051444537937641,
-0.0959504172205925,
-0.0054040877148509026,
-0.05418785288929939,
-0.02033119462430477,
0.020747622475028038,
-0.012353554368019104,
-0.056801505386829376,
0.037114467471838,
-0.12528470158576965,
-0.07997078448534012,
-0.00024167659285012633,
0.0629613846540451,
-0.11299770325422287,
-0.007369144354015589,
0.04729199782013893,
-0.1511368304491043,
0.07856440544128418,
0.03589383140206337,
0.0289462860673666,
0.008448274806141853,
-0.0835619643330574,
-0.005603536497801542,
0.027429169043898582,
-0.011330203153192997,
0.054150741547346115,
-0.17301680147647858,
0.03298391029238701,
-0.04262056201696396,
-0.01256596576422453,
-0.0036792857572436333,
-0.02112630568444729,
-0.13747793436050415,
-0.009819588623940945,
-0.0064763957634568214,
-0.017772801220417023,
-0.05502111837267876,
0.05591851845383644,
0.08391982316970825,
0.041335027664899826,
0.1605496108531952,
-0.05725930258631706,
0.03479241579771042,
-0.20519167184829712,
-0.00811921339482069,
-0.006827810779213905,
-0.07233119010925293,
0.006000963505357504,
-0.01823640801012516,
0.10655613988637924,
-0.06928414106369019,
0.05259348824620247,
-0.036853719502687454,
0.03988318890333176,
0.023613061755895615,
-0.13562802970409393,
-0.005070567596703768,
0.04461998492479324,
0.08388844132423401,
0.04013790190219879,
-0.010800319723784924,
0.06473618000745773,
-0.02408124879002571,
0.04689457640051842,
0.06721031665802002,
0.16489246487617493,
0.18758197128772736,
0.05500584468245506,
0.09534870833158493,
0.03417017683386803,
-0.13932566344738007,
-0.1071653887629509,
0.12011275440454483,
-0.11635199189186096,
0.16975609958171844,
-0.0555989108979702,
0.1264769732952118,
0.09689496457576752,
-0.18878504633903503,
0.08176914602518082,
-0.06340682506561279,
-0.08511341363191605,
-0.09510421007871628,
-0.09582725167274475,
-0.0625847727060318,
-0.14676229655742645,
0.02721894532442093,
-0.08526016771793365,
0.0808444693684578,
0.06139737740159035,
0.044843945652246475,
0.02448843978345394,
0.0820639505982399,
0.021466802805662155,
-0.002135374117642641,
0.1359691172838211,
-0.016663501039147377,
-0.026516979560256004,
0.00851372629404068,
-0.07396185398101807,
0.04221628978848457,
-0.03185168653726578,
0.08603590726852417,
-0.009396410547196865,
-0.06068895757198334,
0.057065192610025406,
0.022930331528186798,
-0.10765795409679413,
0.013209715485572815,
-0.01596761867403984,
0.05073421075940132,
0.07208598405122757,
0.05626443400979042,
-0.0057455855421721935,
-0.018332188948988914,
0.234176367521286,
-0.07283181697130203,
-0.0329524464905262,
-0.16344018280506134,
0.2067563235759735,
0.04209119826555252,
-0.004202953074127436,
0.011499063111841679,
-0.11083461344242096,
0.006638025399297476,
0.1808803230524063,
0.07687795162200928,
0.010191504843533039,
-0.02051524631679058,
-0.012666561640799046,
-0.009213889949023724,
-0.003016483271494508,
0.06311216205358505,
0.07962865382432938,
0.042635608464479446,
-0.023010455071926117,
0.03096570260822773,
-0.024280970916152,
-0.06802864372730255,
0.00336641538888216,
0.12729217112064362,
0.0026187244802713394,
0.017148727551102638,
-0.03390679880976677,
0.10022282600402832,
-0.04582339525222778,
-0.17274601757526398,
0.08390918374061584,
-0.147968128323555,
-0.17780856788158417,
-0.04790822044014931,
0.06033952534198761,
0.029682757332921028,
0.09377136081457138,
-0.010065415874123573,
-0.04900936782360077,
0.11978324502706528,
0.0011570944916456938,
-0.02756413258612156,
-0.12496302276849747,
0.045300066471099854,
-0.08230715990066528,
0.20682795345783234,
-0.02961687743663788,
-0.009780674241483212,
0.15058860182762146,
-0.001461517415009439,
-0.15021969377994537,
0.03778760880231857,
0.06437305361032486,
-0.14087343215942383,
0.0571373850107193,
0.20376916229724884,
-0.01244609709829092,
0.11010874807834625,
0.0391504243016243,
-0.06861183792352676,
-0.003187354886904359,
-0.05604840815067291,
0.0016109456773847342,
-0.09354640543460846,
-0.010414071381092072,
-0.034563444554805756,
0.10639569908380508,
0.23892256617546082,
-0.07268165051937103,
0.034589529037475586,
-0.07732909917831421,
0.035658709704875946,
-0.0101255401968956,
0.10555382072925568,
-0.03567836806178093,
-0.23294667899608612,
0.04614660143852234,
0.024369806051254272,
0.03304232284426689,
-0.1868038773536682,
-0.09291616827249527,
0.04238848015666008,
-0.05206643417477608,
-0.0482732355594635,
0.1369803100824356,
0.06704986840486526,
0.06959687918424606,
-0.03568854182958603,
-0.10696130990982056,
-0.018350526690483093,
0.16710473597049713,
-0.13267552852630615,
-0.061986327171325684
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
#
This model is a fine-tuned version of a local checkpoint (`./checkpoint-6000`) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - FR dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2619
- Wer: 0.2457
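As a usage illustration (an assumption, not part of the auto-generated card), the checkpoint can be run with plain greedy CTC decoding. The repository id `Plim/xls-r-300m-lm-fr` is taken from this card's metadata; despite the `-lm` suffix, the sketch below does not use an external language model, and the audio path is a placeholder for 16 kHz mono French speech.
```python
import soundfile as sf
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("Plim/xls-r-300m-lm-fr")
model = Wav2Vec2ForCTC.from_pretrained("Plim/xls-r-300m-lm-fr")

# "example_fr.wav" is a placeholder; the model expects 16 kHz mono audio.
speech, sampling_rate = sf.read("example_fr.wav")
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy (argmax) decoding of the CTC logits, without a language model.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```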
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.495 | 0.16 | 500 | 3.3883 | 1.0 |
| 2.9095 | 0.32 | 1000 | 2.9152 | 1.0000 |
| 1.8434 | 0.49 | 1500 | 1.0473 | 0.7446 |
| 1.4298 | 0.65 | 2000 | 0.5729 | 0.5130 |
| 1.1937 | 0.81 | 2500 | 0.3795 | 0.3450 |
| 1.1248 | 0.97 | 3000 | 0.3321 | 0.3052 |
| 1.0835 | 1.13 | 3500 | 0.3038 | 0.2805 |
| 1.0479 | 1.3 | 4000 | 0.2910 | 0.2689 |
| 1.0413 | 1.46 | 4500 | 0.2798 | 0.2593 |
| 1.014 | 1.62 | 5000 | 0.2727 | 0.2512 |
| 1.004 | 1.78 | 5500 | 0.2646 | 0.2471 |
| 0.9949 | 1.94 | 6000 | 0.2619 | 0.2457 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
| {"language": ["fr"], "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_7_0", "generated_from_trainer"], "model-index": [{"name": "", "results": []}]} | automatic-speech-recognition | Plim/xls-r-300m-lm-fr | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_7_0",
"generated_from_trainer",
"fr",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fr"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fr #endpoints_compatible #region-us
|
This model is a fine-tuned version of ./checkpoint-6000 on the MOZILLA-FOUNDATION/COMMON\_VOICE\_7\_0 - FR dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2619
* Wer: 0.2457
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 2.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.2.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 2.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fr #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 2.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] | [
62,
159,
4,
39
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fr #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 2.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.2.dev0\n* Tokenizers 0.11.0"
] | [
-0.13354748487472534,
0.05410636216402054,
-0.0031321390997618437,
0.033491816371679306,
0.1454644352197647,
0.0016416616272181273,
0.08356544375419617,
0.1395300030708313,
-0.11146029829978943,
0.08497711271047592,
0.09653787314891815,
0.09800130873918533,
0.07481744140386581,
0.09218282997608185,
-0.02248617820441723,
-0.3275825083255768,
0.024035215377807617,
0.018630599603056908,
-0.11906071752309799,
0.11438003927469254,
0.11488046497106552,
-0.1126536950469017,
0.01994781568646431,
0.03961353749036789,
-0.13689054548740387,
0.013266797177493572,
-0.018934525549411774,
-0.06092584878206253,
0.11772904545068741,
0.0419561043381691,
0.09451014548540115,
0.019185660406947136,
0.08581491559743881,
-0.2628604769706726,
0.015857810154557228,
0.06580114364624023,
0.05651186406612396,
0.06670428812503815,
0.11277240514755249,
-0.011814823374152184,
0.1347690373659134,
-0.07367493212223053,
0.04620186984539032,
0.05519728735089302,
-0.10819920897483826,
-0.3157089650630951,
-0.08253967016935349,
0.01597949117422104,
0.10840228199958801,
0.10372696816921234,
-0.03049324080348015,
0.06414889544248581,
-0.06876164674758911,
0.09433772414922714,
0.23711411654949188,
-0.21553345024585724,
-0.09173526614904404,
-0.030316833406686783,
0.04976595565676689,
0.028218258172273636,
-0.13124291598796844,
-0.021934915333986282,
0.0378681980073452,
0.03620598837733269,
0.08312516659498215,
0.014715073630213737,
-0.010522197932004929,
0.004686500411480665,
-0.13482913374900818,
-0.05668632313609123,
0.12371762841939926,
0.0857086256146431,
-0.04124901071190834,
-0.06542530655860901,
-0.017916565760970116,
-0.19049829244613647,
-0.05681261420249939,
0.018324483186006546,
0.020574426278471947,
-0.0355769507586956,
-0.09893887490034103,
0.0008186702616512775,
-0.0781191810965538,
-0.09722106158733368,
0.02687867544591427,
0.19301918148994446,
0.045476026833057404,
-0.03562764450907707,
0.004387982655316591,
0.0963083803653717,
0.04403824731707573,
-0.14400488138198853,
-0.019613252952694893,
0.05211573466658592,
-0.09016676247119904,
-0.019092202186584473,
-0.05018136277794838,
-0.00968908704817295,
0.002734103240072727,
0.1048867478966713,
-0.025770800188183784,
0.08292233198881149,
0.01733706332743168,
0.02544989623129368,
-0.10250663757324219,
0.18362745642662048,
-0.06917450577020645,
-0.011115934699773788,
-0.04583347588777542,
0.08666699379682541,
-0.04053621366620064,
-0.016854166984558105,
-0.05165654048323631,
0.004871436860412359,
0.1138206422328949,
0.0424787700176239,
-0.035445570945739746,
0.02093316800892353,
-0.05242906138300896,
-0.021889330819249153,
-0.04635007679462433,
-0.10955799371004105,
0.029039155691862106,
0.02671172097325325,
-0.09249677509069443,
0.03351183608174324,
-0.01187223196029663,
0.019594745710492134,
-0.02785572223365307,
0.09690525382757187,
-0.06646274030208588,
0.01600509323179722,
-0.10272249579429626,
-0.11404786258935928,
0.026324806734919548,
-0.03508276864886284,
0.011898675002157688,
-0.07009802013635635,
-0.102079376578331,
-0.05999035760760307,
0.051510121673345566,
-0.04767535254359245,
-0.060363855212926865,
-0.07547196745872498,
-0.08211682736873627,
0.046148575842380524,
-0.02773600071668625,
0.18209213018417358,
-0.05428434908390045,
0.11525323241949081,
0.039697613567113876,
0.033124759793281555,
0.027637064456939697,
0.07529785484075546,
-0.03527612239122391,
0.04127337038516998,
-0.10379131883382797,
0.07762131094932556,
-0.08873447775840759,
0.059824563562870026,
-0.13729728758335114,
-0.13036274909973145,
-0.03171771019697189,
0.0027033884543925524,
0.11962175369262695,
0.10230715572834015,
-0.15904712677001953,
-0.09328527748584747,
0.16922985017299652,
-0.06678971648216248,
-0.09629946202039719,
0.1301523745059967,
-0.017782479524612427,
0.01878191903233528,
0.04351118952035904,
0.15823456645011902,
0.10476142168045044,
-0.08132772892713547,
0.022781556472182274,
-0.05250843986868858,
0.1344214379787445,
0.031259678304195404,
0.09983915090560913,
-0.04285905137658119,
0.014832846820354462,
-0.0036211267579346895,
-0.022638093680143356,
0.07631835341453552,
-0.1123289167881012,
-0.0747130811214447,
-0.023500800132751465,
-0.07759670913219452,
0.003968480043113232,
0.06462244689464569,
0.042552363127470016,
-0.0995071530342102,
-0.11049408465623856,
0.03258504346013069,
0.1080322340130806,
-0.11361642926931381,
0.03033626079559326,
-0.08268691599369049,
0.043562836945056915,
-0.025517892092466354,
-0.013442914001643658,
-0.16633820533752441,
-0.013800048269331455,
0.028685247525572777,
-0.025453362613916397,
0.017943084239959717,
-0.014586208388209343,
0.08434820175170898,
0.04373808205127716,
-0.03981570899486542,
-0.06999139487743378,
-0.08583736419677734,
-0.012125949375331402,
-0.07339830696582794,
-0.20478060841560364,
-0.08255625516176224,
-0.01897469349205494,
0.1509622484445572,
-0.22531569004058838,
0.007543977815657854,
0.027940120548009872,
0.11020052433013916,
0.0194211695343256,
-0.0480135940015316,
-0.014422855339944363,
0.07660678774118423,
-0.02067389152944088,
-0.06667400896549225,
0.03654220700263977,
0.0088425287976861,
-0.1107952669262886,
0.030718915164470673,
-0.1093098372220993,
0.09503094106912613,
0.09710045158863068,
-0.058021482080221176,
-0.06944834440946579,
-0.06347070634365082,
-0.06719198822975159,
-0.058781180530786514,
-0.021905574947595596,
-0.011692583560943604,
0.19769451022148132,
0.013682936318218708,
0.1203296035528183,
-0.08612635731697083,
-0.03233029693365097,
0.03710683807730675,
0.002877624938264489,
-0.0014947260497137904,
0.13225020468235016,
0.06349125504493713,
-0.04458088427782059,
0.09136519581079483,
0.0531088262796402,
-0.08104036748409271,
0.15657415986061096,
-0.07084590941667557,
-0.12813445925712585,
-0.005346693564206362,
0.03282266855239868,
0.02653420902788639,
0.10578709840774536,
-0.18569421768188477,
-0.007988872937858105,
0.019725251942873,
0.040734920650720596,
0.04086102917790413,
-0.21957480907440186,
-0.007696136366575956,
0.04814628139138222,
-0.0720033049583435,
-0.04998951777815819,
-0.0037919627502560616,
-0.0031881327740848064,
0.08795355260372162,
0.00937646348029375,
-0.06280611455440521,
-0.01591498963534832,
-0.03225428983569145,
-0.08778859674930573,
0.18726180493831635,
-0.09661554545164108,
-0.1358029842376709,
-0.135182723402977,
-0.044025909155607224,
-0.010547110810875893,
-0.009238076396286488,
0.03998785838484764,
-0.11585889011621475,
-0.03200023993849754,
-0.05071970820426941,
0.0616145096719265,
-0.08197326958179474,
0.029311073943972588,
0.006570734083652496,
0.008129539899528027,
0.07920527458190918,
-0.10481864213943481,
0.020923523232340813,
-0.018085211515426636,
-0.051523417234420776,
0.017096057534217834,
0.008417369797825813,
0.09956333786249161,
0.16090328991413116,
0.030382128432393074,
0.029045449569821358,
-0.03542708978056908,
0.18017621338367462,
-0.11065059900283813,
-0.039604198187589645,
0.1290545016527176,
0.017507372424006462,
0.036514148116111755,
0.10607374459505081,
0.04987836629152298,
-0.0951015055179596,
0.03422042354941368,
0.048914987593889236,
-0.016562815755605698,
-0.23585137724876404,
-0.02076406590640545,
-0.07925614714622498,
-0.04817797616124153,
0.10990433394908905,
0.020652180537581444,
-0.0027686168905347586,
0.03496824949979782,
-0.023036805912852287,
-0.0021200652699917555,
0.009986917488276958,
0.06329348683357239,
0.06992635130882263,
0.04025479033589363,
0.11414022743701935,
-0.012670434080064297,
-0.04367682710289955,
0.01458161510527134,
-0.005272859241813421,
0.2786214351654053,
0.012146919034421444,
0.17318840324878693,
0.05521969497203827,
0.1608797162771225,
0.006182831712067127,
0.09182142466306686,
0.010047337040305138,
-0.03177591413259506,
0.023550424724817276,
-0.04923572018742561,
-0.04369550943374634,
0.03058592416346073,
0.08959305286407471,
0.04229629039764404,
-0.1451398730278015,
-0.03196229040622711,
0.006864896044135094,
0.36634939908981323,
0.06499435007572174,
-0.2999729812145233,
-0.09822843968868256,
-0.008617610670626163,
-0.09031187742948532,
-0.06327448040246964,
0.024974878877401352,
0.11934035271406174,
-0.09541841596364975,
0.0387677401304245,
-0.06651166081428528,
0.11138954758644104,
-0.03890508785843849,
0.0011914944043383002,
0.06576672196388245,
0.09001852571964264,
0.0022605264093726873,
0.07759701460599899,
-0.2754673361778259,
0.31728991866111755,
-0.02105935662984848,
0.08937521278858185,
-0.04343066364526749,
0.030018659308552742,
0.0417049415409565,
-0.02655690722167492,
0.04618076980113983,
-0.00680280989035964,
-0.0954737663269043,
-0.19685572385787964,
-0.05700695142149925,
0.03341131657361984,
0.15364225208759308,
-0.04537614807486534,
0.12826181948184967,
-0.03794151917099953,
0.000014066639778320678,
0.06300214678049088,
-0.07417497783899307,
-0.11648125946521759,
-0.09108059853315353,
0.019499896094202995,
0.027072131633758545,
0.10856164991855621,
-0.10851041227579117,
-0.11378566920757294,
-0.06051290035247803,
0.15900546312332153,
-0.053079430013895035,
0.002190947998315096,
-0.13630974292755127,
0.09903363883495331,
0.1628655344247818,
-0.06567024439573288,
0.05123230442404747,
0.01845588907599449,
0.1398531049489975,
0.04007875174283981,
0.00660674786195159,
0.10293761640787125,
-0.08283230662345886,
-0.18612928688526154,
-0.03828532621264458,
0.1657332479953766,
0.03817949444055557,
0.06586481630802155,
-0.012288452126085758,
0.016480183228850365,
-0.02625349536538124,
-0.07796263694763184,
0.047683235257864,
0.0003151069104205817,
-0.00229000230319798,
0.07318798452615738,
-0.038296110928058624,
0.03694703429937363,
-0.09050910174846649,
-0.06574099510908127,
0.15955154597759247,
0.27057597041130066,
-0.057919323444366455,
-0.016760988160967827,
0.014676214195787907,
-0.05272546038031578,
-0.14680063724517822,
0.04431024193763733,
0.15256233513355255,
0.05071115121245384,
-0.02200843021273613,
-0.2342689484357834,
0.034629106521606445,
0.08782826364040375,
-0.014955325052142143,
0.0866340771317482,
-0.3057085871696472,
-0.12765945494174957,
0.09750712662935257,
0.11928433924913406,
-0.01439397782087326,
-0.14394479990005493,
-0.056935425847768784,
-0.024560660123825073,
-0.09742490202188492,
0.05967200547456741,
-0.011085993610322475,
0.13318049907684326,
0.006320428568869829,
0.06240256130695343,
0.025318870320916176,
-0.050365619361400604,
0.16192837059497833,
-0.009880919009447098,
0.053491584956645966,
-0.00003310024112579413,
0.07006688416004181,
0.007439271546900272,
-0.04131665825843811,
-0.002316135447472334,
-0.04491369053721428,
0.021093033254146576,
-0.14448127150535583,
-0.03999970108270645,
-0.10188765823841095,
0.03111245669424534,
-0.03456827998161316,
-0.034460633993148804,
-0.011962995864450932,
0.03972527012228966,
0.06240438297390938,
0.0251518152654171,
0.12650230526924133,
-0.075499027967453,
0.1544622778892517,
0.06969979405403137,
0.10484001040458679,
-0.029905470088124275,
-0.06611783057451248,
-0.008082141168415546,
0.006194720510393381,
0.04993303492665291,
-0.10671529918909073,
0.031172648072242737,
0.15264353156089783,
0.05083637312054634,
0.15679460763931274,
0.06490979343652725,
-0.07696399837732315,
0.022823037579655647,
0.06617612391710281,
-0.05392720177769661,
-0.12460367381572723,
-0.02812894433736801,
0.06279602646827698,
-0.1374480128288269,
0.00913066603243351,
0.10135453939437866,
-0.057206351310014725,
-0.0012940071756020188,
-0.0025698174722492695,
0.0050154938362538815,
-0.07360593229532242,
0.22934359312057495,
0.02987455390393734,
0.08655941486358643,
-0.0878242626786232,
0.062387824058532715,
0.03539757803082466,
-0.14535531401634216,
0.007991807535290718,
0.07286384701728821,
-0.032692812383174896,
-0.0150718092918396,
0.007103951182216406,
0.05877932161092758,
0.0016551815206184983,
-0.06444035470485687,
-0.1212669163942337,
-0.15208688378334045,
0.08242315798997879,
0.09607837349176407,
0.03577734902501106,
0.028679290786385536,
-0.03792966529726982,
0.04511415585875511,
-0.12137337028980255,
0.06617487967014313,
0.09306728839874268,
0.06812886893749237,
-0.12667766213417053,
0.1682247668504715,
0.016077009961009026,
0.020924631506204605,
0.0053682173602283,
-0.02100498043000698,
-0.07997604459524155,
0.048215724527835846,
-0.12268621474504471,
-0.03544938564300537,
-0.05275549739599228,
-0.013440468348562717,
0.015354201197624207,
-0.07052602618932724,
-0.06818641722202301,
0.022893700748682022,
-0.1312161087989807,
-0.04934694245457649,
-0.006310494150966406,
0.06031368300318718,
-0.07785215973854065,
-0.017460601404309273,
0.05738231539726257,
-0.12141230702400208,
0.0824364721775055,
0.06939644366502762,
0.01443333551287651,
0.05775461345911026,
-0.12542667984962463,
0.006502772215753794,
0.04552053287625313,
0.004472564440220594,
0.02732979878783226,
-0.15546491742134094,
0.0028161501977592707,
-0.009274239651858807,
0.05173376575112343,
-0.011622193269431591,
0.013399609364569187,
-0.12973925471305847,
-0.06430868059396744,
-0.03431318700313568,
-0.060373641550540924,
-0.049003299325704575,
0.04904654994606972,
0.04592888802289963,
0.04771709814667702,
0.15136583149433136,
-0.07936283946037292,
0.054463859647512436,
-0.22009335458278656,
0.017046771943569183,
-0.041117310523986816,
-0.07605969905853271,
-0.040327318012714386,
-0.04722665250301361,
0.09475941210985184,
-0.06446930766105652,
0.07372192293405533,
-0.05883949249982834,
0.06427065283060074,
0.02921944297850132,
-0.13425981998443604,
0.043723441660404205,
0.043607115745544434,
0.2560865879058838,
0.07131865620613098,
-0.014584462158381939,
0.08604500442743301,
0.010880501009523869,
0.06319325417280197,
0.17542137205600739,
0.1569380760192871,
0.18100015819072723,
0.039814651012420654,
0.10967065393924713,
0.07706174999475479,
-0.11659209430217743,
-0.10092810541391373,
0.10566890984773636,
-0.019725853577256203,
0.12236592173576355,
-0.010246221907436848,
0.2279747575521469,
0.127731591463089,
-0.19271494448184967,
0.056517891585826874,
-0.029492205008864403,
-0.09193754941225052,
-0.09310086816549301,
-0.03742259368300438,
-0.07270210981369019,
-0.18795894086360931,
0.020736506208777428,
-0.11808420717716217,
0.05856136232614517,
0.06014113500714302,
0.043689895421266556,
0.01554038841277361,
0.14595770835876465,
0.0479465015232563,
-0.01852758601307869,
0.13073290884494781,
-0.005468488670885563,
0.00003603607910918072,
-0.06428927928209305,
-0.12694580852985382,
0.04987769201397896,
-0.0358244925737381,
0.06445812433958054,
-0.051263902336359024,
-0.11220910400152206,
0.05479014292359352,
0.008932005614042282,
-0.10895770788192749,
0.018515659496188164,
-0.011418811045587063,
0.071695476770401,
0.08806246519088745,
0.025436509400606155,
-0.0011892280308529735,
-0.022293519228696823,
0.2430645376443863,
-0.10020031780004501,
-0.05680646747350693,
-0.13238425552845,
0.234880730509758,
0.03611941635608673,
-0.007692995015531778,
0.018956243991851807,
-0.06687173992395401,
-0.010323692113161087,
0.17676715552806854,
0.10618460178375244,
-0.02093108370900154,
-0.02209511771798134,
0.014808210544288158,
-0.014996498823165894,
-0.05081313103437424,
0.09425017237663269,
0.1349712759256363,
0.021335015073418617,
-0.06473638862371445,
-0.055282432585954666,
-0.059365447610616684,
-0.04486260563135147,
-0.024955108761787415,
0.06638151407241821,
0.029242148622870445,
-0.012857590802013874,
-0.026132965460419655,
0.12188781052827835,
-0.0727723091840744,
-0.1101018562912941,
0.019950909540057182,
-0.1659739911556244,
-0.1852238029241562,
-0.04541761055588722,
0.02258877456188202,
0.04456718638539314,
0.06093459576368332,
-0.024293426424264908,
-0.01814253255724907,
0.10092920064926147,
0.009233329445123672,
-0.024148285388946533,
-0.13116279244422913,
0.09280066192150116,
-0.10849745571613312,
0.175288587808609,
-0.04642575606703758,
0.0205976702272892,
0.12319313734769821,
0.0755874291062355,
-0.0687255784869194,
0.05789537727832794,
0.07182338088750839,
-0.1338101476430893,
0.04747334495186806,
0.20838065445423126,
-0.04183148220181465,
0.1293485313653946,
0.028731929138302803,
-0.13869306445121765,
0.02364168129861355,
-0.08971265703439713,
-0.04095218703150749,
-0.060048554092645645,
-0.015728743746876717,
-0.05473221093416214,
0.12098149210214615,
0.23056088387966156,
-0.06701745092868805,
-0.020405033603310585,
-0.06523775309324265,
0.013960530050098896,
0.04509470984339714,
0.12130501866340637,
-0.059636279940605164,
-0.30770015716552734,
0.0036292686127126217,
0.005936866160482168,
-0.005486808717250824,
-0.2362436205148697,
-0.07990904152393341,
0.03980307653546333,
-0.07439621537923813,
-0.03941959887742996,
0.11492561548948288,
0.056288160383701324,
0.04421858862042427,
-0.0514674074947834,
-0.125130757689476,
-0.039869073778390884,
0.2016735076904297,
-0.17069202661514282,
-0.06459164619445801
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad_v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4285
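A minimal usage sketch with the `question-answering` pipeline is shown below (not part of the auto-generated card). The repository id `Plimpton/distilbert-base-uncased-finetuned-squad` comes from this card's metadata, and the question/context pair is made up.
```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="Plimpton/distilbert-base-uncased-finetuned-squad",
)

# Made-up example inputs; squad_v2-style models can also predict "no answer".
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="This checkpoint is distilbert-base-uncased fine-tuned on the SQuAD v2 dataset.",
)
print(result["answer"], round(result["score"], 3))
```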
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.5169 | 1.0 | 1642 | 1.6958 |
| 1.1326 | 2.0 | 3284 | 2.0009 |
| 0.8638 | 3.0 | 4926 | 2.4285 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2"], "model-index": [{"name": "distilbert-base-uncased-finetuned-squad", "results": []}]} | question-answering | Plimpton/distilbert-base-uncased-finetuned-squad | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad_v2",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-squad
=======================================
This model is a fine-tuned version of distilbert-base-uncased on the squad\_v2 dataset.
It achieves the following results on the evaluation set:
* Loss: 2.4285
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.12.5
* Pytorch 1.10.0+cu111
* Datasets 1.15.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
59,
98,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
-0.11196358501911163,
0.07828319072723389,
-0.002272425452247262,
0.12194211781024933,
0.1616610735654831,
0.022361472249031067,
0.10950581729412079,
0.11674146354198456,
-0.1102324053645134,
0.03171104937791824,
0.1389569193124771,
0.1585618555545807,
0.002093547023832798,
0.05661763995885849,
-0.043653927743434906,
-0.22572079300880432,
-0.01575384847819805,
0.05311654135584831,
-0.09860370308160782,
0.1468828320503235,
0.0845995619893074,
-0.1550334095954895,
0.08022013306617737,
0.003756915684789419,
-0.2086530178785324,
0.013741781935095787,
-0.0005567232146859169,
-0.03755583241581917,
0.14655959606170654,
0.00487697497010231,
0.12230424582958221,
0.00890358630567789,
0.07360775768756866,
-0.18396028876304626,
0.014038627035915852,
0.038752805441617966,
0.011076918803155422,
0.08589712530374527,
0.04055339843034744,
0.007998514920473099,
0.10897314548492432,
-0.07685442268848419,
0.04042913392186165,
0.02828359045088291,
-0.13347186148166656,
-0.2570168972015381,
-0.10131285339593887,
0.016477113589644432,
0.06263826787471771,
0.12069644033908844,
-0.007457566913217306,
0.15866519510746002,
-0.11504672467708588,
0.08807338774204254,
0.26091355085372925,
-0.2910968065261841,
-0.07559112459421158,
0.023853527382016182,
0.017342684790492058,
0.06987795233726501,
-0.10530183464288712,
-0.030429257079958916,
0.05604499578475952,
0.05238587409257889,
0.10964711010456085,
-0.03955972567200661,
-0.11435496062040329,
0.04566177725791931,
-0.15125419199466705,
-0.05237428843975067,
0.15600191056728363,
0.0531812347471714,
-0.029455654323101044,
-0.02126743644475937,
-0.05463113263249397,
-0.1369953602552414,
-0.021332811564207077,
-0.021752484142780304,
0.04254941642284393,
-0.05562013387680054,
-0.09749654680490494,
0.002415313618257642,
-0.11142054200172424,
-0.09541485458612442,
-0.06819026917219162,
0.12702438235282898,
0.04410289600491524,
0.02618447318673134,
-0.053134143352508545,
0.10920940339565277,
0.019829995930194855,
-0.1351853758096695,
0.011142650619149208,
0.03041265532374382,
-0.0107857221737504,
-0.03744833916425705,
-0.06081166863441467,
-0.05171370878815651,
0.027690378949046135,
0.1207469031214714,
-0.07489734888076782,
0.02375831827521324,
0.05540243908762932,
0.0380595438182354,
-0.09151627123355865,
0.18299859762191772,
-0.06479500979185104,
0.008355590514838696,
-0.019777311012148857,
0.04988875985145569,
-0.0076084258034825325,
0.005262305028736591,
-0.09622097760438919,
-0.015198319219052792,
0.09536184370517731,
0.024859413504600525,
-0.046301692724227905,
0.0528375580906868,
-0.04683791846036911,
-0.021457869559526443,
-0.005937596783041954,
-0.07750235497951508,
0.026484662666916847,
0.004441546741873026,
-0.09290497750043869,
-0.009815862402319908,
0.008911493234336376,
0.010182277299463749,
-0.011359505355358124,
0.08937391638755798,
-0.09078224003314972,
0.033543240278959274,
-0.09000697731971741,
-0.09758587926626205,
0.022376449778676033,
-0.08549317717552185,
0.03258476033806801,
-0.08672455698251724,
-0.15693044662475586,
-0.008207161910831928,
0.04287830740213394,
-0.02372817136347294,
-0.03918490931391716,
-0.04730105400085449,
-0.0891234502196312,
-0.0166221484541893,
-0.010372388176620007,
0.14678628742694855,
-0.05789335444569588,
0.1252107471227646,
0.050198789685964584,
0.06821195781230927,
-0.035119008272886276,
0.05328794941306114,
-0.1085369661450386,
0.015616072341799736,
-0.17043617367744446,
0.033606864511966705,
-0.05124302953481674,
0.06584233790636063,
-0.1024014800786972,
-0.1300860196352005,
0.02497495710849762,
-0.021282650530338287,
0.08865194767713547,
0.10482075065374374,
-0.16823457181453705,
-0.06659761071205139,
0.1499643176794052,
-0.05861441791057587,
-0.15419963002204895,
0.1256421059370041,
-0.057838428765535355,
0.0360984168946743,
0.06394033133983612,
0.1652880758047104,
0.06358266621828079,
-0.09773522615432739,
0.010933185927569866,
-0.003535663243383169,
0.04098634794354439,
-0.06429820507764816,
0.07682877033948898,
-0.009016890078783035,
0.01916009746491909,
0.023627955466508865,
-0.06749328970909119,
0.05842957645654678,
-0.12299513071775436,
-0.09936066716909409,
-0.0485512837767601,
-0.10793381929397583,
0.05111232027411461,
0.0865369364619255,
0.07085411250591278,
-0.09829676151275635,
-0.062170617282390594,
0.06770587712526321,
0.07869932800531387,
-0.06075946241617203,
0.026577243581414223,
-0.06830120086669922,
0.07804668694734573,
-0.06035935506224632,
-0.029165690764784813,
-0.18455791473388672,
-0.016181863844394684,
0.006093161646276712,
0.008276890963315964,
0.009354890324175358,
0.04775552824139595,
0.07943759113550186,
0.04997144266963005,
-0.05316529795527458,
-0.02855549566447735,
-0.06795989722013474,
-0.0071495454758405685,
-0.12374958395957947,
-0.19873899221420288,
-0.03800143674015999,
-0.009605229832231998,
0.09257488697767258,
-0.1860717087984085,
0.029394207522273064,
-0.009705938398838043,
0.07411301136016846,
-0.0030583145562559366,
-0.01154734380543232,
-0.039172008633613586,
0.0788993164896965,
-0.017625857144594193,
-0.04180913418531418,
0.06614898890256882,
-0.0042141154408454895,
-0.09613409638404846,
-0.06512977182865143,
-0.05594500154256821,
0.15982186794281006,
0.12504050135612488,
-0.13158898055553436,
-0.06260579824447632,
-0.005681196227669716,
-0.07633643597364426,
-0.03359878435730934,
-0.048384152352809906,
0.04067062586545944,
0.1667497456073761,
-0.007206902373582125,
0.12230656296014786,
-0.08819694817066193,
-0.0448281392455101,
0.019219854846596718,
-0.03938702493906021,
0.04398949816823006,
0.13981613516807556,
0.11451476067304611,
-0.061501823365688324,
0.13535436987876892,
0.1608070582151413,
-0.0888439267873764,
0.10860338807106018,
-0.06952887028455734,
-0.0933050587773323,
-0.03867003694176674,
0.004269045311957598,
-0.007152237929403782,
0.12817925214767456,
-0.15717098116874695,
0.0180194154381752,
0.03207358717918396,
0.022845912724733353,
0.016645072028040886,
-0.23220762610435486,
-0.06829484552145004,
0.031948063522577286,
-0.05059358477592468,
-0.049496639519929886,
0.001682054135017097,
0.0172521211206913,
0.10312844067811966,
-0.0114281065762043,
-0.06868095695972443,
0.03509964048862457,
0.0007642157725058496,
-0.07125426083803177,
0.22010944783687592,
-0.06843775510787964,
-0.10676532238721848,
-0.09738311171531677,
-0.037592872977256775,
-0.04256245866417885,
-0.002180980984121561,
0.06858865916728973,
-0.10383428633213043,
-0.006172532215714455,
-0.04907355457544327,
0.03540181368589401,
-0.01361860055476427,
0.032986823469400406,
0.014215273782610893,
0.0025676873046904802,
0.07594732940196991,
-0.11476185917854309,
0.00843413732945919,
-0.0604081004858017,
-0.07923708856105804,
0.0551970973610878,
0.042828015983104706,
0.13852788507938385,
0.1323888599872589,
-0.014728454872965813,
0.013671212829649448,
-0.025402607396245003,
0.2639724314212799,
-0.06680911779403687,
-0.04731748625636101,
0.14777213335037231,
0.012890291400253773,
0.055809371173381805,
0.09890782088041306,
0.07512474805116653,
-0.09051889926195145,
0.0038783643394708633,
0.030640633776783943,
-0.035764891654253006,
-0.24879443645477295,
-0.022129695862531662,
-0.052847810089588165,
-0.008757779374718666,
0.07335149496793747,
0.017302216961979866,
0.03664980083703995,
0.07553952932357788,
0.03858272731304169,
0.04803001135587692,
-0.06601642817258835,
0.03836221620440483,
0.10717403888702393,
0.04214723780751228,
0.10693711042404175,
-0.04627670720219612,
-0.0648825541138649,
0.023864256218075752,
-0.007148293312638998,
0.2561993896961212,
-0.009391394443809986,
0.14523814618587494,
0.08805018663406372,
0.21491551399230957,
-0.019513100385665894,
0.08132904767990112,
-0.006573573220521212,
-0.05145811662077904,
-0.00531430821865797,
-0.03923238813877106,
-0.021739691495895386,
0.004373402800410986,
-0.04152899980545044,
0.08279285579919815,
-0.11945062130689621,
0.010297069326043129,
0.06062611937522888,
0.27130335569381714,
0.027298729866743088,
-0.30178481340408325,
-0.09686766564846039,
-0.016218319535255432,
-0.02093150094151497,
-0.0039431205950677395,
0.022699568420648575,
0.12850159406661987,
-0.08892177790403366,
-0.003790773218497634,
-0.06210740655660629,
0.09541945904493332,
-0.014226350001990795,
0.04821931943297386,
0.06915776431560516,
0.0782301053404808,
0.011759272776544094,
0.09511388838291168,
-0.3218227028846741,
0.2885831594467163,
-0.0008910239557735622,
0.07740700244903564,
-0.07144568860530853,
-0.02236318401992321,
0.004089210648089647,
0.042523521929979324,
0.08847402781248093,
-0.0073923710733652115,
0.019530802965164185,
-0.18066203594207764,
-0.03859727084636688,
0.04340240731835365,
0.0880366638302803,
-0.02943732775747776,
0.09798784554004669,
-0.009911593981087208,
0.016672058030962944,
0.07500821352005005,
0.005144026596099138,
-0.056284088641405106,
-0.06957916170358658,
-0.017818309366703033,
0.0036765665281563997,
-0.048384010791778564,
-0.07195738703012466,
-0.10287806391716003,
-0.12120546400547028,
0.09346603602170944,
-0.007720030844211578,
-0.03356778249144554,
-0.10005772858858109,
0.09323909133672714,
0.10959939658641815,
-0.08957388997077942,
0.03120056912302971,
0.017361853271722794,
0.03324275463819504,
0.0466696135699749,
-0.04705493152141571,
0.09495967626571655,
-0.06096509099006653,
-0.17200972139835358,
-0.047733668237924576,
0.11379721760749817,
0.05903836712241173,
0.07092993706464767,
-0.009429839439690113,
0.005259742494672537,
-0.054695211350917816,
-0.10210435092449188,
0.0318182073533535,
-0.03878293186426163,
0.08536454290151596,
0.019729921594262123,
-0.022239383310079575,
0.06646489351987839,
-0.06841795146465302,
-0.02515295147895813,
0.1897670328617096,
0.24709880352020264,
-0.10128344595432281,
0.003964612726122141,
0.032958805561065674,
-0.049706194549798965,
-0.17487454414367676,
0.05273722484707832,
0.06759218871593475,
-0.005283989477902651,
0.05065289884805679,
-0.1656562089920044,
0.13867904245853424,
0.10658803582191467,
-0.002882882719859481,
0.0924578458070755,
-0.3798423707485199,
-0.1123235821723938,
0.10431693494319916,
0.15675874054431915,
0.11314269155263901,
-0.15174928307533264,
-0.0192851722240448,
0.006554265506565571,
-0.16788698732852936,
0.09718141704797745,
-0.09998185187578201,
0.1112380102276802,
-0.04586413875222206,
0.10114295035600662,
0.004016026388853788,
-0.07123732566833496,
0.13300111889839172,
0.04352675750851631,
0.10600098967552185,
-0.042388666421175,
-0.038498613983392715,
0.07187151163816452,
-0.022434836253523827,
0.016711920499801636,
-0.05962497740983963,
0.048586226999759674,
-0.10372678935527802,
-0.01001941878348589,
-0.10882177203893661,
0.03490235283970833,
-0.04601138085126877,
-0.05769777670502663,
-0.04325240105390549,
0.01929633691906929,
0.049840133637189865,
-0.008381319232285023,
0.1134147047996521,
0.03234589472413063,
0.14018741250038147,
0.07749500870704651,
0.07267323136329651,
-0.05186374858021736,
-0.10324111580848694,
-0.012114547193050385,
-0.003100060159340501,
0.05917908251285553,
-0.13747909665107727,
0.02548268437385559,
0.1585574895143509,
0.0507417656481266,
0.12031348794698715,
0.07924067974090576,
-0.03141581267118454,
0.012524786405265331,
0.03677142783999443,
-0.1658669114112854,
-0.14236874878406525,
0.00038904164102859795,
-0.06531108170747757,
-0.1246572732925415,
0.0694570317864418,
0.06527052819728851,
-0.059177465736866,
-0.01076862309128046,
-0.007021832279860973,
-0.009094782173633575,
-0.06707269698381424,
0.20853875577449799,
0.08173365145921707,
0.05725933983922005,
-0.1105823889374733,
0.07240123301744461,
0.03244207426905632,
-0.07963380217552185,
-0.008506936021149158,
0.04600856080651283,
-0.07313614338636398,
-0.04190737381577492,
0.08365554362535477,
0.16340546309947968,
-0.06088021397590637,
-0.037418339401483536,
-0.1434519737958908,
-0.11631309986114502,
0.08056335896253586,
0.15024089813232422,
0.11455812305212021,
0.016153788194060326,
-0.044904518872499466,
0.01328815147280693,
-0.12175239622592926,
0.08725839853286743,
0.04375544190406799,
0.058219872415065765,
-0.1319912225008011,
0.14663274586200714,
-0.0021489232312887907,
0.061488427221775055,
-0.01455342024564743,
0.028242308646440506,
-0.09946674108505249,
0.037247173488140106,
-0.14488688111305237,
-0.03989773988723755,
-0.030210310593247414,
-0.004792904015630484,
-0.00751872081309557,
-0.0887814462184906,
-0.058502040803432465,
0.022446587681770325,
-0.12618538737297058,
-0.012335165403783321,
0.05738646537065506,
0.05425915867090225,
-0.14110516011714935,
-0.0426672101020813,
0.0378916896879673,
-0.055552393198013306,
0.07016118615865707,
0.07229726761579514,
0.012519443407654762,
0.05486300215125084,
-0.1353844851255417,
-0.010154830291867256,
0.050008125603199005,
0.01605144515633583,
0.078494131565094,
-0.09245967119932175,
-0.018238438293337822,
0.0031511823181062937,
0.05801723897457123,
0.018192561343312263,
0.032954879105091095,
-0.13862982392311096,
-0.009244455024600029,
-0.01954503171145916,
-0.07191761583089828,
-0.07428032159805298,
0.01566464640200138,
0.0946277529001236,
0.02935580164194107,
0.19246932864189148,
-0.058017224073410034,
0.05962138622999191,
-0.227731391787529,
-0.005082281772047281,
-0.008193643763661385,
-0.09428360313177109,
-0.1259022355079651,
-0.05136200785636902,
0.06715299934148788,
-0.06404943019151688,
0.11566021293401718,
-0.018456660211086273,
0.05769556388258934,
0.019596237689256668,
-0.012577200308442116,
0.02949267067015171,
0.014217324554920197,
0.2333606779575348,
0.020458601415157318,
-0.03336789086461067,
0.07786829769611359,
0.04848858341574669,
0.07955129444599152,
0.12618021667003632,
0.20052331686019897,
0.17015446722507477,
0.007570682559162378,
0.07071834802627563,
0.04434381052851677,
-0.03914343938231468,
-0.13907752931118011,
0.03824854642152786,
-0.01674540713429451,
0.07968427240848541,
-0.016348887234926224,
0.23867645859718323,
0.07020460814237595,
-0.17331337928771973,
0.050131019204854965,
-0.05798480287194252,
-0.09515558928251266,
-0.08512607216835022,
-0.025743741542100906,
-0.06746909767389297,
-0.15817981958389282,
0.013901082798838615,
-0.12532460689544678,
0.013327881693840027,
0.11477330327033997,
0.012078829109668732,
-0.02994733862578869,
0.182638019323349,
0.07360807806253433,
0.03667262941598892,
0.04571618512272835,
-0.006921315100044012,
-0.022967275232076645,
-0.08778291195631027,
-0.04482847824692726,
0.009120998904109001,
-0.02914404682815075,
0.035393208265304565,
-0.05493904650211334,
-0.07997872680425644,
0.023639922961592674,
-0.033598393201828,
-0.09820973873138428,
0.005778023041784763,
0.026292331516742706,
0.06243589520454407,
0.045181531459093094,
0.016537457704544067,
0.028408899903297424,
-0.019185034558176994,
0.2188376784324646,
-0.07477718591690063,
-0.0864005982875824,
-0.0976809412240982,
0.236707866191864,
0.03965812176465988,
-0.015735147520899773,
0.048334285616874695,
-0.06731796264648438,
0.008992413990199566,
0.2453508824110031,
0.177278071641922,
-0.09117957204580307,
-0.011675603687763214,
0.007821870036423206,
-0.00839181337505579,
-0.039101097732782364,
0.07700339704751968,
0.14020635187625885,
0.03355831280350685,
-0.11049424111843109,
-0.04240589588880539,
-0.08407134562730789,
-0.009895574301481247,
-0.020252680405974388,
0.049840036779642105,
0.06081083416938782,
-0.000959628087002784,
-0.044192250818014145,
0.07079392671585083,
-0.08091168850660324,
-0.11254022270441055,
0.06754167377948761,
-0.19232624769210815,
-0.15134412050247192,
-0.023057585582137108,
0.1106657087802887,
0.010285167023539543,
0.06788922101259232,
-0.04158560559153557,
0.006041097920387983,
0.08458446711301804,
-0.015009414404630661,
-0.09311635047197342,
-0.08358661830425262,
0.12300538271665573,
-0.10947303473949432,
0.18342404067516327,
-0.04366597905755043,
0.0906248390674591,
0.1289510577917099,
0.059070080518722534,
-0.08344391733407974,
0.058970771729946136,
0.06506644189357758,
-0.08522962033748627,
0.005576193332672119,
0.08007150143384933,
-0.006125760264694691,
0.05652591213583946,
0.038948316127061844,
-0.11647812277078629,
0.016753047704696655,
-0.03741274029016495,
-0.02638128027319908,
-0.06512987613677979,
-0.03693368658423424,
-0.047543253749608994,
0.11702671647071838,
0.21175897121429443,
-0.045019447803497314,
0.01805037446320057,
-0.07281932234764099,
0.01738845184445381,
0.06202071160078049,
0.02334466762840748,
-0.06782740354537964,
-0.20884989202022552,
0.02075701765716076,
0.06201203167438507,
-0.031791746616363525,
-0.19695337116718292,
-0.09418880939483643,
0.027159983292222023,
-0.08784366399049759,
-0.06307018548250198,
0.062108755111694336,
0.07555504888296127,
0.06729690730571747,
-0.045073021203279495,
-0.04651917889714241,
-0.08972059190273285,
0.1628512740135193,
-0.14643250405788422,
-0.08555466681718826
] |
null | null | transformers |
[Google's mT5](https://github.com/google-research/multilingual-t5)
This is a model for generating questions from Thai texts. It was fine-tuned on the NSC2018 corpus.
```python
from transformers import MT5Tokenizer, MT5ForConditionalGeneration

# Load the fine-tuned tokenizer and model
tokenizer = MT5Tokenizer.from_pretrained("Pollawat/mt5-small-thai-qa-qg")
model = MT5ForConditionalGeneration.from_pretrained("Pollawat/mt5-small-thai-qa-qg")
text = "กรุงเทพมหานคร เป็นเมืองหลวงและนครที่มีประชากรมากที่สุดของประเทศไทย เป็นศูนย์กลางการปกครอง การศึกษา การคมนาคมขนส่ง การเงินการธนาคาร การพาณิชย์ การสื่อสาร และความเจริญของประเทศ เป็นเมืองที่มีชื่อยาวที่สุดในโลก ตั้งอยู่บนสามเหลี่ยมปากแม่น้ำเจ้าพระยา มีแม่น้ำเจ้าพระยาไหลผ่านและแบ่งเมืองออกเป็น 2 ฝั่ง คือ ฝั่งพระนครและฝั่งธนบุรี กรุงเทพมหานครมีพื้นที่ทั้งหมด 1,568.737 ตร.กม. มีประชากรตามทะเบียนราษฎรกว่า 5 ล้านคน"
# Encode the passage and generate a question/answer pair with beam search
input_ids = tokenizer.encode(text, return_tensors='pt')
beam_output = model.generate(
input_ids,
max_length=50,
num_beams=5,
early_stopping=True
)
print(tokenizer.decode(beam_output[0]))
>> <pad> <extra_id_0> แม่น้ําเจ้าพระยาไหลผ่านและแบ่งเมืองออกเป็น 2 ฝั่ง คือ ฝั่งใด <ANS> ฝั่งพระนครและฝั่งธนบุรี</s>
print(tokenizer.decode(beam_output[0], skip_special_tokens=True))
>> <extra_id_0> แม่น้ําเจ้าพระยาไหลผ่านและแบ่งเมืองออกเป็น 2 ฝั่ง คือ ฝั่งใด ฝั่งพระนครและฝั่งธนบุรี
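
# A small post-processing sketch (added for illustration; not part of the original card):
# in the raw output shown above, the generated question and its answer are separated by an
# "<ANS>" marker, so they can be split apart as follows. The tokens stripped here are the
# ones visible in that example output.
raw = tokenizer.decode(beam_output[0])
for tok in ("<pad>", "<extra_id_0>", "</s>"):
    raw = raw.replace(tok, "")
question, _, answer = raw.partition("<ANS>")
print(question.strip())  # generated question
print(answer.strip())    # extracted answer span
>> แม่น้ําเจ้าพระยาไหลผ่านและแบ่งเมืองออกเป็น 2 ฝั่ง คือ ฝั่งใด
>> ฝั่งพระนครและฝั่งธนบุรี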
``` | {"language": ["thai", "th"], "license": "mit", "tags": ["question-generation", "question-answering"], "datasets": ["NSC2018", "iapp-wiki-qa-dataset", "XQuAD"]} | question-answering | Pollawat/mt5-small-thai-qa-qg | [
"transformers",
"pytorch",
"mt5",
"text2text-generation",
"question-generation",
"question-answering",
"dataset:NSC2018",
"dataset:iapp-wiki-qa-dataset",
"dataset:XQuAD",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"thai",
"th"
] | TAGS
#transformers #pytorch #mt5 #text2text-generation #question-generation #question-answering #dataset-NSC2018 #dataset-iapp-wiki-qa-dataset #dataset-XQuAD #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
Google's mT5
This is a model for generating questions from Thai texts. It was fine-tuned on the NSC2018 corpus.
| [] | [
"TAGS\n#transformers #pytorch #mt5 #text2text-generation #question-generation #question-answering #dataset-NSC2018 #dataset-iapp-wiki-qa-dataset #dataset-XQuAD #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
93
] | [
"passage: TAGS\n#transformers #pytorch #mt5 #text2text-generation #question-generation #question-answering #dataset-NSC2018 #dataset-iapp-wiki-qa-dataset #dataset-XQuAD #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.07023035734891891,
0.16146889328956604,
-0.0042781345546245575,
0.03947309032082558,
0.09653294831514359,
0.02397637441754341,
0.14220090210437775,
0.1567472517490387,
0.06308744847774506,
-0.018306449055671692,
0.16505973041057587,
0.2046506255865097,
0.03157305717468262,
0.06862244009971619,
-0.11143551766872406,
-0.13809245824813843,
0.037872739136219025,
0.05442778021097183,
-0.02063249982893467,
0.1416333019733429,
0.09982608258724213,
-0.0820680558681488,
0.09084036946296692,
-0.04008420929312706,
-0.11038172245025635,
0.007863318547606468,
0.019037805497646332,
-0.09092129021883011,
0.12354925274848938,
0.04025603458285332,
0.03980076313018799,
0.10959433764219284,
-0.02377619780600071,
-0.17252957820892334,
0.04537494480609894,
-0.002646686276420951,
-0.06697593629360199,
0.07858147472143173,
0.052596740424633026,
-0.032959017902612686,
0.04552371799945831,
0.045861031860113144,
-0.032460231333971024,
0.09021320194005966,
-0.11989842355251312,
-0.042345933616161346,
-0.10095076262950897,
0.037407901138067245,
0.07282805442810059,
0.11407425254583359,
-0.02081981860101223,
0.15943513810634613,
-0.1055721789598465,
0.1095244437456131,
0.1103295311331749,
-0.35563647747039795,
-0.00008625940245110542,
0.05591599643230438,
0.10254399478435516,
0.07724753022193909,
-0.03337816521525383,
0.055146876722574234,
0.06582006812095642,
0.014648099429905415,
-0.020118946209549904,
-0.07235933840274811,
-0.11193381994962692,
0.05706002190709114,
-0.08254964649677277,
-0.052725650370121,
0.32892972230911255,
0.019151952117681503,
0.02518356218934059,
-0.030819613486528397,
-0.05701436102390289,
0.005087210796773434,
0.025058884173631668,
-0.00950001273304224,
-0.02553177811205387,
0.015626775100827217,
-0.03733260929584503,
-0.03340910002589226,
-0.1108442172408104,
-0.0261189304292202,
-0.18340934813022614,
0.005458435975015163,
-0.006543916650116444,
0.07430066168308258,
-0.18897095322608948,
0.06288348883390427,
0.06368133425712585,
-0.11241438239812851,
-0.027944039553403854,
-0.07200898975133896,
-0.011428797617554665,
-0.013276868499815464,
-0.00951696839183569,
-0.022090667858719826,
0.12028541415929794,
0.14845098555088043,
0.001160947373136878,
-0.01609042100608349,
-0.07702995836734772,
0.07498183846473694,
0.05778859928250313,
0.04657939076423645,
-0.0343378446996212,
-0.11020448803901672,
0.01656275987625122,
-0.06418395787477493,
-0.04308061674237251,
-0.04495522007346153,
-0.11171264946460724,
-0.03705605864524841,
-0.0030528074130415916,
0.1141117736697197,
0.12011150270700455,
0.0703827291727066,
-0.014969064854085445,
-0.039928585290908813,
0.07593829184770584,
-0.03311724588274956,
-0.010365793481469154,
0.0203217975795269,
-0.03625626489520073,
0.10290253907442093,
-0.0004188938473816961,
0.06678980588912964,
-0.07978998869657516,
0.022381430491805077,
-0.07347512245178223,
-0.03629285842180252,
-0.005149249453097582,
-0.03875754401087761,
0.08906690776348114,
-0.08784303069114685,
0.01264987513422966,
-0.11269430071115494,
-0.22146491706371307,
-0.017881102859973907,
0.004707048647105694,
-0.0009590524132363498,
-0.07954257726669312,
-0.029000097885727882,
-0.0444958470761776,
0.02340654656291008,
-0.07524412125349045,
0.062125299125909805,
-0.06200125813484192,
0.10339856147766113,
-0.06301245093345642,
0.0520411916077137,
-0.14134877920150757,
0.058755893260240555,
-0.11115041375160217,
-0.005839690566062927,
0.06736829876899719,
0.0486137680709362,
-0.042147282510995865,
0.06938433647155762,
-0.07646045088768005,
-0.030504804104566574,
0.008886856026947498,
0.004245412070304155,
-0.037173718214035034,
0.20661459863185883,
-0.15206526219844818,
-0.055439505726099014,
0.16545632481575012,
-0.060009557753801346,
-0.28442269563674927,
0.09722666442394257,
0.00497645977884531,
0.04089505597949028,
0.08329332619905472,
0.17401506006717682,
0.00871298462152481,
-0.06782492250204086,
-0.0690506175160408,
0.06760493665933609,
-0.07027777284383774,
-0.14431801438331604,
0.08166863024234772,
-0.0027683197986334562,
-0.030709853395819664,
0.036957383155822754,
0.031032737344503403,
0.011710886843502522,
-0.0732661709189415,
-0.09828786551952362,
-0.05650218948721886,
-0.054719891399145126,
0.029431598260998726,
0.01416908111423254,
0.034011732786893845,
-0.04123258963227272,
-0.001972283236682415,
-0.03739267587661743,
0.019469205290079117,
0.010551253333687782,
0.00931162666529417,
-0.10650133341550827,
0.10361560434103012,
-0.08438138663768768,
0.019544357433915138,
-0.12735413014888763,
-0.07989094406366348,
-0.0191021878272295,
0.10226976871490479,
0.008424031548202038,
0.0724286362528801,
0.02880171127617359,
-0.052849505096673965,
-0.01795176975429058,
-0.02728448621928692,
0.12068324536085129,
0.008439457044005394,
-0.09485727548599243,
-0.12865516543388367,
0.021578282117843628,
-0.05447828769683838,
0.02883635275065899,
-0.0699448361992836,
0.0018931032391265035,
0.013155038468539715,
0.08880559355020523,
-0.02640671841800213,
0.06429330259561539,
0.0005337447510100901,
0.0063538015820086,
-0.05174541845917702,
0.023667918518185616,
0.09880244731903076,
-0.00922547560185194,
-0.0918956845998764,
0.1528599113225937,
-0.09294488281011581,
0.19154591858386993,
0.16012172400951385,
-0.12137379497289658,
0.06644657254219055,
-0.08198858052492142,
-0.05851568654179573,
-0.020640812814235687,
-0.0008783512748777866,
0.018575821071863174,
0.03963645175099373,
0.023177368566393852,
0.11099749803543091,
-0.07556002587080002,
-0.05116685479879379,
0.0004998974618501961,
-0.0244921687990427,
-0.035261526703834534,
0.08693820238113403,
0.17788922786712646,
-0.20834022760391235,
0.17236748337745667,
0.24018120765686035,
0.08119355142116547,
0.15876786410808563,
-0.049954332411289215,
-0.04423222690820694,
-0.0024681799113750458,
-0.038402486592531204,
-0.050122156739234924,
0.045092567801475525,
-0.15398292243480682,
0.04064060002565384,
0.10718764364719391,
0.019164547324180603,
0.07815798372030258,
-0.0882311686873436,
-0.08685597032308578,
-0.036334309726953506,
-0.026882708072662354,
-0.10389019548892975,
0.09111055731773376,
0.04631416127085686,
0.14831778407096863,
-0.026160884648561478,
-0.010545767843723297,
0.10709737241268158,
-0.02092069387435913,
-0.12882263958454132,
0.18784460425376892,
-0.1253344714641571,
-0.2565145492553711,
-0.07006044685840607,
-0.10105259716510773,
-0.07968521863222122,
-0.026697102934122086,
0.09803800284862518,
-0.06889995187520981,
-0.002722682198509574,
-0.007397517561912537,
-0.0006184919038787484,
-0.061022691428661346,
-0.015664968639612198,
-0.03494374454021454,
0.029186705127358437,
-0.046715058386325836,
-0.10691297054290771,
-0.03017764538526535,
0.0014040826354175806,
-0.08468979597091675,
0.13981586694717407,
-0.07016521692276001,
0.11650941520929337,
0.1286087930202484,
0.008894880302250385,
0.03597409650683403,
-0.04555029794573784,
0.23361952602863312,
-0.10633029788732529,
0.0483962781727314,
0.19477903842926025,
0.0074866157956421375,
0.05496244505047798,
0.14615751802921295,
0.007688438985496759,
-0.07018333673477173,
0.02017219550907612,
-0.00896598119288683,
-0.04340478032827377,
-0.33713188767433167,
-0.10598073899745941,
-0.08011186122894287,
0.11663278937339783,
0.013995129615068436,
0.05616705119609833,
0.12524563074111938,
0.0928771048784256,
-0.037050824612379074,
-0.06767060607671738,
-0.04414306581020355,
0.0502554252743721,
0.1303233951330185,
-0.01879589818418026,
0.1421436369419098,
-0.07835742831230164,
-0.033053044229745865,
0.08538363873958588,
0.12938223779201508,
0.11623099446296692,
0.0699167251586914,
0.058518942445516586,
0.09114426374435425,
0.1686401218175888,
0.0908084586262703,
0.06828723847866058,
0.04300539940595627,
-0.04266689717769623,
-0.009836847893893719,
-0.025991665199398994,
-0.03526102006435394,
0.033243730664253235,
0.04679689556360245,
-0.06660526990890503,
-0.05043061822652817,
-0.014057677239179611,
0.07624715566635132,
0.14428585767745972,
0.07741308957338333,
-0.17335447669029236,
-0.006281370762735605,
0.06960249692201614,
-0.026742057874798775,
-0.0703621581196785,
0.09217292815446854,
0.03555256873369217,
-0.15713155269622803,
0.05911450460553169,
-0.05136760324239731,
0.1521807163953781,
0.019153546541929245,
0.04973265528678894,
-0.011917976662516594,
-0.1132204607129097,
0.0029798292089253664,
0.12056443095207214,
-0.364990770816803,
0.22419355809688568,
0.022921385243535042,
-0.025048673152923584,
-0.11488956958055496,
-0.026620065793395042,
-0.028129464015364647,
0.11769512295722961,
0.17801061272621155,
0.006972579751163721,
0.005107629578560591,
-0.020210999995470047,
-0.04691483825445175,
0.09552302956581116,
0.0519830696284771,
-0.007638496346771717,
0.026007769629359245,
-0.005585245322436094,
0.025837592780590057,
-0.015036472119390965,
0.10162258893251419,
-0.07022333890199661,
-0.146791011095047,
0.05712335556745529,
0.05141199752688408,
0.07960782945156097,
-0.014404181391000748,
-0.024772223085165024,
-0.13598792254924774,
0.17171098291873932,
-0.11042046546936035,
-0.0690179243683815,
-0.11535806953907013,
-0.05319125950336456,
0.06128507852554321,
-0.06964223831892014,
-0.03541060537099838,
-0.03950096294283867,
0.018040338531136513,
-0.07084301114082336,
-0.1723562628030777,
0.07980873435735703,
-0.1113438829779625,
-0.08057162165641785,
-0.053033336997032166,
0.14386913180351257,
-0.03602282702922821,
0.04768887162208557,
0.04340768977999687,
0.0015256153419613838,
-0.09524013847112656,
-0.09899450093507767,
0.04656539484858513,
-0.033523961901664734,
0.05574563145637512,
0.025651661679148674,
-0.034033533185720444,
0.004530861042439938,
-0.04648556187748909,
-0.04784296825528145,
0.2554696500301361,
0.16927659511566162,
-0.06868010759353638,
0.17506782710552216,
0.15316765010356903,
-0.08685130625963211,
-0.2822355031967163,
-0.06510160118341446,
-0.1038338765501976,
-0.04402003437280655,
-0.013973170891404152,
-0.13674353063106537,
0.10411533713340759,
0.047947127372026443,
-0.04455464705824852,
0.08103177696466446,
-0.26648637652397156,
-0.05773385986685753,
0.11760254204273224,
0.007731521036475897,
0.23355036973953247,
-0.15302225947380066,
-0.07661473006010056,
-0.04415341839194298,
-0.20025280117988586,
0.22915159165859222,
-0.10521327704191208,
0.0631243884563446,
-0.054554279893636703,
0.11660905927419662,
0.0002001333486987278,
-0.06214873492717743,
0.10738762468099594,
0.04352644458413124,
0.006747878156602383,
-0.06821954995393753,
-0.0125714261084795,
0.10451433807611465,
0.012511199340224266,
0.04273431375622749,
-0.09298577159643173,
0.05282100662589073,
-0.18752992153167725,
-0.004933199845254421,
-0.11448496580123901,
0.06447280943393707,
-0.041921526193618774,
-0.07296833395957947,
-0.014774400740861893,
0.005736829247325659,
0.04331735894083977,
-0.005204246379435062,
0.16665568947792053,
-0.032189905643463135,
0.1604209542274475,
0.12943574786186218,
0.17761385440826416,
-0.12205049395561218,
-0.01315988227725029,
-0.08880326896905899,
-0.06804445385932922,
0.06992270052433014,
-0.1193167120218277,
0.04187459498643875,
0.1682077795267105,
-0.00780903734266758,
0.09920499473810196,
0.07190640270709991,
0.008906013332307339,
-0.0025128829292953014,
0.07331641018390656,
-0.18275421857833862,
-0.08531299978494644,
-0.03591373562812805,
0.013931040652096272,
-0.023629717528820038,
0.06909970939159393,
0.1356746107339859,
0.021276572719216347,
-0.03711746260523796,
0.0019170145969837904,
0.0483403205871582,
-0.03358877822756767,
0.09206122905015945,
0.07185130566358566,
0.050699129700660706,
-0.15599356591701508,
0.10278772562742233,
0.01195442769676447,
-0.1141599491238594,
0.0120416060090065,
0.09640223532915115,
-0.0959009975194931,
-0.1317407190799713,
-0.029794232919812202,
0.0692182406783104,
-0.1536448895931244,
-0.062032271176576614,
-0.03976542502641678,
-0.1119646206498146,
0.09480077028274536,
0.1317545771598816,
0.02199062705039978,
0.05397050082683563,
0.007041881792247295,
-0.10204262286424637,
-0.0479678176343441,
0.0619499646127224,
-0.0807981863617897,
0.007407279219478369,
-0.08036892861127853,
0.011342301964759827,
-0.046586569398641586,
0.17334169149398804,
-0.04910804703831673,
-0.036688607186079025,
-0.10773984342813492,
0.037994056940078735,
-0.20556581020355225,
-0.04340636730194092,
-0.10341998934745789,
-0.048548098653554916,
-0.025139424949884415,
-0.06447863578796387,
-0.0740206390619278,
-0.032867807894945145,
-0.10945476591587067,
0.013781342655420303,
0.031013773754239082,
0.08312701433897018,
-0.12333612143993378,
-0.012030419893562794,
0.05180148407816887,
-0.0031426462810486555,
0.12755532562732697,
0.08311059325933456,
-0.09716956317424774,
0.07024254649877548,
-0.15505874156951904,
-0.07282905280590057,
0.010609405115246773,
0.05402247980237007,
0.06555619090795517,
0.02833816222846508,
0.0018479775171726942,
0.09856736660003662,
-0.016495129093527794,
0.07600168883800507,
-0.04691718891263008,
-0.09962381422519684,
0.008766556158661842,
-0.03375929594039917,
-0.05969064310193062,
-0.05746527388691902,
-0.029841246083378792,
0.06696692854166031,
-0.00045494819642044604,
0.14698146283626556,
-0.03954540193080902,
0.10265972465276718,
-0.1413700133562088,
0.015262972563505173,
0.01361755095422268,
-0.0994015634059906,
-0.07817749679088593,
-0.049802277237176895,
0.02912137098610401,
-0.05025134235620499,
0.2327905297279358,
-0.037417758256196976,
-0.008629105053842068,
0.05106867477297783,
0.06373221427202225,
-0.021478915587067604,
-0.022291824221611023,
0.2201968878507614,
0.06961911171674728,
-0.026211854070425034,
-0.010547147132456303,
0.045339539647102356,
-0.055968526750802994,
0.061621248722076416,
0.15876081585884094,
0.09854187816381454,
0.101725734770298,
0.03798932582139969,
0.060607898980379105,
-0.018403436988592148,
-0.016162894666194916,
-0.08641000092029572,
-0.024906864389777184,
0.0706244707107544,
-0.0004280365537852049,
0.01815185137093067,
0.14429347217082977,
-0.06090305373072624,
0.01113453134894371,
-0.060151420533657074,
-0.03848932683467865,
-0.1560528576374054,
-0.14587683975696564,
-0.08675258606672287,
-0.0707755759358406,
0.03581821173429489,
-0.15049730241298676,
0.04149698466062546,
0.046295441687107086,
0.07806549966335297,
-0.08841078728437424,
-0.009572810493409634,
0.040115438401699066,
-0.05449022352695465,
0.04482800513505936,
-0.013446537777781487,
0.056261420249938965,
-0.05587777867913246,
0.055141936987638474,
-0.006953174713999033,
-0.007623029872775078,
0.0010305567411705852,
0.05057565122842789,
-0.03968945890665054,
0.038760051131248474,
-0.13142237067222595,
-0.09259180724620819,
-0.045582953840494156,
0.05419114604592323,
0.01942279189825058,
0.16885706782341003,
0.021739089861512184,
0.03877992555499077,
0.056361209601163864,
0.21419738233089447,
-0.04356284812092781,
-0.09870528429746628,
-0.08780694752931595,
0.11339858174324036,
-0.002692936919629574,
0.03941669687628746,
0.020664693787693977,
0.0002857254585251212,
-0.037688035517930984,
0.2764173150062561,
0.2894574701786041,
-0.11573059111833572,
0.011662698350846767,
0.006323687266558409,
0.02712511457502842,
0.04209006577730179,
0.06666610389947891,
0.06715844571590424,
0.2183363288640976,
-0.08410847187042236,
0.007864568382501602,
-0.06058114767074585,
-0.023784559220075607,
-0.013221729546785355,
0.10366883128881454,
0.02570922300219536,
-0.06763271242380142,
-0.005363956559449434,
0.11297785490751266,
-0.20770816504955292,
0.07475702464580536,
-0.0891847312450409,
-0.13711825013160706,
-0.09236816316843033,
-0.011185760609805584,
0.09729740023612976,
0.04147334396839142,
0.022693639621138573,
-0.014284657314419746,
-0.01764804869890213,
0.07151853293180466,
-0.004579850472509861,
-0.1862146407365799,
0.01652713678777218,
0.0953638032078743,
-0.09160517901182175,
0.07825525850057602,
0.015997014939785004,
0.09753362089395523,
0.0803174152970314,
0.027514036744832993,
-0.10947717726230621,
0.05329173430800438,
0.06339804828166962,
-0.04319280385971069,
0.017364811152219772,
0.014176696538925171,
0.062173545360565186,
0.0017811685102060437,
0.09909883141517639,
-0.09400340914726257,
0.04412633180618286,
0.005009696818888187,
-0.027515193447470665,
-0.0657898560166359,
0.0752192884683609,
-0.020569410175085068,
0.1087682694196701,
0.07346315681934357,
-0.05321197956800461,
-0.004038912244141102,
-0.0630466416478157,
0.006718810182064772,
0.004600311629474163,
-0.07988685369491577,
-0.04068097472190857,
-0.09570153057575226,
-0.034701157361269,
-0.0019021049374714494,
0.0011868956498801708,
-0.20853202044963837,
0.01149475947022438,
-0.09145748615264893,
-0.02992337942123413,
-0.1283048689365387,
0.06040877103805542,
0.11797400563955307,
0.0015367266023531556,
-0.008213466964662075,
-0.010946597903966904,
0.016218135133385658,
0.07569082826375961,
-0.10279960930347443,
-0.06707992404699326
] |
null | null | transformers |
[Google's mT5](https://github.com/google-research/multilingual-t5)
This is a model for generating questions from Thai texts. It was fine-tuned on the NSC2018 corpus.
```python
from transformers import T5Tokenizer, MT5ForConditionalGeneration

# Load the fine-tuned tokenizer and model
tokenizer = T5Tokenizer.from_pretrained("Pollawat/mt5-small-thai-qg")
model = MT5ForConditionalGeneration.from_pretrained("Pollawat/mt5-small-thai-qg")
text = "กรุงเทพมหานคร เป็นเมืองหลวงและนครที่มีประชากรมากที่สุดของประเทศไทย เป็นศูนย์กลางการปกครอง การศึกษา การคมนาคมขนส่ง การเงินการธนาคาร การพาณิชย์ การสื่อสาร และความเจริญของประเทศ เป็นเมืองที่มีชื่อยาวที่สุดในโลก ตั้งอยู่บนสามเหลี่ยมปากแม่น้ำเจ้าพระยา มีแม่น้ำเจ้าพระยาไหลผ่านและแบ่งเมืองออกเป็น 2 ฝั่ง คือ ฝั่งพระนครและฝั่งธนบุรี กรุงเทพมหานครมีพื้นที่ทั้งหมด 1,568.737 ตร.กม. มีประชากรตามทะเบียนราษฎรกว่า 5 ล้านคน ทำให้กรุงเทพมหานครเป็นเอกนคร (Primate City) จัด มีผู้กล่าวว่า กรุงเทพมหานครเป็น 'เอกนครที่สุดในโลก' เพราะมีประชากรมากกว่านครที่มีประชากรมากเป็นอันดับ 2 ถึง 40 เท่า[3]"
# Encode the passage and generate a question with beam search
input_ids = tokenizer.encode(text, return_tensors='pt')
beam_output = model.generate(
input_ids,
max_length=50,
num_beams=5,
early_stopping=True
)
print(tokenizer.decode(beam_output[0], skip_special_tokens=True))
>> <extra_id_0>ของกรุงเทพมหานครเป็นเมืองหลวงของประเทศใด
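
# An optional extension (a sketch added here; not part of the original card): with beam
# search, generate() can also return several candidate questions in a single call by
# setting num_return_sequences (which must not exceed num_beams).
candidates = model.generate(
    input_ids,
    max_length=50,
    num_beams=5,
    num_return_sequences=3,
    early_stopping=True
)
for cand in candidates:
    print(tokenizer.decode(cand, skip_special_tokens=True))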
``` | {"language": ["thai", "th"], "license": "mit", "tags": ["question-generation"], "datasets": ["NSC2018"]} | text2text-generation | Pollawat/mt5-small-thai-qg | [
"transformers",
"pytorch",
"mt5",
"text2text-generation",
"question-generation",
"dataset:NSC2018",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"thai",
"th"
] | TAGS
#transformers #pytorch #mt5 #text2text-generation #question-generation #dataset-NSC2018 #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
Google's mT5
This is a model for generating questions from Thai texts. It was fine-tuned on the NSC2018 corpus.
| [] | [
"TAGS\n#transformers #pytorch #mt5 #text2text-generation #question-generation #dataset-NSC2018 #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n"
] | [
71
] | [
"passage: TAGS\n#transformers #pytorch #mt5 #text2text-generation #question-generation #dataset-NSC2018 #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n"
] | [
-0.016732338815927505,
0.12303881347179413,
-0.002764277858659625,
0.029858509078621864,
0.12273790687322617,
0.009473557583987713,
0.14367367327213287,
0.14818693697452545,
0.028379948809742928,
0.01286797784268856,
0.1830979585647583,
0.1898416429758072,
-0.012527498416602612,
0.08275530487298965,
-0.11522078514099121,
-0.1873430460691452,
0.057986143976449966,
0.05323968082666397,
0.024755137041211128,
0.1268526017665863,
0.10086997598409653,
-0.0839267447590828,
0.08705917000770569,
-0.046066876500844955,
-0.12818890810012817,
0.032450657337903976,
0.01671616919338703,
-0.10396433621644974,
0.10266988724470139,
0.031224533915519714,
0.06544336676597595,
0.08043920248746872,
-0.03423665836453438,
-0.14016281068325043,
0.03510384261608124,
0.011398259550333023,
-0.07719668000936508,
0.0882841944694519,
0.07129616290330887,
-0.03165895864367485,
0.11811380088329315,
0.02257665991783142,
-0.019008690491318703,
0.06926050782203674,
-0.13505445420742035,
-0.024414673447608948,
-0.08151129633188248,
0.03418761119246483,
0.06684663146734238,
0.12005268037319183,
-0.01330081932246685,
0.15890838205814362,
-0.1282261461019516,
0.1004846841096878,
0.13681942224502563,
-0.33535072207450867,
-0.0022853370755910873,
0.12868832051753998,
0.1124984472990036,
0.08168374747037888,
-0.02441112883388996,
0.05684886872768402,
0.06693930178880692,
0.015841418877243996,
-0.00011053164780605584,
-0.05887710675597191,
-0.15331701934337616,
0.07072763890028,
-0.08312968164682388,
-0.08689024299383163,
0.3158520758152008,
0.005495375022292137,
0.07813363522291183,
-0.05431126803159714,
-0.07608462125062943,
-0.03430517762899399,
0.019274577498435974,
0.007205617614090443,
-0.03933151066303253,
0.049733638763427734,
-0.019655834883451462,
-0.05838516727089882,
-0.14060448110103607,
-0.022129962220788002,
-0.18656744062900543,
-0.006834901869297028,
-0.026291580870747566,
0.06499344855546951,
-0.18613316118717194,
0.06035810336470604,
0.03146185725927353,
-0.11904314905405045,
0.008810284547507763,
-0.06676436215639114,
0.012834443710744381,
-0.015309898182749748,
-0.06862958520650864,
-0.030112024396657944,
0.09398676455020905,
0.1535869538784027,
-0.00413293344900012,
-0.010427054017782211,
-0.06373515725135803,
0.09214819222688675,
0.03552845120429993,
0.03820791095495224,
-0.03339574113488197,
-0.056138407438993454,
0.012970587238669395,
-0.09883762896060944,
0.0022103250958025455,
-0.07135280966758728,
-0.16000910103321075,
-0.0537041537463665,
0.011668072082102299,
0.09212954342365265,
0.11339983344078064,
0.07607336342334747,
-0.0007639050600118935,
-0.013698115013539791,
0.11256977170705795,
-0.044088300317525864,
0.025032948702573776,
0.02327417954802513,
-0.0071756173856556416,
0.05111324414610863,
0.009684832766652107,
0.042612332850694656,
-0.07585525512695312,
0.03581749647855759,
-0.08645929396152496,
-0.02459547109901905,
-0.04479111358523369,
-0.07012364268302917,
0.09842458367347717,
-0.09199550747871399,
0.002331059193238616,
-0.1443025916814804,
-0.20307640731334686,
-0.02137734182178974,
0.0012006511678919196,
-0.02744426764547825,
-0.08108607679605484,
-0.0036764773540198803,
-0.06832173466682434,
0.06366300582885742,
-0.09211676567792892,
0.06102171540260315,
-0.06982947140932083,
0.07299792021512985,
-0.07883832603693008,
0.0638648197054863,
-0.15453186631202698,
0.06398221850395203,
-0.11419644951820374,
-0.004754368681460619,
0.06289001554250717,
0.03790125250816345,
-0.028544213622808456,
0.09629148989915848,
-0.050279904156923294,
-0.046429675072431564,
-0.07158465683460236,
0.02014741487801075,
-0.026946289464831352,
0.18756254017353058,
-0.1312030702829361,
-0.04337841272354126,
0.1384066492319107,
-0.05555533245205879,
-0.21911804378032684,
0.08239544183015823,
0.006397153716534376,
0.0219721682369709,
0.06001807004213333,
0.20718473196029663,
0.02465668134391308,
-0.03606539964675903,
-0.005909148138016462,
0.09575523436069489,
-0.07147634029388428,
-0.16696958243846893,
0.049237433820962906,
-0.02301812171936035,
-0.0484330989420414,
0.03365740925073624,
0.06849243491888046,
0.025176964700222015,
-0.05980595946311951,
-0.0817641094326973,
-0.06808547675609589,
-0.028306197375059128,
0.03816911205649376,
0.029013652354478836,
0.0810035988688469,
-0.07938473671674728,
-0.008595908060669899,
0.016809826716780663,
-0.005368595942854881,
0.004781919997185469,
0.024072229862213135,
-0.042281899601221085,
0.11182655394077301,
-0.057949498295784,
0.01509613636881113,
-0.16061381995677948,
-0.0962616577744484,
-0.029455311596393585,
0.1033683493733406,
0.014614489860832691,
0.12543435394763947,
0.03551517054438591,
-0.05935227870941162,
-0.011821264401078224,
-0.0022430485114455223,
0.1333586871623993,
0.03319220989942551,
-0.08646051585674286,
-0.08514939993619919,
0.004695910960435867,
-0.05675382539629936,
0.004112423397600651,
-0.08719176799058914,
0.018452538177371025,
0.011202184483408928,
0.11140064150094986,
-0.03717362880706787,
0.053881313651800156,
-0.001666553784161806,
0.026954809203743935,
-0.07184506207704544,
0.019824624061584473,
0.09687685966491699,
-0.0028787923511117697,
-0.06096437945961952,
0.1966298222541809,
-0.17801059782505035,
0.19390299916267395,
0.1941460222005844,
-0.18292614817619324,
0.04731437936425209,
-0.05726911500096321,
-0.05849006772041321,
0.001859273063018918,
0.006113097071647644,
-0.016086479648947716,
-0.02614118531346321,
-0.011513716541230679,
0.1293288916349411,
-0.06619970500469208,
-0.048805076628923416,
-0.012415040284395218,
-0.04534276947379112,
-0.044660963118076324,
0.07752685248851776,
0.15435273945331573,
-0.18766625225543976,
0.21461555361747742,
0.24774619936943054,
0.016026893630623817,
0.17639370262622833,
-0.02238440327346325,
-0.039032578468322754,
0.01700267381966114,
-0.041559699922800064,
-0.05609541013836861,
-0.008923784829676151,
-0.16160713136196136,
0.012804358266294003,
0.12120860815048218,
0.04593326151371002,
0.09372418373823166,
-0.10080299526453018,
-0.06702909618616104,
-0.014207243919372559,
-0.022904744371771812,
-0.05949459597468376,
0.10972665250301361,
0.06620989739894867,
0.1385553479194641,
-0.022337399423122406,
-0.010989315807819366,
0.11008670926094055,
-0.013394859619438648,
-0.12271980196237564,
0.1732560694217682,
-0.14019758999347687,
-0.2527574300765991,
-0.11734461784362793,
-0.09020384401082993,
-0.07147873193025589,
0.031017286702990532,
0.12787258625030518,
-0.062593474984169,
-0.015845950692892075,
-0.05450218915939331,
-0.035052794963121414,
-0.06177595630288124,
-0.007512771058827639,
-0.06045025587081909,
0.04234788939356804,
-0.0506678968667984,
-0.11539700627326965,
-0.04279332980513573,
0.004874320235103369,
-0.059637296944856644,
0.13016283512115479,
-0.06633585691452026,
0.10813593864440918,
0.16681137681007385,
0.001315565314143896,
0.03876986354589462,
-0.0393386036157608,
0.19566088914871216,
-0.06934591382741928,
0.011173659935593605,
0.2372620403766632,
0.03461948409676552,
0.057216815650463104,
0.16794037818908691,
0.00957251712679863,
-0.05194558948278427,
0.03914020583033562,
-0.026687797158956528,
-0.09176503866910934,
-0.28594106435775757,
-0.14479590952396393,
-0.1272345334291458,
0.08308196067810059,
0.04932102933526039,
0.06957169622182846,
0.16205136477947235,
0.07279081642627716,
-0.04413649067282677,
-0.05903467908501625,
-0.008401107974350452,
0.0718282088637352,
0.211348757147789,
-0.016443192958831787,
0.15359848737716675,
-0.08572350442409515,
-0.06288132071495056,
0.08688148856163025,
0.06687703728675842,
0.10486038029193878,
0.0812271386384964,
0.0578901544213295,
0.06476607918739319,
0.12799616158008575,
0.11104339361190796,
0.09036431461572647,
0.0731128603219986,
-0.02639029547572136,
-0.011765467934310436,
-0.03722039610147476,
-0.02018113061785698,
0.06272197514772415,
0.0615820549428463,
-0.13703246414661407,
-0.041064128279685974,
-0.10073085129261017,
0.06681216508150101,
0.09380212426185608,
0.10105595737695694,
-0.23685689270496368,
0.002872351324185729,
0.08276307582855225,
-0.005476135294884443,
-0.07987416535615921,
0.08330826461315155,
0.057494811713695526,
-0.11996859312057495,
0.043272074311971664,
-0.009553679265081882,
0.14783655107021332,
0.05205870792269707,
0.06768343597650528,
-0.037927206605672836,
-0.12595395743846893,
0.022395258769392967,
0.1365552842617035,
-0.3223528563976288,
0.2296876311302185,
0.002363553736358881,
-0.05559727922081947,
-0.11730656772851944,
-0.02937294729053974,
-0.022169815376400948,
0.10153066366910934,
0.1134830117225647,
0.008932722732424736,
-0.11345794796943665,
-0.03984956070780754,
-0.025277316570281982,
0.03588327765464783,
0.07331538200378418,
0.0029760226607322693,
0.00022526939574163407,
-0.016517579555511475,
0.011582255363464355,
0.011609908193349838,
0.10329733043909073,
-0.04530390352010727,
-0.17796000838279724,
0.058080531656742096,
0.10289719700813293,
0.06491682678461075,
-0.005604760255664587,
-0.02550271339714527,
-0.13108420372009277,
0.18805035948753357,
-0.044124964624643326,
-0.05067931115627289,
-0.13234566152095795,
-0.0622997023165226,
0.07936560362577438,
-0.05556430295109749,
0.009726953692734241,
-0.052758775651454926,
0.032496728003025055,
-0.06839371472597122,
-0.1966942399740219,
0.10401808470487595,
-0.1001279279589653,
-0.0699772760272026,
-0.04047872871160507,
0.14879614114761353,
-0.07437159866094589,
0.04854224622249603,
0.0530163012444973,
0.04419289901852608,
-0.13110919296741486,
-0.11041329801082611,
0.011553130112588406,
-0.018144331872463226,
0.09811744093894958,
0.024501707404851913,
-0.051707491278648376,
-0.003511210670694709,
0.020386014133691788,
-0.022092176601290703,
0.2794368863105774,
0.17320683598518372,
-0.09708048403263092,
0.17459829151630402,
0.1136450469493866,
-0.0828881487250328,
-0.3168490529060364,
-0.07889479398727417,
-0.11579981446266174,
-0.04436793178319931,
0.01484556496143341,
-0.10146515816450119,
0.07160263508558273,
0.03615395352244377,
-0.06275440752506256,
0.0671844333410263,
-0.24265001714229584,
-0.08594236522912979,
0.12658271193504333,
-0.041730403900146484,
0.27845796942710876,
-0.14845293760299683,
-0.09206204116344452,
-0.06996294856071472,
-0.18550418317317963,
0.23015433549880981,
-0.1086060106754303,
0.07176375389099121,
-0.03620629012584686,
0.10249985009431839,
0.015193275175988674,
-0.08363765478134155,
0.12775634229183197,
0.029899314045906067,
0.0255754217505455,
-0.10148748010396957,
0.0039016548544168472,
0.11756724119186401,
-0.019683068618178368,
0.06363300234079361,
-0.08181989938020706,
0.044108856469392776,
-0.18356996774673462,
0.00000736355787012144,
-0.11102776974439621,
0.08657555282115936,
-0.018760578706860542,
-0.050642356276512146,
-0.025448346510529518,
-0.032384879887104034,
0.0417436808347702,
-0.004217499867081642,
0.21895207464694977,
-0.051290690898895264,
0.20717701315879822,
0.2011493295431137,
0.13180723786354065,
-0.14326642453670502,
0.029247265309095383,
-0.04037274792790413,
-0.0783512145280838,
0.06171826645731926,
-0.12476900964975357,
0.051648810505867004,
0.12936840951442719,
-0.029039284214377403,
0.08892890065908432,
0.0889657512307167,
0.03724951669573784,
-0.017982613295316696,
0.11946796625852585,
-0.21033525466918945,
-0.025769218802452087,
-0.034774940460920334,
0.013107064180076122,
0.0009374122018925846,
0.020777514204382896,
0.13996565341949463,
0.015723180025815964,
-0.022772133350372314,
0.007020921912044287,
0.0357683002948761,
-0.047973040491342545,
0.06942366063594818,
0.06467442214488983,
0.04960256442427635,
-0.13293279707431793,
0.06710204482078552,
0.0418163426220417,
-0.1280498504638672,
0.014496571384370327,
0.1247415691614151,
-0.1058838963508606,
-0.14931106567382812,
-0.024023916572332382,
0.036013245582580566,
-0.14277109503746033,
-0.05701334774494171,
-0.021869588643312454,
-0.10548139363527298,
0.09284594655036926,
0.11686733365058899,
0.04768547788262367,
0.06568172574043274,
0.004133080597966909,
-0.08169244229793549,
-0.03816276043653488,
0.021746262907981873,
-0.06444688886404037,
0.035893477499485016,
-0.07973475009202957,
0.043301500380039215,
-0.05270940065383911,
0.13468672335147858,
-0.07334981858730316,
-0.03134358674287796,
-0.13533319532871246,
0.02250358648598194,
-0.143463596701622,
-0.06609242409467697,
-0.11849843710660934,
-0.07043404132127762,
-0.009577051736414433,
-0.05106262117624283,
-0.08547917753458023,
-0.036422792822122574,
-0.12225906550884247,
-0.007341485004872084,
-0.02484443597495556,
0.08764253556728363,
-0.1303938627243042,
-0.023814206942915916,
0.06273606419563293,
-0.006605882663279772,
0.0972801148891449,
0.09228143095970154,
-0.10279765725135803,
0.06303033977746964,
-0.1528644859790802,
-0.10739484429359436,
0.06557968258857727,
0.05169232562184334,
0.05091310665011406,
0.04562756046652794,
-0.002666750457137823,
0.12421908229589462,
0.021409351378679276,
0.06232791766524315,
-0.05345902591943741,
-0.1201181411743164,
0.0022874881979078054,
-0.061819735914468765,
-0.1117481216788292,
-0.04344568029046059,
-0.053580429404973984,
0.06332118064165115,
0.007047520484775305,
0.1434321105480194,
-0.04333251342177391,
0.08794756978750229,
-0.12039278447628021,
0.013865480199456215,
-0.004587026312947273,
-0.1249186173081398,
-0.07953383028507233,
-0.06311972439289093,
0.04038494825363159,
-0.020744895562529564,
0.26753556728363037,
0.03366810455918312,
-0.04356691986322403,
0.05907316878437996,
0.094585120677948,
0.03675520420074463,
-0.006009276490658522,
0.25452834367752075,
0.0782548114657402,
-0.038895100355148315,
-0.01911812461912632,
0.05939365550875664,
-0.030696239322423935,
0.03749839961528778,
0.15091019868850708,
0.10550231486558914,
0.05108056217432022,
0.06035832688212395,
0.05205843225121498,
-0.01832648366689682,
-0.055783215910196304,
-0.11494290828704834,
-0.00979514792561531,
0.09202031046152115,
-0.04509690776467323,
0.006780486088246107,
0.16100016236305237,
-0.07258487492799759,
0.019216977059841156,
-0.06300599128007889,
-0.03426127880811691,
-0.17566771805286407,
-0.1689620018005371,
-0.09917178004980087,
-0.12134283781051636,
0.01595338061451912,
-0.12700411677360535,
0.07330816984176636,
0.10925912111997604,
0.058164577931165695,
-0.06613975763320923,
0.0070497868582606316,
0.0174102783203125,
-0.05150187015533447,
0.005122811533510685,
-0.04027840122580528,
0.09066271781921387,
-0.062174078077077866,
0.007460818160325289,
-0.0281455609947443,
-0.028234045952558517,
-0.001845132908783853,
0.05838620662689209,
-0.019912544637918472,
0.017689298838377,
-0.13861478865146637,
-0.09482070058584213,
-0.05437975749373436,
0.06927771866321564,
0.007631062529981136,
0.19997987151145935,
0.00736586470156908,
0.004833076614886522,
0.04987475648522377,
0.23029886186122894,
-0.08870245516300201,
-0.09727335721254349,
-0.056025657802820206,
0.1798146367073059,
0.04569375514984131,
0.05831773579120636,
-0.014302636496722698,
-0.010432909242808819,
-0.07151549309492111,
0.25864115357398987,
0.33013150095939636,
-0.07656551152467728,
0.031122373417019844,
0.0021376495715230703,
0.024862509220838547,
0.07207397371530533,
0.08593827486038208,
0.08320153504610062,
0.21674184501171112,
-0.07083208858966827,
0.0027148218359798193,
-0.040716297924518585,
-0.01854092627763748,
-0.08879993110895157,
0.0956454947590828,
0.009172295220196247,
-0.10757334530353546,
0.010709847323596478,
0.10582228004932404,
-0.19521726667881012,
0.1104414090514183,
-0.05555739626288414,
-0.16816256940364838,
-0.05816249921917915,
0.032885801047086716,
0.13649515807628632,
-0.004096912685781717,
0.053382836282253265,
-0.03603226691484451,
-0.041708141565322876,
0.02626687102019787,
-0.0162833072245121,
-0.19536267220973969,
0.009279076009988785,
0.07377510517835617,
-0.07092135399580002,
0.05251563340425491,
-0.003901907242834568,
0.08007408678531647,
0.09833817929029465,
0.0385119803249836,
-0.10006323456764221,
0.08626922965049744,
0.05448251590132713,
-0.0631776973605156,
0.035642191767692566,
-0.0034123151563107967,
0.013619130477309227,
-0.03781193867325783,
0.08331925421953201,
-0.15942645072937012,
0.035240620374679565,
-0.01850583404302597,
-0.004131359979510307,
-0.056949030607938766,
0.016596093773841858,
-0.021436253562569618,
0.10376919060945511,
0.04409776255488396,
-0.04567786678671837,
-0.025034567341208458,
-0.05288667604327202,
-0.02158624678850174,
-0.00894453190267086,
-0.12333741784095764,
-0.07494334131479263,
-0.1010018140077591,
-0.05817900970578194,
0.05577048659324646,
0.010927058756351471,
-0.1964353770017624,
0.006197096314281225,
-0.08192966878414154,
-0.0014282695483416319,
-0.1303800493478775,
0.06949006766080856,
0.10188306123018265,
-0.03274499252438545,
-0.016594277694821358,
-0.06807320564985275,
0.03188527747988701,
0.08896652609109879,
-0.12207575142383575,
-0.08115258812904358
] |
null | null | transformers | Shrek, with all 4 scripts! | {"tags": ["conversational"]} | text-generation | Poly-Pixel/shrek-medium-full | [
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Shrek, with all 4 scripts! | [] | [
"TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
56
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.028994612395763397,
0.03717942163348198,
-0.007205516565591097,
0.004361928440630436,
0.14950066804885864,
-0.013941794633865356,
0.11986828595399857,
0.1182805597782135,
-0.03048190474510193,
-0.010174466297030449,
0.14877668023109436,
0.1851094663143158,
-0.013957205228507519,
0.09307502955198288,
-0.09557180106639862,
-0.2245006561279297,
0.08883897960186005,
0.027438592165708542,
0.035971157252788544,
0.1274057924747467,
0.08356550335884094,
-0.06823034584522247,
0.06669498234987259,
-0.03636123612523079,
-0.12532266974449158,
0.0028578240890055895,
0.05958588048815727,
-0.1302003562450409,
0.12235744297504425,
0.03003060445189476,
0.10374733805656433,
0.043939314782619476,
-0.07261740416288376,
-0.15499833226203918,
0.03386516869068146,
0.03463131934404373,
-0.06291534006595612,
0.046142611652612686,
0.07618991285562515,
-0.093361496925354,
0.06937185674905777,
0.06204131990671158,
-0.012534034438431263,
0.05895763263106346,
-0.15713968873023987,
-0.04276531934738159,
-0.014816577546298504,
0.013026992790400982,
0.07869639992713928,
0.10855725407600403,
-0.03201347962021828,
0.14759007096290588,
-0.0851001888513565,
0.12150996178388596,
0.1239863857626915,
-0.32539400458335876,
-0.0010325099574401975,
0.07500352710485458,
0.042798060923814774,
0.0700213834643364,
-0.04765492305159569,
0.051571447402238846,
0.031836334615945816,
0.004194476641714573,
0.014558027498424053,
-0.05954115092754364,
-0.12596048414707184,
0.018643176183104515,
-0.09720765799283981,
-0.06140025332570076,
0.20382985472679138,
-0.0590277723968029,
0.051872193813323975,
-0.07813733071088791,
-0.12088852375745773,
-0.029457518830895424,
-0.018522758036851883,
-0.0014233867404982448,
-0.07191982120275497,
0.0697530135512352,
0.008914402686059475,
-0.048782069236040115,
-0.14170318841934204,
-0.03263361006975174,
-0.16206398606300354,
0.18776893615722656,
0.03174376115202904,
0.04489494860172272,
-0.19151368737220764,
0.09325390309095383,
0.040709663182497025,
-0.10056023299694061,
0.030602479353547096,
-0.10420837253332138,
0.0518723800778389,
0.013859134167432785,
-0.03356955200433731,
-0.07820286601781845,
0.12177305668592453,
0.10705813020467758,
-0.08302552998065948,
0.029910054057836533,
-0.0518684946000576,
0.07312697917222977,
0.02555178292095661,
0.06321559846401215,
0.0005397886270657182,
0.009785126894712448,
0.06927945464849472,
-0.08632822334766388,
0.029176287353038788,
-0.06690286844968796,
-0.12896926701068878,
-0.0064486730843782425,
0.09108948707580566,
0.11376697570085526,
0.012237145565450191,
0.12357603758573532,
-0.043104153126478195,
0.01189747080206871,
0.06860891729593277,
-0.06891996413469315,
-0.022417569532990456,
0.03244988992810249,
0.03606780990958214,
0.06666674464941025,
-0.006634891033172607,
0.02604227140545845,
-0.13079532980918884,
0.04091726988554001,
-0.0643448606133461,
-0.01847873069345951,
-0.02537871152162552,
-0.054979365319013596,
0.021013228222727776,
-0.05385638400912285,
0.0016376969870179892,
-0.17925472557544708,
-0.15229609608650208,
0.01025520171970129,
-0.02450299635529518,
-0.01414579525589943,
-0.04215415567159653,
-0.05549995228648186,
-0.038427844643592834,
0.025479240342974663,
-0.07407741248607635,
-0.0541660450398922,
-0.06597121059894562,
0.11745881289243698,
-0.03170236200094223,
0.0638091042637825,
-0.09621044993400574,
0.055343348532915115,
-0.1376529037952423,
-0.010226898826658726,
-0.07068921625614166,
0.06311444938182831,
-0.015582656487822533,
0.11676833033561707,
0.0014907608274370432,
-0.023834871128201485,
-0.0757647231221199,
0.05509824678301811,
-0.025720594450831413,
0.22712132334709167,
-0.06154930591583252,
-0.09178899973630905,
0.31081268191337585,
-0.11809492111206055,
-0.15338093042373657,
0.13023294508457184,
0.011604922823607922,
0.019147438928484917,
0.12250597029924393,
0.20399682223796844,
0.008194107562303543,
-0.010573046281933784,
0.0578976534307003,
0.09868865460157394,
-0.12109559029340744,
-0.03384355455636978,
0.0019291907083243132,
-0.027920013293623924,
-0.13384109735488892,
0.045637402683496475,
0.08118539303541183,
0.06196809560060501,
-0.05529056116938591,
-0.03310706093907356,
-0.032375507056713104,
-0.007541123311966658,
0.09128022193908691,
0.0010877466993406415,
0.08752452582120895,
-0.08452648669481277,
-0.04017898812890053,
-0.05404124781489372,
-0.005262788850814104,
-0.04702545702457428,
0.018141234293580055,
-0.06270471960306168,
0.11833393573760986,
-0.0026014288887381554,
0.06254255026578903,
-0.14835688471794128,
-0.11584768444299698,
-0.009780634194612503,
0.12789341807365417,
-0.01780831813812256,
0.04813294857740402,
0.06763497740030289,
0.00098732253536582,
-0.0011714716674759984,
-0.03454669937491417,
0.17249971628189087,
-0.012706821784377098,
-0.05934740975499153,
-0.06328863650560379,
0.09847947955131531,
-0.06798075139522552,
0.05744375288486481,
-0.07993299514055252,
0.025816364213824272,
0.06514972448348999,
0.09963318705558777,
0.008724996820092201,
0.028547506779432297,
-0.0013018660247325897,
0.004906942136585712,
-0.05867992341518402,
-0.00611899746581912,
0.10396798700094223,
0.011229002848267555,
-0.0740957260131836,
0.20066019892692566,
-0.21398082375526428,
0.24392159283161163,
0.19056081771850586,
-0.24192918837070465,
0.00981980562210083,
-0.09797576069831848,
-0.035239703953266144,
0.0204030629247427,
0.04077638313174248,
-0.043731484562158585,
0.11199628561735153,
-0.00907069444656372,
0.18191012740135193,
-0.0733426883816719,
-0.04988950863480568,
-0.0024024536833167076,
-0.06362885236740112,
-0.006515666376799345,
0.08168808370828629,
0.0420350655913353,
-0.14488229155540466,
0.19209423661231995,
0.1648620218038559,
0.03738253936171532,
0.18765953183174133,
-0.007599308155477047,
-0.0005212257383391261,
0.08825082331895828,
0.056330155581235886,
-0.0332566499710083,
-0.0683663859963417,
-0.20980527997016907,
-0.015365404076874256,
0.0680062547326088,
0.0452900193631649,
0.10409069806337357,
-0.1285317838191986,
-0.0476088747382164,
-0.02153710089623928,
-0.03222563490271568,
0.021310606971383095,
0.07516232877969742,
0.04943285137414932,
0.1463354378938675,
-0.03315149247646332,
-0.029199428856372833,
0.10530391335487366,
0.0018139067105948925,
-0.10505969077348709,
0.21006526052951813,
-0.12959107756614685,
-0.38478371500968933,
-0.12295942008495331,
-0.12465260177850723,
-0.05466725677251816,
0.05989878252148628,
0.11986995488405228,
-0.12428545951843262,
-0.026873940601944923,
-0.027530189603567123,
0.06363729387521744,
-0.03752683103084564,
0.01928102970123291,
-0.062066853046417236,
0.0373716838657856,
-0.06338763982057571,
-0.10153484344482422,
-0.04917367920279503,
-0.029283059760928154,
-0.10798602551221848,
0.16745351254940033,
-0.07413849234580994,
0.05456981807947159,
0.18967147171497345,
0.024961886927485466,
0.03510000556707382,
-0.053832922130823135,
0.1711120754480362,
-0.08332394063472748,
-0.019603034481406212,
0.19940707087516785,
-0.057942941784858704,
0.08007334917783737,
0.12594826519489288,
-0.015241099521517754,
-0.0892653539776802,
0.050643131136894226,
-0.033212289214134216,
-0.08026967942714691,
-0.23528216779232025,
-0.13507108390331268,
-0.08512603491544724,
0.11327257007360458,
0.018413247540593147,
0.06799782812595367,
0.1529865264892578,
0.07597990334033966,
-0.0492875799536705,
-0.044824402779340744,
0.0672951489686966,
0.07731044292449951,
0.1831343173980713,
-0.048216793686151505,
0.14399567246437073,
-0.04114715754985809,
-0.15805895626544952,
0.07400382310152054,
0.04073971137404442,
0.10478080809116364,
0.037516117095947266,
0.039527639746665955,
0.02364072948694229,
0.09820537269115448,
0.13571609556674957,
0.11845608055591583,
0.007558451034128666,
-0.039035581052303314,
-0.016731536015868187,
-0.03634432330727577,
-0.0670543760061264,
0.011746753938496113,
-0.010773980990052223,
-0.12803319096565247,
-0.06190115585923195,
-0.06618185341358185,
0.11947073042392731,
0.082245372235775,
0.054481759667396545,
-0.21059565246105194,
-0.009993447922170162,
0.09027878940105438,
-0.019997427240014076,
-0.12006835639476776,
0.10920628160238266,
0.04952099546790123,
-0.12109538167715073,
0.03819160908460617,
-0.04868942126631737,
0.09576742351055145,
-0.06145532801747322,
0.0944039523601532,
-0.09457848221063614,
-0.06062381714582443,
0.002222648821771145,
0.11850869655609131,
-0.2925296127796173,
0.2048967033624649,
-0.00892670638859272,
-0.025700481608510017,
-0.1052609458565712,
-0.000636346114333719,
0.002383036306127906,
0.0990152508020401,
0.13844019174575806,
-0.013388436287641525,
-0.018852530047297478,
-0.07502689957618713,
-0.04686504229903221,
0.043244775384664536,
0.12532752752304077,
-0.015467319637537003,
-0.00834830105304718,
-0.04835360869765282,
-0.0023646291811019182,
-0.018859002739191055,
-0.09980562329292297,
-0.006937874481081963,
-0.16253407299518585,
0.06168588250875473,
0.0530114583671093,
0.10794179141521454,
-0.013860982842743397,
-0.00854520220309496,
-0.11860840767621994,
0.22573307156562805,
-0.07790760695934296,
-0.10792146623134613,
-0.09785399585962296,
-0.06658799201250076,
0.020297683775424957,
-0.06326670944690704,
0.04842204228043556,
-0.07062158733606339,
0.04524936527013779,
-0.06655782461166382,
-0.18382880091667175,
0.11711519211530685,
-0.10763727873563766,
-0.07890895754098892,
-0.021923266351222992,
0.22087989747524261,
-0.049244802445173264,
-0.01755031757056713,
0.03608519211411476,
0.016882948577404022,
-0.09164203703403473,
-0.10763014107942581,
0.019699513912200928,
-0.017170218750834465,
0.06290043145418167,
0.04854103550314903,
-0.05951589718461037,
-0.09417112171649933,
-0.04073793813586235,
-0.006547942757606506,
0.32037001848220825,
0.19119004905223846,
-0.04242495819926262,
0.1651287078857422,
0.16958652436733246,
-0.05205022543668747,
-0.3448276221752167,
-0.10939286649227142,
-0.12942685186862946,
-0.06017755717039108,
-0.0542730912566185,
-0.13424967229366302,
0.07734859734773636,
0.03176725283265114,
-0.02984931506216526,
0.11289031058549881,
-0.2495477944612503,
-0.0830443948507309,
0.1708361953496933,
0.02978249453008175,
0.36560356616973877,
-0.15687896311283112,
-0.10042843967676163,
-0.04485407844185829,
-0.11313583701848984,
0.1638716757297516,
-0.10624252259731293,
0.08359898626804352,
0.0005173089448362589,
0.07685405761003494,
0.05813150852918625,
-0.05476165935397148,
0.08167819678783417,
-0.027241511270403862,
0.009587266482412815,
-0.11564526706933975,
-0.02990633435547352,
0.03377249091863632,
0.011620084755122662,
0.03751041740179062,
-0.062149230390787125,
0.04695598781108856,
-0.06142954155802727,
-0.0440547950565815,
-0.08005185425281525,
0.05958639085292816,
0.029754646122455597,
-0.06617877632379532,
0.0050559998489916325,
-0.065475232899189,
-0.00020704269991256297,
0.016612045466899872,
0.2049017995595932,
-0.06282583624124527,
0.19598184525966644,
0.12754030525684357,
0.13884544372558594,
-0.14263051748275757,
0.04750414565205574,
-0.05133480206131935,
-0.06960497051477432,
0.0812823697924614,
-0.06672589480876923,
0.06808197498321533,
0.09437867254018784,
-0.03965180739760399,
0.0742703527212143,
0.10401890426874161,
0.014220776036381721,
-0.0025037124287337065,
0.12674778699874878,
-0.2911945581436157,
-0.06027824059128761,
-0.05841008946299553,
0.0267078448086977,
0.0765918642282486,
0.12912607192993164,
0.18127864599227905,
0.01996750570833683,
-0.03108501434326172,
-0.018294807523489,
0.03991484269499779,
-0.027562161907553673,
0.06695644557476044,
0.004677923396229744,
0.029333269223570824,
-0.1450662910938263,
0.08025793731212616,
0.00001413424797647167,
-0.1466224044561386,
0.024967879056930542,
0.14475053548812866,
-0.13803942501544952,
-0.13489209115505219,
-0.04678435996174812,
0.09309624135494232,
-0.05887448415160179,
-0.04888263717293739,
-0.04676072299480438,
-0.15312238037586212,
0.055112265050411224,
0.14994119107723236,
0.052975382655858994,
0.10418030619621277,
-0.017471758648753166,
-0.017556704580783844,
-0.048256851732730865,
0.017785103991627693,
-0.0009936511050909758,
0.0022541533689945936,
-0.09053029865026474,
0.0744970515370369,
-0.01958519034087658,
0.10916736721992493,
-0.09695246815681458,
-0.06790593266487122,
-0.17211103439331055,
0.03275161236524582,
-0.09633186459541321,
-0.05772961676120758,
-0.09038986265659332,
-0.03978794440627098,
-0.010782967321574688,
-0.014750530011951923,
-0.025427671149373055,
-0.043428484350442886,
-0.09523502737283707,
0.04417814314365387,
-0.03300096467137337,
0.007264059968292713,
-0.10051228106021881,
0.00016086101823020726,
0.07025935500860214,
-0.03757777437567711,
0.16884271800518036,
0.13381977379322052,
-0.10865480452775955,
0.1102571040391922,
-0.21270614862442017,
-0.07441695034503937,
0.12528401613235474,
-0.006233865395188332,
0.015086804516613483,
0.07555849105119705,
0.017773650586605072,
0.09126552194356918,
0.00828908197581768,
0.05829174071550369,
0.03726828843355179,
-0.11326239258050919,
0.07704365253448486,
-0.01599319651722908,
-0.1294003278017044,
-0.04773728922009468,
-0.0730040892958641,
0.02420450933277607,
-0.010505115613341331,
0.12641505897045135,
-0.07865285873413086,
0.0906529352068901,
-0.06671575456857681,
0.024498600512742996,
0.026367289945483208,
-0.1884288191795349,
-0.08302449434995651,
-0.04471737518906593,
0.044647034257650375,
0.006506338249891996,
0.23851554095745087,
0.0235122200101614,
-0.010749533772468567,
0.034232404083013535,
0.05822164937853813,
0.059870053082704544,
0.025883706286549568,
0.19520971179008484,
0.08231765776872635,
-0.072471983730793,
-0.11401300132274628,
0.03745274990797043,
0.01811346225440502,
-0.060790419578552246,
0.12814179062843323,
0.036005791276693344,
-0.027478119358420372,
0.07087187469005585,
-0.01174217090010643,
0.015425745397806168,
-0.0816754475235939,
-0.14454202353954315,
-0.05953683704137802,
0.03253600746393204,
-0.02498267963528633,
0.1120469719171524,
0.1812705099582672,
-0.0013609747402369976,
0.015817290171980858,
-0.03041967749595642,
-0.05926791578531265,
-0.1756991595029831,
-0.14313766360282898,
-0.08775309473276138,
-0.12457139790058136,
0.014500590041279793,
-0.12164600193500519,
0.025044914335012436,
0.053533174097537994,
0.06415745615959167,
-0.06100451946258545,
0.14366786181926727,
0.07371728122234344,
-0.08218565583229065,
0.058793265372514725,
-0.02419847622513771,
0.07684588432312012,
0.01638939045369625,
-0.03992878273129463,
-0.07652272284030914,
0.011380000039935112,
0.005251304712146521,
0.062121931463479996,
-0.03596027195453644,
0.03569943830370903,
-0.1459643691778183,
-0.08668382465839386,
-0.03907736763358116,
0.0812806636095047,
-0.044997770339250565,
0.13921087980270386,
0.018269622698426247,
-0.027931133285164833,
0.05132003873586655,
0.2326163798570633,
-0.06087464466691017,
-0.09179307520389557,
-0.04085228219628334,
0.22599206864833832,
0.022856563329696655,
0.12190650403499603,
-0.009267003275454044,
-0.016304370015859604,
-0.05399281904101372,
0.34328603744506836,
0.2938794195652008,
-0.07042541354894638,
0.038264770060777664,
-0.023749660700559616,
0.036720097064971924,
0.09899816662073135,
0.12958809733390808,
0.11624071002006531,
0.3169666826725006,
-0.06609135121107101,
-0.021776242181658745,
-0.0068347458727657795,
-0.0052537512965500355,
-0.11657443642616272,
0.08425644785165787,
0.025790410116314888,
-0.043594297021627426,
-0.04758237302303314,
0.09097945690155029,
-0.22038395702838898,
0.11581593006849289,
-0.13331608474254608,
-0.15514566004276276,
-0.04885316640138626,
0.0155342323705554,
0.13479961454868317,
-0.0050827935338020325,
0.08605613559484482,
-0.0002486487210262567,
-0.09584292024374008,
0.03018942102789879,
0.021486632525920868,
-0.18027843534946442,
0.01188130583614111,
0.03497105464339256,
-0.05494539812207222,
0.05150381475687027,
-0.010404348373413086,
0.059441130608320236,
0.07335782796144485,
0.027757329866290092,
-0.03716479241847992,
0.0907382071018219,
0.0018811057088896632,
-0.07033063471317291,
0.024655047804117203,
0.03311711549758911,
0.023353727534413338,
-0.08417163044214249,
0.06316978484392166,
-0.15417876839637756,
0.04319261014461517,
-0.0027546107303351164,
-0.0546049028635025,
-0.015047412365674973,
0.028999146074056625,
-0.052553869783878326,
0.04826204851269722,
0.060669150203466415,
-0.015072288922965527,
0.01514197327196598,
-0.05358981341123581,
-0.006741818506270647,
-0.03718702495098114,
-0.08331039547920227,
-0.05798166245222092,
-0.1719702184200287,
-0.08313801884651184,
0.12965653836727142,
0.002070409944280982,
-0.22290737926959991,
0.026626020669937134,
-0.10338432341814041,
0.07114361226558685,
-0.19193696975708008,
0.06663238257169724,
0.08630798757076263,
0.018926870077848434,
-0.002647911896929145,
-0.011747708544135094,
0.04613884165883064,
0.10495155304670334,
-0.07489542663097382,
-0.0890762060880661
] |
null | null | transformers | Shrek | {"tags": ["conversational"]} | text-generation | Poly-Pixel/shrek-medium | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Shrek | [] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
51
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null | transformers |
# Shrek Small DialoGPT Model | {"tags": ["conversational"]} | text-generation | Poly-Pixel/shrek-test-small | [
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Shrek Small DialoGPT Model | [
"# Shrek Small DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Shrek Small DialoGPT Model"
] | [
56,
9
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Shrek Small DialoGPT Model"
] | [
-0.011032748967409134,
0.0620645247399807,
-0.005824731662869453,
0.001361696282401681,
0.13200294971466064,
-0.03271833807229996,
0.18520192801952362,
0.12119568139314651,
-0.10152306407690048,
-0.041385166347026825,
0.11203137040138245,
0.10196205228567123,
0.016667116433382034,
0.11609421670436859,
-0.05730152130126953,
-0.3101560175418854,
0.0876057893037796,
0.006219089031219482,
-0.0339878685772419,
0.11426828801631927,
0.09090276062488556,
-0.058429889380931854,
0.05556447058916092,
-0.012104322202503681,
-0.12426526844501495,
-0.03662148490548134,
0.01152569055557251,
-0.138597771525383,
0.11311649531126022,
0.02514776960015297,
0.060146670788526535,
0.03444840386509895,
-0.03690358251333237,
-0.08815377950668335,
0.05007842928171158,
0.007550427690148354,
0.003920039627701044,
0.04727772995829582,
0.021267998963594437,
-0.036463115364313126,
0.08873622119426727,
0.0795993059873581,
-0.001128596719354391,
0.045374609529972076,
-0.07497437298297882,
-0.12137055397033691,
0.04234273359179497,
0.11312371492385864,
0.08430349826812744,
0.10690093040466309,
-0.031849347054958344,
0.16096153855323792,
-0.08808985352516174,
0.10909228026866913,
0.1423802524805069,
-0.2855090796947479,
-0.013385591097176075,
0.12960395216941833,
0.018959488719701767,
0.06947751343250275,
-0.06924112141132355,
0.06406539678573608,
0.004541178699582815,
0.005009872373193502,
-0.006660541985183954,
-0.058071330189704895,
-0.05788550153374672,
-0.007568982429802418,
-0.12127958983182907,
-0.0012411285424605012,
0.21325746178627014,
-0.06816157698631287,
0.05190594121813774,
-0.10893363505601883,
-0.08240413665771484,
-0.03184233233332634,
-0.06328271329402924,
-0.028337419033050537,
-0.06624671071767807,
0.056760694831609726,
-0.061947256326675415,
-0.049023937433958054,
-0.13070547580718994,
-0.04215376824140549,
-0.1039237305521965,
0.16181698441505432,
0.06327545642852783,
0.04734569415450096,
-0.24410927295684814,
0.051566220819950104,
-0.025791633874177933,
-0.09242420643568039,
0.01898539997637272,
-0.09475954622030258,
0.05599434673786163,
0.03541687875986099,
-0.050443485379219055,
-0.026651399210095406,
0.1499810814857483,
0.17141999304294586,
-0.038023196160793304,
0.0354791060090065,
0.011715185828506947,
0.06133810058236122,
0.02824629470705986,
0.05886209011077881,
-0.028776980936527252,
-0.053309254348278046,
0.11086837202310562,
-0.019294003024697304,
0.0560564249753952,
-0.0539635494351387,
-0.16154801845550537,
-0.01863030157983303,
0.09469616413116455,
0.06350192427635193,
0.04277834668755531,
0.14054590463638306,
0.04923274740576744,
-0.014256968162953854,
0.10212037712335587,
-0.036173369735479355,
-0.057571448385715485,
0.05208602175116539,
0.007511556148529053,
0.10545345395803452,
-0.025307027623057365,
0.0493173822760582,
-0.08995290100574493,
0.05836142599582672,
-0.020876174792647362,
-0.011903972364962101,
0.024194540455937386,
-0.05602983385324478,
0.0013978509232401848,
0.018228663131594658,
0.007357047870755196,
-0.1970468908548355,
-0.13634450733661652,
0.0324961319565773,
-0.0750766396522522,
-0.02329869568347931,
-0.0905226469039917,
-0.07357247918844223,
-0.10519514232873917,
0.020554112270474434,
-0.020808251574635506,
-0.0891459658741951,
-0.044015660881996155,
0.0893847867846489,
-0.03450733795762062,
0.0990687906742096,
-0.11768418550491333,
0.04000169411301613,
-0.12810087203979492,
-0.029631296172738075,
-0.1421128213405609,
0.08936507254838943,
-0.0022147931158542633,
0.09031001478433609,
-0.007864626124501228,
-0.048670414835214615,
-0.12366687506437302,
0.037322960793972015,
-0.04180391505360603,
0.22154894471168518,
-0.09872928261756897,
-0.10393824428319931,
0.27921047806739807,
-0.14233648777008057,
-0.12813298404216766,
0.178837850689888,
0.008712011389434338,
0.053665708750486374,
0.1187736988067627,
0.23638711869716644,
0.04888484254479408,
0.0627022236585617,
0.02256295643746853,
0.10987381637096405,
-0.12066175043582916,
0.003951724618673325,
0.0551501028239727,
0.020559722557663918,
-0.049857016652822495,
0.03762727603316307,
0.0386841744184494,
0.056633368134498596,
-0.03052118979394436,
-0.023774970322847366,
-0.010018469765782356,
-0.042733240872621536,
0.10378237813711166,
-0.025033846497535706,
0.15721403062343597,
-0.056464191526174545,
-0.03407779708504677,
-0.04887598007917404,
0.04276343062520027,
-0.03408960625529289,
0.036982517689466476,
-0.08179526776075363,
0.09354235231876373,
0.01949279196560383,
0.0827501192688942,
-0.12156827747821808,
-0.039167094975709915,
-0.041847243905067444,
0.16363367438316345,
0.04386631399393082,
0.07346503436565399,
0.07177364826202393,
-0.028723303228616714,
-0.02063095010817051,
0.007004182785749435,
0.18016308546066284,
-0.009001762606203556,
-0.055681340396404266,
-0.08450833708047867,
0.0956621989607811,
-0.05175597965717316,
0.11685699224472046,
-0.05719653517007828,
0.05132697522640228,
0.0405241958796978,
0.09060229361057281,
-0.03182552009820938,
0.03376368433237076,
0.0331881120800972,
-0.03068915382027626,
-0.056728582829236984,
-0.026903629302978516,
0.11294283717870712,
0.028772274032235146,
-0.07578091323375702,
0.19755052030086517,
-0.22745157778263092,
0.17860303819179535,
0.19692057371139526,
-0.144758939743042,
-0.04054991528391838,
-0.07532285153865814,
-0.02169521525502205,
0.012825790792703629,
0.03026789426803589,
-0.06888284534215927,
0.15333665907382965,
-0.008294274099171162,
0.1840997040271759,
-0.05264447629451752,
-0.04605567082762718,
-0.033846259117126465,
-0.082530178129673,
0.06315290927886963,
0.0911012515425682,
0.03714242950081825,
-0.14327043294906616,
0.1603998988866806,
0.07002414762973785,
0.009433084167540073,
0.19723138213157654,
0.015152460895478725,
0.03287162631750107,
0.08963863551616669,
0.03135669231414795,
0.0026206488255411386,
-0.1009741798043251,
-0.15757176280021667,
-0.027002418413758278,
0.0807136669754982,
0.01923779956996441,
0.07684224098920822,
-0.11940542608499527,
-0.0331965833902359,
0.009757688269019127,
-0.0056038787588477135,
-0.0018616266315802932,
0.12133646756410599,
0.01967289298772812,
0.13887497782707214,
-0.01717408001422882,
-0.07571470737457275,
0.059644293040037155,
0.0067357695661485195,
-0.0851324275135994,
0.17477665841579437,
-0.10710034519433975,
-0.35006555914878845,
-0.13105516135692596,
-0.047786131501197815,
-0.041556499898433685,
0.047932978719472885,
0.11905870586633682,
-0.15366075932979584,
-0.006987552158534527,
-0.03442820534110069,
0.07366809993982315,
-0.10635724663734436,
0.018405700102448463,
-0.03472340479493141,
0.043256741017103195,
-0.11432152986526489,
-0.10192587226629257,
-0.0331403948366642,
-0.011753941886126995,
-0.0920872688293457,
0.14099079370498657,
-0.1012079045176506,
-0.02178410068154335,
0.23771528899669647,
0.022894984111189842,
0.04983862116932869,
-0.06784506887197495,
0.20578783750534058,
-0.06810179352760315,
0.024961959570646286,
0.1993042528629303,
-0.08246109634637833,
0.06862692534923553,
0.13172225654125214,
-0.008257263340055943,
-0.04791416972875595,
0.06148887798190117,
-0.05633055046200752,
-0.11086924374103546,
-0.1636681705713272,
-0.08219771832227707,
-0.06574881821870804,
0.1881849616765976,
-0.009370876476168633,
0.06028733775019646,
0.15633484721183777,
0.0913538932800293,
-0.053873803466558456,
-0.02494223229587078,
0.08624083548784256,
0.09468862414360046,
0.21570444107055664,
-0.02066592127084732,
0.1564064472913742,
-0.05033242702484131,
-0.13932748138904572,
0.04498329013586044,
-0.016424113884568214,
0.06491860747337341,
0.0005753927980549634,
0.05658211186528206,
0.0009814583463594317,
0.05677253007888794,
0.1712852567434311,
0.06915958225727081,
0.015660163015127182,
-0.04509299248456955,
-0.012422852218151093,
-0.043710850179195404,
-0.03786957263946533,
0.044523607939481735,
-0.008390008471906185,
-0.10860826820135117,
-0.031888384371995926,
0.08654243499040604,
0.06834879517555237,
0.10034987330436707,
0.03371984139084816,
-0.16809837520122528,
-0.05535218492150307,
0.049471765756607056,
-0.02295721508562565,
-0.030633384361863136,
0.0959494411945343,
0.10111084580421448,
-0.14002537727355957,
-0.007025524042546749,
-0.01415868941694498,
0.06395450979471207,
-0.01598627306520939,
0.09327467530965805,
-0.11252526938915253,
-0.0247216634452343,
-0.0017571767093613744,
0.08364837616682053,
-0.1891450434923172,
0.13815629482269287,
-0.030453357845544815,
-0.018312405794858932,
-0.08229368180036545,
-0.020391158759593964,
0.07111121714115143,
0.09618432074785233,
0.09588464349508286,
-0.008868580684065819,
-0.05073831230401993,
-0.0674324706196785,
-0.07805008441209793,
0.0261822622269392,
0.12983781099319458,
-0.04504964128136635,
0.004345201421529055,
-0.0857463851571083,
-0.011374366469681263,
0.004987125750631094,
-0.05013906955718994,
0.04558251053094864,
-0.13779886066913605,
0.05750900134444237,
0.13364651799201965,
0.0687749907374382,
-0.010471872054040432,
-0.0467088408768177,
-0.11413228511810303,
0.16320979595184326,
0.011931652203202248,
-0.13661471009254456,
-0.059149742126464844,
-0.02899710275232792,
0.03850659728050232,
-0.08060407638549805,
0.054673630744218826,
-0.06977421045303345,
0.06874657422304153,
-0.05486967787146568,
-0.16810347139835358,
0.06384365260601044,
-0.10985010117292404,
-0.1457812339067459,
0.0287010557949543,
0.20373298227787018,
-0.045803871005773544,
0.05083804950118065,
0.04202967882156372,
0.02155146934092045,
-0.07135812938213348,
-0.08899243175983429,
-0.0385221466422081,
0.021401692181825638,
-0.006250970531255007,
0.01831040158867836,
0.02548898383975029,
-0.07852452993392944,
-0.06672189384698868,
-0.011312706395983696,
0.3553096354007721,
0.19163745641708374,
-0.06180335581302643,
0.149009570479393,
0.12330317497253418,
0.009173615835607052,
-0.33261969685554504,
-0.1237817034125328,
-0.0844239890575409,
-0.08098592609167099,
-0.04188666492700577,
-0.10442807525396347,
0.12267822027206421,
-0.033235613256692886,
-0.020004715770483017,
0.041118305176496506,
-0.2997839152812958,
-0.11392934620380402,
0.16160166263580322,
-0.014965703710913658,
0.3713735044002533,
-0.16589048504829407,
-0.05461098998785019,
-0.04420185834169388,
-0.12491235882043839,
0.10358893126249313,
-0.08645112067461014,
0.11648572236299515,
-0.003964538220316172,
0.10036144405603409,
0.04334494099020958,
-0.035964660346508026,
0.06774544715881348,
-0.016775617375969887,
-0.044858064502477646,
-0.10856764018535614,
-0.07916993647813797,
0.008705047890543938,
0.010750454850494862,
0.04764707386493683,
-0.08939048647880554,
0.03819628432393074,
-0.028130169957876205,
-0.04034449905157089,
-0.09172777831554413,
0.025396090000867844,
0.03869692236185074,
-0.06759826093912125,
-0.052120331674814224,
-0.044862065464258194,
0.010553467087447643,
0.0076807718724012375,
0.158280149102211,
-0.07446789741516113,
0.15737107396125793,
0.08102075755596161,
0.137179896235466,
-0.09262080490589142,
-0.008068820461630821,
-0.06460881978273392,
-0.07627830654382706,
0.05999055877327919,
-0.08431990444660187,
0.02275010384619236,
0.08072999119758606,
0.0043370663188397884,
0.04480159655213356,
0.11029518395662308,
0.015644382685422897,
0.023569462820887566,
0.08008591085672379,
-0.19482380151748657,
-0.14447946846485138,
-0.05326031893491745,
0.09970492124557495,
0.033933330327272415,
0.12093669921159744,
0.18691514432430267,
-0.05969991534948349,
-0.0506364107131958,
0.0006347365560941398,
0.022852882742881775,
-0.009881395846605301,
0.0995681956410408,
0.0180511437356472,
0.055335670709609985,
-0.15621976554393768,
0.07773245871067047,
-0.02492818608880043,
-0.02884616330265999,
0.07991345971822739,
0.12855486571788788,
-0.1389310657978058,
-0.10720331221818924,
-0.02548409439623356,
0.04406435042619705,
-0.037917815148830414,
-0.03627398610115051,
-0.08121854811906815,
-0.14341841638088226,
0.022322293370962143,
0.10624201595783234,
0.05281020328402519,
0.05048219859600067,
-0.08485093712806702,
-0.002270268741995096,
-0.0741005465388298,
0.03969033434987068,
0.0647285208106041,
-0.0035417175386101007,
-0.10155943036079407,
0.14643719792366028,
-0.028851870447397232,
0.0678510069847107,
-0.09245184063911438,
-0.07824936509132385,
-0.1664889007806778,
0.024628017097711563,
-0.12568619847297668,
-0.023589853197336197,
-0.1149059534072876,
-0.06201196834445,
-0.00617999816313386,
-0.009121952578425407,
-0.02818787656724453,
-0.003754166653379798,
-0.10670121014118195,
0.022813761606812477,
-0.06115766242146492,
-0.024579109624028206,
-0.06686843931674957,
0.046336930245161057,
0.04619351401925087,
-0.025693699717521667,
0.15929506719112396,
0.13554400205612183,
-0.14975492656230927,
0.09025827795267105,
-0.1319979876279831,
-0.09721855819225311,
0.10647830367088318,
-0.015556017868220806,
-0.002716252813115716,
0.0696418285369873,
0.0018623985815793276,
0.04217076674103737,
0.0347275473177433,
0.06870009750127792,
0.09286986291408539,
-0.05702143907546997,
0.06608155369758606,
-0.06515245884656906,
-0.04551496356725693,
-0.04174782708287239,
-0.09958674758672714,
0.016447842121124268,
0.04293189197778702,
0.1298827826976776,
-0.09156700223684311,
0.08855006098747253,
-0.06158545985817909,
0.018834920600056648,
-0.004858822096139193,
-0.1843750923871994,
-0.035847652703523636,
-0.042980000376701355,
0.047283053398132324,
0.007675106171518564,
0.13464787602424622,
-0.01737707294523716,
-0.03317202255129814,
0.03223307058215141,
0.08217281848192215,
0.07585401087999344,
0.020614368841052055,
0.2260395735502243,
0.10750209540128708,
-0.08625097572803497,
-0.07543151825666428,
0.06163462996482849,
0.08519263565540314,
-0.061281681060791016,
0.1586948186159134,
0.02951762266457081,
-0.056892890483140945,
0.06965278089046478,
-0.01902048848569393,
0.05958963558077812,
-0.10399556905031204,
-0.12414437532424927,
-0.05449027195572853,
0.024884991347789764,
-0.06779053807258606,
0.1282300055027008,
0.2182368040084839,
0.03703421354293823,
0.004711734130978584,
-0.08912355452775955,
-0.041635602712631226,
-0.19354695081710815,
-0.16286994516849518,
-0.08312975615262985,
-0.15657958388328552,
-0.007303647231310606,
-0.10101176798343658,
-0.0327237993478775,
0.06145532429218292,
0.10066954791545868,
-0.045039102435112,
0.13478949666023254,
-0.01077490858733654,
-0.058287668973207474,
0.0660851001739502,
-0.06179952621459961,
0.022248271852731705,
0.04220573976635933,
-0.042963117361068726,
-0.01608189195394516,
0.028117166832089424,
0.035696785897016525,
0.02965637296438217,
-0.05873190239071846,
0.008836697787046432,
-0.1466657817363739,
-0.09902548789978027,
-0.03846542909741402,
0.05844458192586899,
-0.0026262239553034306,
0.1262177675962448,
0.044726938009262085,
-0.05125446617603302,
0.022075556218624115,
0.2613993287086487,
-0.06257399916648865,
-0.19051335752010345,
-0.09237807989120483,
0.15065082907676697,
-0.031200015917420387,
0.05412950739264488,
-0.034129608422517776,
-0.03185770660638809,
-0.07613587379455566,
0.32738450169563293,
0.3026042878627777,
-0.07515297830104828,
0.0503559336066246,
-0.029642554000020027,
0.02864375151693821,
0.026843519881367683,
0.12838374078273773,
0.12209834158420563,
0.2667533755302429,
-0.07037682831287384,
-0.019153021275997162,
-0.013966933824121952,
-0.040080536156892776,
-0.12270078808069229,
-0.016257476061582565,
0.049529626965522766,
-0.0404089130461216,
-0.03697849437594414,
0.07274554669857025,
-0.25355419516563416,
0.04578898102045059,
-0.21084155142307281,
-0.13777627050876617,
-0.03793540596961975,
0.015948595479130745,
0.08766528218984604,
0.032353512942790985,
0.08383201062679291,
0.0020887269638478756,
-0.03104100376367569,
0.02203151397407055,
-0.0026837841141968966,
-0.14508120715618134,
0.03917520120739937,
0.06711368262767792,
-0.12108750641345978,
-0.003034020308405161,
-0.03494821488857269,
0.034718021750450134,
0.09215616434812546,
0.044604621827602386,
-0.04581061378121376,
0.068061962723732,
-0.003104517003521323,
-0.042076386511325836,
0.009958869777619839,
0.11263947188854218,
0.0018929977668449283,
-0.045172810554504395,
0.08418877422809601,
-0.1474544107913971,
0.02377478964626789,
0.030801130458712578,
0.009420802816748619,
-0.048167433589696884,
0.03243091329932213,
-0.059332557022571564,
0.06361503899097443,
0.0696469396352768,
-0.05285325273871422,
-0.004117202013731003,
0.002132975962013006,
-0.00767045421525836,
-0.0020008389838039875,
-0.10490439832210541,
-0.06387653201818466,
-0.22079892456531525,
-0.09516943246126175,
0.028751567006111145,
0.012819026596844196,
-0.20761865377426147,
-0.000018396563973510638,
-0.15523761510849,
0.05938750505447388,
-0.13973243534564972,
0.06249226629734039,
0.09991084784269333,
0.02022942341864109,
0.01181462500244379,
0.004547593183815479,
0.010096197947859764,
0.1096363291144371,
-0.1298813670873642,
-0.10261042416095734
] |
null | null | transformers | This model generates the time-shift texts of Norbit Company and, like the base GPT model, also generates endings for the texts of any phrases. | {} | text-generation | PolyakovMaxim/ModelGptTS | [
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| This model generates the time-shift texts of Norbit Company and, like the base GPT model, also generates endings for the texts of any phrases. | [] | [
"TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
50
] | [
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.02061169408261776,
0.03669698163866997,
-0.007248206064105034,
0.012762377969920635,
0.1690322607755661,
0.03350625932216644,
0.09912695735692978,
0.1378507763147354,
0.006931114010512829,
-0.02730877511203289,
0.16016589105129242,
0.210490420460701,
-0.0018047182820737362,
0.07203522324562073,
-0.0626848116517067,
-0.27152273058891296,
0.051494795829057693,
0.06287337839603424,
-0.006572056096047163,
0.12476035207509995,
0.07706263661384583,
-0.058810923248529434,
0.09187597781419754,
-0.01914438046514988,
-0.17445695400238037,
0.023767365142703056,
0.05130406841635704,
-0.12089582532644272,
0.11222874373197556,
0.04935329034924507,
0.09586816281080246,
0.014551094733178616,
-0.061403561383485794,
-0.14227746427059174,
0.027511093765497208,
0.02571028470993042,
-0.06026925519108772,
0.06976787000894547,
0.10209541767835617,
-0.09534682333469391,
0.09160638600587845,
0.08213244378566742,
-0.02483462356030941,
0.054132550954818726,
-0.1637229174375534,
-0.08184890449047089,
-0.030146123841404915,
0.009252084419131279,
0.06836725026369095,
0.09418372809886932,
-0.014005177654325962,
0.11512795090675354,
-0.08648983389139175,
0.10137014836072922,
0.15515094995498657,
-0.3112814128398895,
-0.0025897729210555553,
0.08596665412187576,
0.056812409311532974,
0.05083649232983589,
-0.028057346120476723,
0.059737250208854675,
0.02277030050754547,
0.023691991344094276,
0.024889623746275902,
-0.08345351368188858,
-0.12221888452768326,
0.04607483372092247,
-0.08558143675327301,
-0.07045682519674301,
0.24011977016925812,
-0.06557567417621613,
0.05927315354347229,
-0.026034124195575714,
-0.09983419626951218,
-0.04822950065135956,
-0.025522518903017044,
0.0027340995147824287,
-0.05865494906902313,
0.08325810730457306,
0.03113705664873123,
-0.07104028761386871,
-0.12954817712306976,
-0.03331276401877403,
-0.1671232432126999,
0.16637788712978363,
0.017353568226099014,
0.059134289622306824,
-0.19971254467964172,
0.10599949955940247,
0.009768630377948284,
-0.09360894560813904,
0.031420230865478516,
-0.0966353788971901,
0.04421459138393402,
-0.002475510584190488,
-0.05345337465405464,
-0.07802210748195648,
0.07758434861898422,
0.1328297108411789,
0.005376500077545643,
0.0312935933470726,
-0.022861171513795853,
0.09267520904541016,
0.03198305144906044,
0.09125647693872452,
0.003466499038040638,
-0.02334674261510372,
0.05431690812110901,
-0.128060445189476,
-0.007325596176087856,
-0.07051635533571243,
-0.15026308596134186,
-0.0425361804664135,
0.059593502432107925,
0.08932037651538849,
0.019110143184661865,
0.08286140859127045,
-0.052690550684928894,
-0.031684745103120804,
0.05481104552745819,
-0.06541909277439117,
-0.0023946587461978197,
-0.0066881631501019,
0.028208354488015175,
0.13478249311447144,
-0.008862287737429142,
0.024262670427560806,
-0.12114688754081726,
0.05839782580733299,
-0.08044246584177017,
-0.0018729495350271463,
-0.04188903048634529,
-0.04908997192978859,
0.01962810568511486,
-0.08889313787221909,
0.023544808849692345,
-0.14767387509346008,
-0.17289353907108307,
0.014023632742464542,
0.015207161195576191,
-0.02661287784576416,
-0.055295780301094055,
-0.03635844588279724,
-0.02752472460269928,
0.05103516951203346,
-0.06349530816078186,
0.0072977435775101185,
-0.05553026497364044,
0.10184666514396667,
-0.03134933114051819,
0.06921332329511642,
-0.10158300399780273,
0.07680372148752213,
-0.12065500766038895,
-0.010678648948669434,
-0.09030061960220337,
0.0667608305811882,
-0.005207765847444534,
0.12495583295822144,
-0.02772742323577404,
-0.023418201133608818,
-0.06870874017477036,
0.052683956921100616,
-0.03466503322124481,
0.198461651802063,
-0.0751492977142334,
-0.12632763385772705,
0.2507952153682709,
-0.0663582980632782,
-0.14219015836715698,
0.09786481410264969,
0.011805753223598003,
0.04386255890130997,
0.09383031725883484,
0.1752246767282486,
0.02929861843585968,
-0.002063382649794221,
0.08640838414430618,
0.10012904554605484,
-0.10108412057161331,
-0.08980478346347809,
0.023798564448952675,
-0.02894214354455471,
-0.14016631245613098,
0.056434664875268936,
0.06615027785301208,
0.08355093747377396,
-0.04800274223089218,
-0.031239774078130722,
-0.0360761322081089,
0.00971299409866333,
0.055595025420188904,
0.014840207993984222,
0.12724317610263824,
-0.054204441606998444,
-0.03147004917263985,
-0.029795076698064804,
-0.010151172988116741,
-0.01988917589187622,
0.03365359082818031,
-0.021863849833607674,
0.13083113729953766,
-0.05663840472698212,
0.058342233300209045,
-0.18144778907299042,
-0.08069596439599991,
0.0038181261625140905,
0.12040943652391434,
-0.007066712249070406,
0.07556058466434479,
0.05802106857299805,
-0.03010028414428234,
-0.005254245363175869,
-0.011689078994095325,
0.1494489461183548,
-0.025504015386104584,
-0.0695682018995285,
-0.06697690486907959,
0.05832945927977562,
-0.059016112238168716,
-0.011436658911406994,
-0.06229717284440994,
0.014011423103511333,
0.032067783176898956,
0.10360895842313766,
0.004289672710001469,
0.028894901275634766,
-0.020906507968902588,
0.010454483330249786,
-0.08038664609193802,
0.004400757607072592,
0.09865622967481613,
-0.010968429036438465,
-0.052063871175050735,
0.2027982473373413,
-0.15390464663505554,
0.233114555478096,
0.19188177585601807,
-0.28178542852401733,
0.013452456332743168,
-0.057999882847070694,
-0.02458072640001774,
0.014820392243564129,
0.04814935103058815,
-0.02250209078192711,
0.11650584638118744,
-0.0003065110940951854,
0.18351061642169952,
-0.054600246250629425,
-0.05317075923085213,
0.003565638791769743,
-0.05164698138833046,
-0.0039711822755634785,
0.07296860963106155,
0.12584854662418365,
-0.14359883964061737,
0.19494381546974182,
0.20664817094802856,
0.03114785999059677,
0.16297315061092377,
0.006141228135675192,
-0.025006834417581558,
0.0717167779803276,
-0.020189831033349037,
-0.03723941743373871,
-0.06850926578044891,
-0.18136908113956451,
-0.028368933126330376,
0.08009158074855804,
0.04700847342610359,
0.0970572903752327,
-0.12121459096670151,
-0.04919075593352318,
-0.017726639285683632,
-0.0037948081735521555,
-0.001169883762486279,
0.09482266008853912,
0.04720060154795647,
0.11524532735347748,
-0.013394813053309917,
0.00007184622518252581,
0.10807308554649353,
0.017171379178762436,
-0.09143665432929993,
0.1945890337228775,
-0.12493730336427689,
-0.3475200831890106,
-0.15314428508281708,
-0.1694491058588028,
-0.03699450567364693,
0.05181937664747238,
0.1012069508433342,
-0.11282069236040115,
-0.028340332210063934,
0.017789945006370544,
0.09569206833839417,
-0.09760142862796783,
0.021629052236676216,
-0.08391507714986801,
0.04361793026328087,
-0.08041521906852722,
-0.07167459279298782,
-0.056972403079271317,
-0.021237103268504143,
-0.05630851536989212,
0.14537808299064636,
-0.10566911846399307,
0.04946158453822136,
0.1748725324869156,
0.041469868272542953,
0.052667371928691864,
-0.026084311306476593,
0.20342226326465607,
-0.10122719407081604,
-0.008247622288763523,
0.1996638923883438,
-0.03767416626214981,
0.07793038338422775,
0.1047254279255867,
0.008436969481408596,
-0.08692540973424911,
0.014031565748155117,
-0.03050277940928936,
-0.08918496966362,
-0.23751021921634674,
-0.10732308030128479,
-0.1305837780237198,
0.06702837347984314,
0.06369546800851822,
0.06166477128863335,
0.1605178564786911,
0.08636368811130524,
-0.02265070751309395,
0.051190104335546494,
0.005209398455917835,
0.08386892825365067,
0.19134031236171722,
-0.018688667565584183,
0.13220134377479553,
-0.047730229794979095,
-0.1284172236919403,
0.08534674346446991,
0.06895700097084045,
0.13253210484981537,
0.07302940636873245,
0.06437722593545914,
0.006450139917433262,
0.09158173203468323,
0.14580334722995758,
0.11763523519039154,
0.013675099238753319,
-0.02294873259961605,
-0.035648882389068604,
-0.014953047037124634,
-0.053832754492759705,
0.041831664741039276,
0.03041837364435196,
-0.14036796987056732,
-0.0522465854883194,
-0.11898676306009293,
0.07560022920370102,
0.10437406599521637,
0.06208758428692818,
-0.23143552243709564,
0.012645904906094074,
0.0920325219631195,
-0.033739786595106125,
-0.12553022801876068,
0.08267304301261902,
-0.02773478254675865,
-0.148908331990242,
0.04800388216972351,
-0.0624145083129406,
0.13054151833057404,
-0.06850045919418335,
0.08056096732616425,
-0.04174954071640968,
-0.057825472205877304,
0.02301604673266411,
0.11933999508619308,
-0.29898545145988464,
0.19811242818832397,
0.00232495809905231,
-0.055925529450178146,
-0.1032438725233078,
0.013398502953350544,
0.01214119978249073,
0.1131071001291275,
0.11060462892055511,
0.004075071774423122,
-0.054734937846660614,
-0.09965869784355164,
-0.025381751358509064,
0.032791636884212494,
0.11453904956579208,
-0.0661320760846138,
-0.008024964481592178,
-0.04672384262084961,
-0.005813688039779663,
-0.03299189358949661,
-0.04443271830677986,
0.006998900789767504,
-0.17404834926128387,
0.08311133086681366,
0.021644821390509605,
0.09458605200052261,
0.01686358079314232,
-0.020177142694592476,
-0.0930124819278717,
0.22706744074821472,
-0.07374903559684753,
-0.10036850720643997,
-0.11869792640209198,
-0.05789005383849144,
0.06681498885154724,
-0.07128075510263443,
0.04802818223834038,
-0.08300348371267319,
0.024066224694252014,
-0.051810044795274734,
-0.2141922116279602,
0.12876592576503754,
-0.09854454547166824,
-0.0426534079015255,
-0.04808230325579643,
0.18427522480487823,
-0.07560548186302185,
0.007441832683980465,
0.014283985830843449,
0.03377733752131462,
-0.12088590115308762,
-0.09571054577827454,
0.03321368247270584,
-0.0054462980479002,
0.050365518778562546,
0.02398831397294998,
-0.06449484080076218,
0.013354619033634663,
-0.027336502447724342,
-0.007998697459697723,
0.31066620349884033,
0.16510078310966492,
-0.04284334182739258,
0.17947670817375183,
0.11826936155557632,
-0.09111739695072174,
-0.3014727234840393,
-0.09573464840650558,
-0.10117041319608688,
-0.03300357609987259,
-0.04562511295080185,
-0.21774791181087494,
0.0805845707654953,
0.02981768362224102,
-0.016042208299040794,
0.15843164920806885,
-0.24728429317474365,
-0.07573903352022171,
0.14059032499790192,
-0.0033480850979685783,
0.36902642250061035,
-0.1301729530096054,
-0.10661002993583679,
-0.049214623868465424,
-0.14853917062282562,
0.16302582621574402,
-0.006122821941971779,
0.09923173487186432,
-0.03277380019426346,
0.0951908603310585,
0.04296580329537392,
-0.04584183171391487,
0.09171538800001144,
0.006283751223236322,
-0.00015285445260815322,
-0.09827379137277603,
-0.026431895792484283,
0.04596385359764099,
0.006172254215925932,
0.015664206817746162,
-0.05151783674955368,
0.024717671796679497,
-0.13768158853054047,
-0.04218921437859535,
-0.08312345296144485,
0.05613607540726662,
0.0378592424094677,
-0.06245051324367523,
0.012870991602540016,
-0.06329932808876038,
-0.015552804805338383,
0.006438991520553827,
0.22675301134586334,
-0.033524125814437866,
0.16034509241580963,
0.06924859434366226,
0.0977221205830574,
-0.13415151834487915,
-0.0030943630263209343,
-0.07359358668327332,
-0.0585312657058239,
0.0910639688372612,
-0.12273677438497543,
0.05962078645825386,
0.11274706572294235,
-0.04377524182200432,
0.06500444561243057,
0.11012120544910431,
0.004117070697247982,
-0.003137144260108471,
0.12473985552787781,
-0.25908008217811584,
0.015647539868950844,
-0.07494742423295975,
-0.024182267487049103,
0.08232303708791733,
0.07148353010416031,
0.16246774792671204,
0.024140827357769012,
-0.05674751475453377,
-0.0011056308867409825,
0.012844313867390156,
-0.04069126397371292,
0.0659845769405365,
0.010237254202365875,
0.020188752561807632,
-0.14966484904289246,
0.07341181486845016,
0.020635386928915977,
-0.1390792727470398,
0.00947505235671997,
0.16722126305103302,
-0.13189104199409485,
-0.11809033155441284,
-0.024501128122210503,
0.10207509994506836,
-0.12840311229228973,
-0.01538072805851698,
-0.0530657097697258,
-0.12485695630311966,
0.08780299127101898,
0.11384674161672592,
0.07223537564277649,
0.08963492512702942,
-0.046553220599889755,
-0.03367399424314499,
-0.03635541722178459,
-0.012573964893817902,
-0.005112584214657545,
0.023647962138056755,
-0.08710068464279175,
0.020647486671805382,
-0.014909865334630013,
0.1446734219789505,
-0.08746474981307983,
-0.07236666977405548,
-0.15646818280220032,
0.03405974432826042,
-0.10871405899524689,
-0.07473097741603851,
-0.08840084820985794,
-0.051927600055933,
-0.011308991350233555,
-0.0237440038472414,
-0.04571209102869034,
-0.04849827662110329,
-0.12665939331054688,
0.012751449830830097,
-0.04703500121831894,
0.02897617593407631,
-0.06266307830810547,
-0.003384709358215332,
0.09275015443563461,
-0.04187775403261185,
0.13208845257759094,
0.13593345880508423,
-0.07172486186027527,
0.12254272401332855,
-0.1241949051618576,
-0.08505946397781372,
0.10708841681480408,
0.018022065982222557,
0.039706673473119736,
0.06602154672145844,
0.027937114238739014,
0.06251784414052963,
0.023262426257133484,
0.0474555529654026,
-0.0034392299130558968,
-0.1278984397649765,
0.026975933462381363,
-0.026938216760754585,
-0.15527313947677612,
-0.05361103639006615,
-0.045469336211681366,
0.04235553741455078,
0.01838851533830166,
0.1166498214006424,
-0.03758706524968147,
0.11192993074655533,
-0.06871756166219711,
0.02620946429669857,
-0.0016389728989452124,
-0.19030974805355072,
-0.06729240715503693,
-0.08063044399023056,
0.027664629742503166,
0.01087124552577734,
0.2522471249103546,
0.03634938970208168,
0.018269522115588188,
0.022480595856904984,
0.08995942771434784,
0.03782389685511589,
0.01657526195049286,
0.20189790427684784,
0.1176978051662445,
-0.05808348208665848,
-0.09545315057039261,
0.08679695427417755,
0.024981701746582985,
0.011926673352718353,
0.12499669194221497,
0.02542710304260254,
0.015265305526554585,
0.09474188834428787,
-0.027163418009877205,
0.011014972813427448,
-0.09048577398061752,
-0.12104497104883194,
-0.012157267890870571,
0.06607268750667572,
-0.0005205549532547593,
0.09510093182325363,
0.15160807967185974,
-0.01614399254322052,
0.03269355744123459,
-0.01990874856710434,
-0.044518157839775085,
-0.17706270515918732,
-0.15518343448638916,
-0.07865231484174728,
-0.12877032160758972,
0.006389371585100889,
-0.10272641479969025,
0.0437617301940918,
0.06735121458768845,
0.055535778403282166,
-0.061280637979507446,
0.08018633723258972,
0.0908743217587471,
-0.10815306007862091,
0.06396238505840302,
-0.03378984332084656,
0.05308237299323082,
-0.010930529795587063,
-0.012166712433099747,
-0.0929059311747551,
-0.004868659656494856,
-0.008535288274288177,
0.04750073328614235,
-0.05832274630665779,
0.02753910794854164,
-0.15148714184761047,
-0.11618328839540482,
-0.04823146015405655,
0.06859103590250015,
-0.0567200593650341,
0.10416112095117569,
0.0015089651569724083,
-0.014318077825009823,
0.03907795622944832,
0.214002326130867,
-0.0627247542142868,
-0.030963636934757233,
-0.03922291100025177,
0.22120635211467743,
0.040702469646930695,
0.10107730329036713,
-0.013604391366243362,
0.008694403804838657,
-0.07015835493803024,
0.35390228033065796,
0.29603973031044006,
-0.070329949259758,
0.010815161280333996,
0.030772194266319275,
0.030525315552949905,
0.12354307621717453,
0.13173502683639526,
0.08878594636917114,
0.26048561930656433,
-0.08683076500892639,
-0.033120445907115936,
-0.024800274521112442,
-0.015878379344940186,
-0.0880107581615448,
0.10199514776468277,
0.0479244664311409,
-0.06818252801895142,
-0.0319574736058712,
0.09539607912302017,
-0.23561538755893707,
0.1563696265220642,
-0.08077402412891388,
-0.1598164439201355,
-0.06550628691911697,
0.0020895323250442743,
0.11123234033584595,
0.011316700838506222,
0.08370231091976166,
-0.011379145085811615,
-0.09968001395463943,
0.06325476616621017,
0.023829400539398193,
-0.23007269203662872,
-0.023487107828259468,
0.06765052676200867,
-0.0536172091960907,
0.012333041988313198,
-0.020697347819805145,
0.046463679522275925,
0.06474317610263824,
0.044528279453516006,
-0.04973433166742325,
0.014270495623350143,
-0.008215518668293953,
-0.03517768532037735,
0.018696090206503868,
0.04877908155322075,
0.02597798965871334,
-0.10980300605297089,
0.06050827354192734,
-0.11806613951921463,
0.040548477321863174,
-0.06373075395822525,
-0.03647841513156891,
0.00396439665928483,
0.0009551440016366541,
-0.055222492665052414,
0.0567188560962677,
0.08332794904708862,
0.00017091783229261637,
-0.01667456328868866,
-0.07881699502468109,
-0.009257469326257706,
0.0017395061440765858,
-0.06986054033041,
-0.10769271850585938,
-0.13010230660438538,
-0.10757219046354294,
0.11583194881677628,
-0.011943322606384754,
-0.19320523738861084,
0.014474074356257915,
-0.09977549314498901,
0.04724235460162163,
-0.17902211844921112,
0.08024095743894577,
0.0743112713098526,
0.016707729548215866,
-0.003201504237949848,
-0.03637106716632843,
0.050247397273778915,
0.08070636540651321,
-0.1089664176106453,
-0.08408985286951065
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-colab-1
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on a dataset recorded by the Trainer as `None`; the model name suggests the TIMIT corpus.
It achieves the following results on the evaluation set:
- Loss: 0.3857
- Wer: 0.3874
## Model description
More information needed
## Intended uses & limitations
More information needed
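In the absence of further documentation, the checkpoint can presumably be used for English speech transcription through the standard `transformers` pipeline. The snippet below is a minimal, hedged sketch: the model id is taken from this repository, the audio path is a placeholder, and wav2vec2-base expects 16 kHz mono audio.

```python
from transformers import pipeline

# Minimal ASR sketch (assumption: the checkpoint is published under this Hub id)
asr = pipeline("automatic-speech-recognition", model="Prasadi/wav2vec2-base-timit-demo-colab-1")

# "sample.wav" is a placeholder; supply a 16 kHz mono recording
transcription = asr("sample.wav")
print(transcription["text"])
```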
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
- mixed_precision_training: Native AMP
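The settings above map roughly onto Hugging Face `TrainingArguments`. The snippet below is a reconstruction for illustration only, not the author's actual training script; the output directory name is assumed from the model id, and the Adam betas/epsilon listed are the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above
training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo-colab-1",  # assumed name
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=10,
    fp16=True,  # Native AMP mixed-precision training
)
```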
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4285 | 2.01 | 500 | 1.4732 | 0.9905 |
| 0.7457 | 4.02 | 1000 | 0.5278 | 0.4960 |
| 0.3463 | 6.02 | 1500 | 0.4245 | 0.4155 |
| 0.2034 | 8.03 | 2000 | 0.3857 | 0.3874 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-base-timit-demo-colab-1", "results": []}]} | automatic-speech-recognition | Prasadi/wav2vec2-base-timit-demo-colab-1 | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-base-timit-demo-colab-1
================================
This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3857
* Wer: 0.3874
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 10
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.10.0+cu111
* Datasets 1.13.3
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
56,
130,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
-0.10893619805574417,
0.09748413413763046,
-0.0033303694799542427,
0.06409821659326553,
0.10900694131851196,
-0.020108716562390327,
0.12768511474132538,
0.15096279978752136,
-0.09062102437019348,
0.0752064511179924,
0.12604914605617523,
0.15106390416622162,
0.042435579001903534,
0.14634254574775696,
-0.0498492531478405,
-0.2829059660434723,
0.04722132906317711,
0.034879349172115326,
-0.011889349669218063,
0.12555208802223206,
0.08372936397790909,
-0.1250658631324768,
0.057686947286129,
0.03502152860164642,
-0.1575699746608734,
-0.0047253877855837345,
-0.0051316022872924805,
-0.1049635261297226,
0.12320539355278015,
0.005658661015331745,
0.07052726298570633,
0.049272533506155014,
0.06469093263149261,
-0.21807821094989777,
0.006466241087764502,
0.04404636472463608,
0.02831733040511608,
0.07468938827514648,
0.05992172658443451,
-0.02912953682243824,
0.10265522450208664,
-0.07468295842409134,
0.08150901645421982,
0.03793526068329811,
-0.10531166195869446,
-0.2924458682537079,
-0.08553381264209747,
0.04771994426846504,
0.06928925216197968,
0.08745841681957245,
-0.012644752860069275,
0.14285778999328613,
-0.05583849176764488,
0.11022614687681198,
0.28150391578674316,
-0.3127036988735199,
-0.04497503116726875,
-0.03872185945510864,
0.058110594749450684,
0.05958019196987152,
-0.10073870420455933,
-0.018236825242638588,
0.01567312888801098,
0.044484157115221024,
0.1388005018234253,
-0.015712914988398552,
-0.061082128435373306,
-0.007137814536690712,
-0.14910557866096497,
-0.059290651232004166,
0.11490578204393387,
0.022137820720672607,
-0.038639556616544724,
-0.09953931719064713,
-0.055695079267024994,
-0.21424438059329987,
-0.06801192462444305,
-0.01688453182578087,
0.04275430738925934,
-0.043299056589603424,
-0.10308994352817535,
-0.011606071144342422,
-0.0666663646697998,
-0.07369814068078995,
-0.039051104336977005,
0.1886546015739441,
0.05735180154442787,
-0.0019916207529604435,
-0.03780898451805115,
0.075741246342659,
-0.021936384961009026,
-0.1383534073829651,
-0.024282075464725494,
0.03601384907960892,
-0.022005988284945488,
-0.01541538443416357,
-0.04161302372813225,
-0.05788735672831535,
0.022197214886546135,
0.16212975978851318,
-0.10018875449895859,
0.09657566249370575,
-0.020131563767790794,
0.03920581564307213,
-0.10327023267745972,
0.2082051783800125,
-0.042556703090667725,
0.018738072365522385,
-0.009068741463124752,
0.0565560907125473,
0.029071981087327003,
-0.02644362300634384,
-0.09466023743152618,
0.03152843564748764,
0.12237554043531418,
0.047011733055114746,
-0.04668863117694855,
0.06635940074920654,
-0.03343292325735092,
-0.010302945040166378,
0.0019734331872314215,
-0.11283060163259506,
0.03690129145979881,
0.02067277394235134,
-0.0655788704752922,
0.004199628252536058,
0.014218246564269066,
0.007240237668156624,
-0.05468326434493065,
0.08409249037504196,
-0.0622343048453331,
0.03297043219208717,
-0.05709724500775337,
-0.12716546654701233,
0.025953983888030052,
-0.1180858239531517,
-0.0029842816293239594,
-0.1004144698381424,
-0.10076632350683212,
-0.011973535642027855,
0.036605991423130035,
-0.0379963181912899,
-0.025814196094870567,
-0.07901480793952942,
-0.09221591800451279,
0.045018214732408524,
-0.03388407826423645,
0.07167600095272064,
-0.07447849214076996,
0.09302101284265518,
0.03341663256287575,
0.0879642590880394,
-0.013920977711677551,
0.0606360025703907,
-0.0713067278265953,
0.027442889288067818,
-0.20110902190208435,
0.07530193030834198,
-0.08992845565080643,
0.06013756990432739,
-0.12428472936153412,
-0.11433404684066772,
0.022415408864617348,
-0.0070470180362463,
0.09852556884288788,
0.09813623130321503,
-0.17135722935199738,
-0.08855998516082764,
0.20746810734272003,
-0.08236775547266006,
-0.08407244086265564,
0.12507781386375427,
-0.024745116010308266,
-0.0005834067706018686,
0.056645166128873825,
0.2595701217651367,
0.0479736365377903,
-0.12648408114910126,
0.006545466836541891,
-0.04136360064148903,
0.0428619384765625,
-0.035115111619234085,
0.05823848769068718,
-0.026728210970759392,
0.06831666827201843,
0.019184034317731857,
-0.003203786676749587,
0.03668622300028801,
-0.08694063127040863,
-0.07700842618942261,
-0.04427528381347656,
-0.07793698459863663,
0.031221216544508934,
0.03363581374287605,
0.06364709883928299,
-0.1177477017045021,
-0.10841552168130875,
0.03823309764266014,
0.08075226843357086,
-0.1042558029294014,
0.0711970329284668,
-0.1197207123041153,
0.08519934862852097,
-0.015124100260436535,
-0.004495002795010805,
-0.1897118091583252,
0.035331886261701584,
0.03849118947982788,
-0.030142661184072495,
0.03903232887387276,
-0.063440702855587,
0.07815370708703995,
0.04582173749804497,
-0.026184862479567528,
-0.0468401201069355,
-0.00838792696595192,
0.010924738831818104,
-0.09054312109947205,
-0.20683030784130096,
-0.038235973566770554,
-0.03839433565735817,
0.0792163610458374,
-0.1408541351556778,
0.034842632710933685,
0.07732132822275162,
0.09215155988931656,
0.03296269103884697,
-0.031181663274765015,
-0.0006306541617959738,
0.08889157325029373,
-0.020956512540578842,
-0.06442900747060776,
0.057869356125593185,
0.018877321854233742,
-0.08620324730873108,
0.03875327482819557,
-0.15053309500217438,
0.1272881031036377,
0.14747904241085052,
-0.014298365451395512,
-0.06618554145097733,
0.0009469652432017028,
-0.04804397001862526,
-0.03460896387696266,
-0.002967613050714135,
0.03225143998861313,
0.21514055132865906,
0.013149063102900982,
0.14323540031909943,
-0.08929020911455154,
-0.042573828250169754,
0.05096318945288658,
-0.020757142454385757,
-0.006678491830825806,
0.11791913211345673,
0.045171163976192474,
-0.05179613083600998,
0.11928196251392365,
0.09184478968381882,
-0.07880926877260208,
0.11978532373905182,
-0.0603591613471508,
-0.07462400943040848,
-0.01928740181028843,
0.005229495000094175,
0.023649290204048157,
0.09791860729455948,
-0.16485168039798737,
-0.04030673950910568,
0.026498092338442802,
0.02577969618141651,
0.02042793482542038,
-0.20846062898635864,
0.014863141812384129,
0.028193091973662376,
-0.0845721960067749,
-0.043795641511678696,
0.0019942636135965586,
0.013135980814695358,
0.09443571418523788,
0.01169179193675518,
-0.09433363378047943,
0.010282471776008606,
0.00454594474285841,
-0.07296998798847198,
0.1763066202402115,
-0.11735270172357559,
-0.17690977454185486,
-0.10361144691705704,
-0.09231559932231903,
-0.03990723192691803,
-0.0026228756178170443,
0.08972781896591187,
-0.09227596968412399,
-0.039980698376894,
-0.0840628445148468,
-0.015716342255473137,
-0.0271257683634758,
0.041482504457235336,
0.030358515679836273,
-0.010796762071549892,
0.06319975852966309,
-0.11728695034980774,
-0.02124706283211708,
-0.03969604894518852,
-0.0023546197917312384,
0.054932527244091034,
0.03751185163855553,
0.10824059695005417,
0.1587626338005066,
-0.010328183881938457,
0.04973471164703369,
-0.047008220106363297,
0.18648792803287506,
-0.07515628635883331,
-0.0360233448445797,
0.10947850346565247,
-0.00590214179828763,
0.06881111860275269,
0.11970304697751999,
0.04749004915356636,
-0.09761743992567062,
-0.01420553494244814,
0.0032007901463657618,
-0.04491254314780235,
-0.21625371277332306,
-0.03467411547899246,
-0.044869400560855865,
0.001109856995753944,
0.10609694570302963,
0.041352372616529465,
0.03960799798369408,
0.022825470194220543,
0.03134377673268318,
0.006251488346606493,
0.004256733227521181,
0.09642934799194336,
0.13062861561775208,
0.03935437649488449,
0.13368739187717438,
-0.0381806381046772,
-0.03645402938127518,
0.02933439239859581,
0.004963126499205828,
0.23229184746742249,
0.01910192333161831,
0.1909266710281372,
0.05665363371372223,
0.17481307685375214,
0.041214972734451294,
0.06749255955219269,
-0.0012006196193397045,
-0.011317015625536442,
0.011407146230340004,
-0.052117783576250076,
-0.0382419154047966,
0.02339126169681549,
0.02412588894367218,
0.010270847007632256,
-0.11469676345586777,
-0.0125151127576828,
0.04755556210875511,
0.34973379969596863,
0.02872643992304802,
-0.33791691064834595,
-0.09075122326612473,
-0.012254809960722923,
-0.08586983382701874,
-0.02996229939162731,
0.045837707817554474,
0.08956113457679749,
-0.08048947155475616,
0.06479861587285995,
-0.06274877488613129,
0.09030976891517639,
-0.06264236569404602,
0.034844834357500076,
0.03798186033964157,
0.072503961622715,
0.004078423138707876,
0.034549418836832047,
-0.2935788631439209,
0.2808564007282257,
0.005426778458058834,
0.07752112299203873,
-0.06185652315616608,
0.008198012597858906,
0.02550392784178257,
0.01733372174203396,
0.08714032173156738,
-0.02617994137108326,
-0.12219885736703873,
-0.17587143182754517,
-0.0930035412311554,
0.011769886128604412,
0.12849532067775726,
0.013459344394505024,
0.1103069931268692,
-0.010746510699391365,
-0.01776973344385624,
0.0495564267039299,
-0.0956457182765007,
-0.06551291048526764,
-0.09204288572072983,
0.01248364057391882,
0.08087654411792755,
0.03273778036236763,
-0.07206133008003235,
-0.10326563566923141,
-0.09122885763645172,
0.1481870710849762,
-0.05346466228365898,
-0.043166931718587875,
-0.11885076761245728,
0.008261908777058125,
0.11038659512996674,
-0.07957877218723297,
0.06237580254673958,
0.009383801370859146,
0.1041044071316719,
0.011021970771253109,
-0.06887367367744446,
0.12022120505571365,
-0.0647083967924118,
-0.1686258614063263,
-0.02869221568107605,
0.14605985581874847,
0.028707917779684067,
0.06037687137722969,
-0.007426967844367027,
0.038613952696323395,
-0.021677030250430107,
-0.07700444012880325,
0.04073549807071686,
0.02826051227748394,
0.04390014708042145,
-0.014594045467674732,
-0.019695790484547615,
-0.008685760200023651,
-0.09158625453710556,
-0.01702594757080078,
0.20610924065113068,
0.24277961254119873,
-0.09664013236761093,
0.0935550108551979,
0.07000548392534256,
-0.04267077147960663,
-0.17243610322475433,
-0.003995796199887991,
0.06566494703292847,
-0.0005223318003118038,
-0.025130730122327805,
-0.1928097903728485,
0.024274196475744247,
0.07071875035762787,
-0.020608052611351013,
0.08487506210803986,
-0.31850752234458923,
-0.14090962707996368,
0.1365574449300766,
0.11392008513212204,
0.058142323046922684,
-0.14685478806495667,
-0.05534921586513519,
-0.0099282031878829,
-0.10290876775979996,
0.09427142143249512,
-0.07493823766708374,
0.13544930517673492,
-0.023395756259560585,
0.0887022390961647,
0.011829501949250698,
-0.058248791843652725,
0.10610758513212204,
0.013199776411056519,
0.060008302330970764,
-0.045864176005125046,
0.01636222004890442,
0.04929564148187637,
-0.0629962906241417,
0.054831281304359436,
-0.07915598899126053,
0.02726616896688938,
-0.07984086126089096,
-0.03256363049149513,
-0.08483263105154037,
0.014325597323477268,
-0.009682360105216503,
-0.033181726932525635,
-0.037045225501060486,
0.0010644580470398068,
0.0626981109380722,
-0.010412238538265228,
0.15496332943439484,
-0.026745470240712166,
0.12715983390808105,
0.1609535664319992,
0.10072589665651321,
-0.105362668633461,
-0.07713943719863892,
0.005980517249554396,
-0.03397730365395546,
0.054879676550626755,
-0.11843237280845642,
0.03789779543876648,
0.1363283395767212,
0.031383905559778214,
0.12198742479085922,
0.07022903859615326,
-0.06575959920883179,
0.03380357474088669,
0.04158738628029823,
-0.13717585802078247,
-0.12838798761367798,
0.013546413742005825,
0.023037029430270195,
-0.07240943610668182,
0.07448292523622513,
0.11450471729040146,
-0.056713882833719254,
-0.01436548214405775,
-0.0026638703420758247,
0.014628480188548565,
-0.03957747295498848,
0.19612711668014526,
0.03685164824128151,
0.06215417757630348,
-0.12474563717842102,
0.081478051841259,
0.03986138105392456,
-0.13672837615013123,
0.06014205887913704,
0.10593532025814056,
-0.09512370079755783,
-0.029635854065418243,
0.02838340401649475,
0.11150173097848892,
-0.022760681807994843,
-0.07361126691102982,
-0.14189280569553375,
-0.14425808191299438,
0.10931460559368134,
0.20534740388393402,
0.056035712361335754,
0.01696191541850567,
-0.05738924816250801,
0.017872536554932594,
-0.11903311312198639,
0.07067105919122696,
0.040828097611665726,
0.06063232570886612,
-0.13004913926124573,
0.14649340510368347,
0.01743873953819275,
0.04037591814994812,
-0.014748967252671719,
-0.01086017582565546,
-0.11261040717363358,
0.038833316415548325,
-0.1263439804315567,
0.005859901662915945,
-0.06752847135066986,
0.0007610032334923744,
0.004332386422902346,
-0.04993284121155739,
-0.06395282596349716,
0.033621907234191895,
-0.12005016207695007,
-0.02301926724612713,
0.0012078031431883574,
0.03714817017316818,
-0.12874962389469147,
-0.010465186089277267,
0.014220330864191055,
-0.09467309713363647,
0.09779947251081467,
0.08687611669301987,
-0.03317098319530487,
0.05088706687092781,
-0.06249592825770378,
-0.026516227051615715,
0.07895702123641968,
-0.006769316270947456,
0.05132270231842995,
-0.1302960067987442,
-0.018985098227858543,
0.010569444857537746,
0.034819912165403366,
0.02374717779457569,
0.11060106754302979,
-0.11603942513465881,
-0.00007896741590229794,
-0.028150958940386772,
-0.052189167588949203,
-0.06854104995727539,
0.04985252395272255,
0.10809578001499176,
0.025234339758753777,
0.16444939374923706,
-0.0929667204618454,
0.0298151932656765,
-0.1667831391096115,
0.005627328064292669,
-0.015836799517273903,
-0.12312457710504532,
-0.04940648749470711,
-0.03152232617139816,
0.07874302566051483,
-0.06389956921339035,
0.12996424734592438,
-0.030500225722789764,
0.026765689253807068,
0.037261202931404114,
-0.07591051608324051,
-0.054611463099718094,
0.04130050912499428,
0.2045167088508606,
0.038793206214904785,
-0.042847637087106705,
0.07304735481739044,
0.021297002211213112,
0.08175473660230637,
0.12787438929080963,
0.17338140308856964,
0.1603381335735321,
0.06166326627135277,
0.11710401624441147,
0.05404984951019287,
-0.05392390862107277,
-0.1737736463546753,
0.09349136054515839,
-0.0611550509929657,
0.1317937672138214,
-0.014396735467016697,
0.24096262454986572,
0.12153232097625732,
-0.15366144478321075,
0.06646513938903809,
-0.020097846165299416,
-0.08933904767036438,
-0.115386962890625,
-0.06598813086748123,
-0.08695618808269501,
-0.17615842819213867,
0.00864709634333849,
-0.1020166352391243,
0.06180248782038689,
0.047598715871572495,
0.03791351616382599,
0.016541091725230217,
0.1380651295185089,
0.01590622030198574,
0.003762580454349518,
0.09157715737819672,
-0.002985724713653326,
-0.05644422397017479,
-0.07292294502258301,
-0.08419039100408554,
0.03452179953455925,
-0.01316397450864315,
0.0579698346555233,
-0.00459741335362196,
-0.06967590749263763,
0.047912903130054474,
-0.03935248404741287,
-0.09698675572872162,
0.02361983060836792,
0.02094593271613121,
0.06938973814249039,
0.04962220415472984,
0.03428085520863533,
-0.042379073798656464,
-0.002084972569718957,
0.1956450343132019,
-0.09404097497463226,
-0.09328385442495346,
-0.10976632684469223,
0.2520409822463989,
0.03998861089348793,
-0.016703994944691658,
0.022078927606344223,
-0.06093935668468475,
-0.03085613064467907,
0.2138718217611313,
0.17371921241283417,
-0.00820072554051876,
0.004060343839228153,
-0.014259271323680878,
-0.006265906151384115,
-0.03680772706866264,
0.07996837794780731,
0.14681364595890045,
0.06110847368836403,
-0.06303247809410095,
-0.051524143666028976,
-0.04927130416035652,
-0.03517991304397583,
-0.06782756745815277,
0.07578691095113754,
0.006599199492484331,
-0.025364641100168228,
-0.04391542822122574,
0.06406264007091522,
-0.09399997442960739,
-0.08220457285642624,
0.024676483124494553,
-0.1957794725894928,
-0.1490885615348816,
0.004907464142888784,
0.07192020863294601,
0.012925762683153152,
0.035296786576509476,
0.003482057247310877,
-0.010423154570162296,
0.08088105916976929,
-0.001673146616667509,
-0.0811004489660263,
-0.06551624089479446,
0.08339935541152954,
-0.13201962411403656,
0.1672319769859314,
-0.042543355375528336,
0.04790891706943512,
0.12325328588485718,
0.08863534033298492,
-0.0810258761048317,
0.08532547205686569,
0.04333293437957764,
-0.10723839700222015,
0.02234266884624958,
0.15278860926628113,
-0.03369292989373207,
0.09253628551959991,
0.030926251783967018,
-0.1151500716805458,
0.015358681790530682,
-0.09025710076093674,
-0.0378841832280159,
-0.040321849286556244,
-0.048412568867206573,
-0.04303225502371788,
0.10876388847827911,
0.1632472723722458,
-0.04399731382727623,
0.003374726977199316,
-0.05285883694887161,
0.01097825076431036,
0.046334974467754364,
-0.0008881237590685487,
-0.06149919331073761,
-0.2792009711265564,
0.011444668285548687,
0.03645234555006027,
0.003696479368954897,
-0.2550848126411438,
-0.09571481496095657,
0.012797256000339985,
-0.04355285316705704,
-0.08937468379735947,
0.08525782078504562,
0.07536923885345459,
0.04594158008694649,
-0.05271485075354576,
-0.05805967375636101,
-0.03571630269289017,
0.1898808628320694,
-0.17503690719604492,
-0.05965632572770119
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-marc-en
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the amazon_reviews_multi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9575
- Mae: 0.5488
## Model description
More information needed
## Intended uses & limitations
More information needed
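Given the fine-tuning data (the `amazon_reviews_multi` dataset, English subset judging by the `-en` suffix) and the MAE metric, the model presumably predicts review star ratings. Below is a minimal, hedged usage sketch: the Hub id is taken from this repository, and the output labels are assumed to be the generic `LABEL_0`–`LABEL_4`, corresponding to 1–5 stars.

```python
from transformers import pipeline

# Minimal sketch (assumption: the checkpoint is available under this Hub id)
classifier = pipeline("text-classification", model="Pratibha/xlm-roberta-base-finetuned-marc-en")

# Labels are likely LABEL_0..LABEL_4, i.e. 1..5 star ratings (assumption)
print(classifier("This product exceeded my expectations."))
```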
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mae |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1253 | 1.0 | 235 | 0.9960 | 0.5366 |
| 0.9708 | 2.0 | 470 | 0.9575 | 0.5488 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
| {"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["amazon_reviews_multi"], "model-index": [{"name": "xlm-roberta-base-finetuned-marc-en", "results": []}]} | text-classification | Pratibha/xlm-roberta-base-finetuned-marc-en | [
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"dataset:amazon_reviews_multi",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #xlm-roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-mit #autotrain_compatible #endpoints_compatible #region-us
| xlm-roberta-base-finetuned-marc-en
==================================
This model is a fine-tuned version of xlm-roberta-base on the amazon\_reviews\_multi dataset.
It achieves the following results on the evaluation set:
* Loss: 0.9575
* Mae: 0.5488
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.9.0+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #xlm-roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] | [
67,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #xlm-roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] | [
-0.09092789888381958,
0.08008227497339249,
-0.0020140453707426786,
0.11630697548389435,
0.18312716484069824,
0.042973749339580536,
0.15040470659732819,
0.11954569816589355,
-0.09022784978151321,
-0.0003494977136142552,
0.11352355778217316,
0.17042438685894012,
0.007949714548885822,
0.1317906379699707,
-0.06562875211238861,
-0.25790008902549744,
-0.012251557782292366,
0.05035068839788437,
-0.04488401114940643,
0.1443592607975006,
0.10154645889997482,
-0.1380293369293213,
0.09442190825939178,
-0.0014341471251100302,
-0.19770415127277374,
-0.006765956524759531,
0.029228247702121735,
-0.06890206784009933,
0.13384534418582916,
0.03764583170413971,
0.13645893335342407,
0.008102459833025932,
0.07276447862386703,
-0.19063866138458252,
0.020796533674001694,
0.040146905928850174,
0.00358709879219532,
0.0915832370519638,
0.030548246577382088,
-0.01468250248581171,
0.1342829167842865,
-0.060973599553108215,
0.07154899835586548,
0.018368558958172798,
-0.11795462667942047,
-0.2320529818534851,
-0.08308214694261551,
0.035912688821554184,
0.056772612035274506,
0.09991798549890518,
-0.010324102826416492,
0.15634198486804962,
-0.07674280554056168,
0.10339420288801193,
0.23605166375637054,
-0.2893300950527191,
-0.07612571865320206,
0.032290682196617126,
0.043305903673172,
0.08403892815113068,
-0.10349797457456589,
-0.023395158350467682,
0.05919168144464493,
0.05649252235889435,
0.12055753171443939,
-0.0452197901904583,
-0.0962030366063118,
0.01583736389875412,
-0.1441667675971985,
-0.02332693338394165,
0.2023565173149109,
0.03447432816028595,
-0.0476268008351326,
-0.051082272082567215,
-0.032434288412332535,
-0.15748977661132812,
-0.03979404643177986,
-0.0009673985186964273,
0.050246383994817734,
-0.06319781392812729,
-0.08705104142427444,
-0.013781961984932423,
-0.11613631248474121,
-0.05173107236623764,
-0.06630995124578476,
0.1457367241382599,
0.04109196364879608,
0.01682303659617901,
-0.03500403091311455,
0.10437536239624023,
0.021311579272150993,
-0.10318823158740997,
0.012504742480814457,
0.007507571950554848,
-0.010289235971868038,
-0.047606464475393295,
-0.05751515179872513,
-0.07956288009881973,
0.002544892020523548,
0.11920338124036789,
-0.04774501919746399,
0.03242870792746544,
0.03772571310400963,
0.057246528565883636,
-0.07498431205749512,
0.19655898213386536,
-0.028955459594726562,
-0.005452427081763744,
-0.004732458386570215,
0.04949004575610161,
0.015602247789502144,
-0.010551849380135536,
-0.12953022122383118,
0.007022026460617781,
0.08074092119932175,
0.013663754798471928,
-0.07587581127882004,
0.06431995332241058,
-0.06985332071781158,
-0.04672382026910782,
-0.007498918566852808,
-0.07484535127878189,
0.031198130920529366,
-0.008710284717381,
-0.06582239270210266,
-0.02350885048508644,
0.023388126865029335,
0.017721518874168396,
-0.011746599338948727,
0.13322429358959198,
-0.08970562368631363,
0.0364038459956646,
-0.09379757940769196,
-0.10690733790397644,
0.021213319152593613,
-0.07686057686805725,
0.0376054085791111,
-0.10856878012418747,
-0.16822496056556702,
-0.03304174169898033,
0.0522976890206337,
-0.018100610002875328,
-0.060430899262428284,
-0.03577180206775665,
-0.06308238208293915,
0.01012183167040348,
-0.014289181679487228,
0.1470746546983719,
-0.07050348073244095,
0.11098764836788177,
0.03432513028383255,
0.05846457928419113,
-0.04605408012866974,
0.04961748793721199,
-0.09303298592567444,
-0.008509560488164425,
-0.15352317690849304,
0.03393903747200966,
-0.04447499290108681,
0.058807726949453354,
-0.07169647514820099,
-0.11825202405452728,
0.013603618368506432,
0.019700555130839348,
0.04256633669137955,
0.07442475855350494,
-0.1713005006313324,
-0.07580258697271347,
0.14970633387565613,
-0.06509901583194733,
-0.12265316396951675,
0.11653491109609604,
-0.08050192892551422,
0.06815876066684723,
0.07918455451726913,
0.16007547080516815,
0.07368943095207214,
-0.07665113359689713,
0.02364281751215458,
-0.009748673066496849,
0.030511032789945602,
-0.06656751781702042,
0.07645123451948166,
0.023808009922504425,
-0.011088239029049873,
0.031931594014167786,
-0.03572938218712807,
0.036782167851924896,
-0.09431610256433487,
-0.08854455500841141,
-0.03681464493274689,
-0.09542662650346756,
0.05960068479180336,
0.07206001877784729,
0.07265763729810715,
-0.11765731126070023,
-0.07257198542356491,
0.07150136679410934,
0.0861012265086174,
-0.055003076791763306,
0.018849531188607216,
-0.05219917744398117,
0.06374433636665344,
-0.034731317311525345,
-0.022515803575515747,
-0.17951369285583496,
-0.029770378023386,
0.014603286981582642,
0.005661679431796074,
0.032073505222797394,
0.040834296494722366,
0.05372710898518562,
0.04150041192770004,
-0.07131427526473999,
-0.011015200987458229,
-0.050375696271657944,
-0.00942130945622921,
-0.1230582743883133,
-0.19584792852401733,
-0.018969720229506493,
-0.023339437320828438,
0.11454646289348602,
-0.224257692694664,
0.03413281589746475,
-0.04092243313789368,
0.05761338770389557,
0.041867028921842575,
-0.010956901125609875,
-0.02053735964000225,
0.0860079899430275,
-0.03713130205869675,
-0.0327489897608757,
0.07592474669218063,
0.012195399962365627,
-0.10368473827838898,
-0.007822113111615181,
-0.09257585555315018,
0.19031088054180145,
0.1289455145597458,
-0.09699749946594238,
-0.0888260006904602,
0.010719056241214275,
-0.054551877081394196,
-0.03350850194692612,
-0.08110085129737854,
0.03831710293889046,
0.1832561194896698,
-0.00408615218475461,
0.1422782838344574,
-0.08589011430740356,
-0.04746617004275322,
0.027460463345050812,
-0.04416185989975929,
0.026127975434064865,
0.14056192338466644,
0.12522448599338531,
-0.0920635238289833,
0.1394202560186386,
0.14817063510417938,
-0.07915978133678436,
0.1658279448747635,
-0.03801234811544418,
-0.059139613062143326,
-0.024806562811136246,
-0.03590410575270653,
-0.011826027184724808,
0.1085469201207161,
-0.12760300934314728,
0.00472189811989665,
0.03235438093543053,
0.009446932934224606,
0.01708807982504368,
-0.23087909817695618,
-0.04802200570702553,
0.035222526639699936,
-0.040130965411663055,
-0.011457022279500961,
0.006225543096661568,
0.01636500284075737,
0.11100597679615021,
-0.00038215177482925355,
-0.061102356761693954,
0.04150799661874771,
0.007206903304904699,
-0.09109006822109222,
0.21807080507278442,
-0.0752849280834198,
-0.18252205848693848,
-0.13199250400066376,
-0.0493457093834877,
-0.04442271217703819,
-0.00279906764626503,
0.06433742493391037,
-0.07138606905937195,
-0.02895044907927513,
-0.06548784673213959,
0.00514746131375432,
-0.006640486419200897,
0.016602864488959312,
-0.018567554652690887,
0.023830769583582878,
0.03936237096786499,
-0.10331819206476212,
-0.012889090925455093,
-0.061911795288324356,
-0.040967509150505066,
0.053883109241724014,
0.04405555874109268,
0.10898144543170929,
0.14961715042591095,
-0.025291262194514275,
-0.003893762594088912,
-0.03315175324678421,
0.21485087275505066,
-0.08689753711223602,
-0.04712153226137161,
0.13125620782375336,
-0.009326517581939697,
0.03263324499130249,
0.1212800070643425,
0.0720895454287529,
-0.09237991273403168,
0.017520809546113014,
0.02917098067700863,
-0.03997639939188957,
-0.27003076672554016,
-0.03821174427866936,
-0.053288307040929794,
0.0005041555850766599,
0.07316083461046219,
0.026278546079993248,
0.005705300718545914,
0.06592023372650146,
0.04250522330403328,
0.0648341029882431,
-0.02982121892273426,
0.06391338258981705,
0.1108853667974472,
0.03844940662384033,
0.13148561120033264,
-0.05558411031961441,
-0.06147214397788048,
0.05758168175816536,
-0.00863972119987011,
0.24782785773277283,
0.011279144324362278,
0.1309511810541153,
0.07623305916786194,
0.12350870668888092,
0.017918558791279793,
0.05768585205078125,
0.018591217696666718,
-0.03858204931020737,
-0.019616344943642616,
-0.025811797007918358,
-0.029816756024956703,
0.0286216102540493,
-0.04727308079600334,
0.048704832792282104,
-0.13749583065509796,
-0.01498402375727892,
0.06358642131090164,
0.23906491696834564,
0.016769928857684135,
-0.30908310413360596,
-0.10424860566854477,
0.010606772266328335,
-0.05240930989384651,
-0.009383879601955414,
0.026137301698327065,
0.10281414538621902,
-0.12598705291748047,
0.03643062710762024,
-0.08053163439035416,
0.09221653640270233,
-0.0863085463643074,
0.04050378501415253,
0.0738224908709526,
0.0681130588054657,
-0.003933573141694069,
0.07893651723861694,
-0.307219922542572,
0.2819614112377167,
-0.005618869327008724,
0.060745105147361755,
-0.06372545659542084,
-0.025851668789982796,
0.023402828723192215,
0.05463678762316704,
0.06036457046866417,
-0.005185297690331936,
-0.05821243301033974,
-0.17296744883060455,
-0.029245417565107346,
0.025523608550429344,
0.07566779851913452,
-0.01468990370631218,
0.08854345232248306,
-0.0285579115152359,
0.004089497961103916,
0.05787508934736252,
-0.027434229850769043,
-0.05153360217809677,
-0.09460210800170898,
-0.004334294702857733,
0.020693570375442505,
-0.05909181386232376,
-0.06367843598127365,
-0.13336031138896942,
-0.08024092018604279,
0.13815522193908691,
-0.014427115209400654,
-0.04591428115963936,
-0.09696020931005478,
0.07496039569377899,
0.06935662031173706,
-0.0799306333065033,
0.03762155771255493,
0.014699560590088367,
0.0846717432141304,
0.024481261149048805,
-0.047440964728593826,
0.09554848819971085,
-0.05173030123114586,
-0.1872195154428482,
-0.0632166862487793,
0.11352117359638214,
0.028094131499528885,
0.06719598174095154,
-0.023858340457081795,
0.0004107730055693537,
-0.04823746904730797,
-0.08825484663248062,
0.02258949913084507,
0.007237046025693417,
0.08538832515478134,
0.04420587047934532,
-0.06016400828957558,
0.003088439116254449,
-0.0743371769785881,
-0.05789945647120476,
0.20305874943733215,
0.20633313059806824,
-0.09303376823663712,
0.032080233097076416,
0.01414012722671032,
-0.08177021145820618,
-0.17220793664455414,
0.03629900887608528,
0.07108122855424881,
0.012489903718233109,
0.05826587229967117,
-0.15110467374324799,
0.11386826634407043,
0.09753286093473434,
-0.008590045385062695,
0.13361698389053345,
-0.323248952627182,
-0.13557180762290955,
0.09210297465324402,
0.15564033389091492,
0.12722596526145935,
-0.13530485332012177,
-0.012024758383631706,
-0.029694128781557083,
-0.12655147910118103,
0.13825254142284393,
-0.08200353384017944,
0.14067378640174866,
-0.03298668563365936,
0.10618506371974945,
0.0052995807491242886,
-0.05460384488105774,
0.11506109684705734,
0.01607188954949379,
0.10979824513196945,
-0.05073171481490135,
-0.046968698501586914,
0.018168210983276367,
-0.03173650801181793,
0.017488637939095497,
-0.07388205081224442,
0.019537346437573433,
-0.09553373605012894,
-0.037904515862464905,
-0.07616972178220749,
0.03510139882564545,
-0.04053482040762901,
-0.05432239547371864,
-0.04073890298604965,
0.035612355917692184,
0.02205091342329979,
-0.017490994185209274,
0.14471615850925446,
0.005916844122111797,
0.14710642397403717,
0.06948163360357285,
0.09639938920736313,
-0.05343913659453392,
-0.09279846400022507,
-0.03582580387592316,
-0.021688245236873627,
0.049793485552072525,
-0.15473158657550812,
0.02326696179807186,
0.14285890758037567,
0.012413830496370792,
0.15901656448841095,
0.07501823455095291,
-0.028941627591848373,
0.015591477043926716,
0.06824849545955658,
-0.15109407901763916,
-0.0993746891617775,
-0.015658222138881683,
-0.09098188579082489,
-0.11272766441106796,
0.04547811672091484,
0.11424396187067032,
-0.06779132783412933,
-0.027168378233909607,
-0.013252581469714642,
0.009434499777853489,
-0.04961276799440384,
0.19228704273700714,
0.0712907612323761,
0.049355633556842804,
-0.10086462646722794,
0.08726470172405243,
0.05299781262874603,
-0.07277260720729828,
0.009131514467298985,
0.07398980855941772,
-0.0851946696639061,
-0.06054844334721565,
0.06302937865257263,
0.1840636432170868,
-0.06436847895383835,
-0.05052271485328674,
-0.14428043365478516,
-0.12239868193864822,
0.08020304143428802,
0.15456198155879974,
0.1154261901974678,
0.01174027007073164,
-0.04472504183650017,
-0.009678967297077179,
-0.10332822054624557,
0.10373563319444656,
0.06035935878753662,
0.06799294799566269,
-0.15564770996570587,
0.11893093585968018,
0.0298626646399498,
0.0544048435986042,
-0.021874960511922836,
0.03503105044364929,
-0.11320466548204422,
0.016281502321362495,
-0.11635188013315201,
-0.004599275998771191,
-0.01955498568713665,
0.0156586654484272,
0.00008569054625695571,
-0.056630246341228485,
-0.06948243826627731,
0.011811119504272938,
-0.12271115183830261,
-0.015396937727928162,
0.041357602924108505,
0.07619098573923111,
-0.08720040321350098,
-0.03770965710282326,
0.024497678503394127,
-0.04467649757862091,
0.07077261805534363,
0.04765259474515915,
0.00999519880861044,
0.0638277679681778,
-0.1326751559972763,
0.03493008390069008,
0.05847730115056038,
0.016229216009378433,
0.048695411533117294,
-0.1218823567032814,
0.00844301376491785,
0.004147431813180447,
0.07234194129705429,
0.02527628093957901,
0.06878162175416946,
-0.1595860719680786,
-0.003925286699086428,
-0.011753080412745476,
-0.08088759332895279,
-0.0604778528213501,
0.02060185931622982,
0.06034849211573601,
0.033461686223745346,
0.21250495314598083,
-0.08307280391454697,
0.04318675398826599,
-0.19975832104682922,
0.00521842809394002,
-0.01949070766568184,
-0.1242818534374237,
-0.12428144365549088,
-0.0736192986369133,
0.05655497685074806,
-0.0671464130282402,
0.1680191457271576,
0.04778936877846718,
0.05581874027848244,
0.02484714426100254,
-0.020287757739424706,
-0.0074821035377681255,
0.016732243821024895,
0.17049984633922577,
0.007073113229125738,
-0.04048845171928406,
0.0606084018945694,
0.047959793359041214,
0.1063975840806961,
0.10674457252025604,
0.20010076463222504,
0.1684790700674057,
0.009575174190104008,
0.08692093193531036,
0.03743763640522957,
-0.03279959410429001,
-0.13300663232803345,
0.03713468834757805,
-0.025708554312586784,
0.11290872097015381,
-0.026694100350141525,
0.20042958855628967,
0.07072245329618454,
-0.16473351418972015,
0.04714856669306755,
-0.05892984941601753,
-0.08779802173376083,
-0.11389470845460892,
-0.055804088711738586,
-0.09887007623910904,
-0.1443217545747757,
0.005623009521514177,
-0.130331888794899,
-0.001939242472872138,
0.09170602262020111,
0.007379705086350441,
-0.04041507467627525,
0.11972035467624664,
0.02042819932103157,
0.011828257702291012,
0.08732693642377853,
0.013573730364441872,
-0.03270769864320755,
-0.10997237265110016,
-0.04921284690499306,
-0.03101533092558384,
-0.025611599907279015,
0.023357538506388664,
-0.05341451242566109,
-0.06802772730588913,
0.024218278005719185,
-0.026913153007626534,
-0.10152031481266022,
0.014489524997770786,
0.02225584164261818,
0.07951844483613968,
0.03816826641559601,
0.015252734534442425,
0.008539740927517414,
-0.0018916655099019408,
0.2537987232208252,
-0.06090321019291878,
-0.059095606207847595,
-0.12073633074760437,
0.23759934306144714,
0.04082411155104637,
-0.027152735739946365,
0.0369359627366066,
-0.0620994009077549,
0.004789397120475769,
0.250545471906662,
0.23370525240898132,
-0.07233811914920807,
-0.008881565183401108,
0.016480514779686928,
-0.005681920796632767,
-0.014903892762959003,
0.12409383058547974,
0.11327847838401794,
0.043661732226610184,
-0.07554518431425095,
-0.03618474677205086,
-0.053929403424263,
0.002410672837868333,
-0.017594728618860245,
0.06780397146940231,
0.05220600590109825,
0.005234327167272568,
-0.041317231953144073,
0.0750744640827179,
-0.08238773792982101,
-0.11706630140542984,
0.04748406261205673,
-0.2140689343214035,
-0.17265373468399048,
-0.01564285345375538,
0.09141164273023605,
-0.0005080309347249568,
0.06623675674200058,
-0.025556398555636406,
-0.014778113923966885,
0.07295584678649902,
-0.016154099255800247,
-0.1069135069847107,
-0.08071832358837128,
0.09760671108961105,
-0.1033845841884613,
0.18947070837020874,
-0.05197722837328911,
0.05551624298095703,
0.12156101316213608,
0.06087696552276611,
-0.06552910804748535,
0.07936710119247437,
0.036825064569711685,
-0.040335942059755325,
0.04746859520673752,
0.10013407468795776,
-0.03197331726551056,
0.07261445373296738,
0.05393337458372116,
-0.12573927640914917,
0.016867447644472122,
-0.0939512848854065,
-0.04653635248541832,
-0.056750234216451645,
-0.011542480438947678,
-0.07443743944168091,
0.12872548401355743,
0.23667973279953003,
-0.03721931204199791,
-0.007397593930363655,
-0.05932502821087837,
0.02578439563512802,
0.06336025893688202,
0.041056301444768906,
-0.047882936894893646,
-0.22828209400177002,
0.009885349310934544,
0.07289337366819382,
-0.015281859785318375,
-0.26788604259490967,
-0.070579893887043,
0.0017346341628581285,
-0.07060904800891876,
-0.07644132524728775,
0.08083239942789078,
0.07705751806497574,
0.044927142560482025,
-0.06221795082092285,
-0.06259375810623169,
-0.06772700697183609,
0.1547669768333435,
-0.15244202315807343,
-0.0954475924372673
] |
null | null | transformers |
# ALBERT-base for QA
## Overview
**Language model:** albert-base </br>
**Language:** English </br>
**Downstream-task:** Extractive QA </br>
**Training data:** SQuAD 2.0 </br>
**Eval data:** SQuAD 2.0 </br>
**Code:** <TBD> </br>
## Env Information
`transformers` version: 4.9.1 </br>
Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>
Python version: 3.7.11 </br>
PyTorch version (GPU?): 1.9.0+cu102 (False)</br>
Tensorflow version (GPU?): 2.5.0 (False)</br>
## Hyperparameters
```
max_seq_len=386
doc_stride=128
n_best_size=20
max_answer_length=30
min_null_score=7.0
batch_size=32
n_epochs=3
base_LM_model = "albert-base-v2"
learning_rate=3e-5
adam_epsilon=1e-5
adam_beta1=0.95
adam_beta2=0.999
warmup_steps=300
weight_decay=0.01
optimizer=AdamW
lr_scheduler="polynomial"
```
## Performance
```
"exact": 78.253
"f1": 81.523
"total": 11873
"HasAns_exact": 73.616
"HasAns_f1": 80.165
"HasAns_total": 5928
"NoAns_exact": 82.876
"NoAns_f1": 82.876
"NoAns_total": 5945
```
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "PremalMatalia/albert-base-best-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Which name is also used to describe the Amazon rainforest in English?',
'context': 'The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain "Amazonas" in their names. The Amazon represents over half of the planet\'s remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species.'
}
res = nlp(QA_input)
print(res)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
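If the decoding hyperparameters listed above (`max_seq_len`, `doc_stride`, `max_answer_length`) should also apply at inference time, they can be passed to the pipeline call. This is a hedged sketch: the keyword names follow the `QuestionAnsweringPipeline` of the transformers 4.x series (older releases use `topk` rather than `top_k`), and the shortened context is for illustration only.

```python
from transformers import pipeline

nlp = pipeline('question-answering', model="PremalMatalia/albert-base-best-squad2")
res = nlp(
    question='Which name is also used to describe the Amazon rainforest in English?',
    context='The Amazon rainforest, also known in English as Amazonia or the Amazon Jungle, '
            'covers most of the Amazon basin of South America.',
    max_seq_len=386,              # matches the training-time maximum sequence length
    doc_stride=128,               # overlap between document chunks
    max_answer_len=30,            # matches max_answer_length above
    handle_impossible_answer=True # SQuAD 2.0 contains unanswerable questions
)
print(res)
```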
## Authors
Premal Matalia | {"datasets": ["squad_v2"]} | question-answering | PremalMatalia/albert-base-best-squad2 | [
"transformers",
"pytorch",
"albert",
"question-answering",
"dataset:squad_v2",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #albert #question-answering #dataset-squad_v2 #endpoints_compatible #region-us
|
# ALBERT-base for QA
## Overview
Language model: albert-base </br>
Language: English </br>
Downstream-task: Extractive QA </br>
Training data: SQuAD 2.0 </br>
Eval data: SQuAD 2.0 </br>
Code: <TBD> </br>
## Env Information
'transformers' version: 4.9.1 </br>
Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>
Python version: 3.7.11 </br>
PyTorch version (GPU?): 1.9.0+cu102 (False)</br>
Tensorflow version (GPU?): 2.5.0 (False)</br>
## Hyperparameters
## Performance
## Usage
### In Transformers
## Authors
Premal Matalia | [
"# ALBERT-base for QA",
"## Overview\nLanguage model: albert-base </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>",
"## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>",
"## Hyperparameters",
"## Performance",
"## Usage",
"### In Transformers",
"## Authors\nPremal Matalia"
] | [
"TAGS\n#transformers #pytorch #albert #question-answering #dataset-squad_v2 #endpoints_compatible #region-us \n",
"# ALBERT-base for QA",
"## Overview\nLanguage model: albert-base </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>",
"## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>",
"## Hyperparameters",
"## Performance",
"## Usage",
"### In Transformers",
"## Authors\nPremal Matalia"
] | [
39,
9,
69,
95,
5,
2,
3,
6,
7
] | [
"passage: TAGS\n#transformers #pytorch #albert #question-answering #dataset-squad_v2 #endpoints_compatible #region-us \n# ALBERT-base for QA## Overview\nLanguage model: albert-base </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>## Hyperparameters## Performance## Usage### In Transformers## Authors\nPremal Matalia"
] | [
-0.12088802456855774,
0.12320705503225327,
-0.004423728212714195,
0.047560226172208786,
0.12907204031944275,
0.05344369634985924,
0.07963826507329941,
0.09799083322286606,
-0.07285355031490326,
0.10474026203155518,
0.09223433583974838,
0.07196641713380814,
0.09329385310411453,
0.08922182023525238,
-0.04030032083392143,
-0.18203260004520416,
0.029242243617773056,
0.005760723724961281,
-0.08602353930473328,
0.0949593111872673,
0.09785448759794235,
-0.08711134642362595,
0.0730719342827797,
-0.020780541002750397,
-0.07405740767717361,
0.004487319849431515,
0.0034694490022957325,
-0.07563179731369019,
0.0989084392786026,
0.08254322409629822,
0.08167172968387604,
-0.011388001963496208,
0.049713391810655594,
-0.18189305067062378,
0.007947588339447975,
0.07079056650400162,
0.010884703136980534,
0.09644272178411484,
0.05945444107055664,
-0.010162419639527798,
0.09877891093492508,
-0.05504381284117699,
0.042487483471632004,
0.028337223455309868,
-0.0936969742178917,
-0.22061298787593842,
-0.08100634813308716,
0.06563232094049454,
0.07396619021892548,
0.05792982876300812,
-0.009836044162511826,
0.14523760974407196,
-0.060152459889650345,
0.11585838347673416,
0.18855354189872742,
-0.32012438774108887,
-0.07823488861322403,
-0.04870769381523132,
0.134761780500412,
0.09713304042816162,
-0.031772878021001816,
0.02271592617034912,
0.03407454118132591,
0.07385701686143875,
0.03734026476740837,
-0.0327291414141655,
-0.09926634281873703,
0.017795471474528313,
-0.12840047478675842,
-0.03252269700169563,
0.13870146870613098,
0.0007402123883366585,
-0.050340436398983,
0.010221706703305244,
-0.10904967784881592,
-0.09451080113649368,
-0.027488628402352333,
-0.01647866889834404,
0.01766323857009411,
-0.05958051607012749,
0.015052973292768002,
-0.026579227298498154,
-0.06216137856245041,
-0.1090545579791069,
-0.07617051154375076,
0.18780288100242615,
0.08759846538305283,
0.08067768812179565,
-0.009370673447847366,
0.07929880172014236,
-0.010584372095763683,
-0.1601341813802719,
-0.018863173201680183,
-0.034693725407123566,
-0.05604434758424759,
0.022353386506438255,
-0.01345133874565363,
-0.0038025411777198315,
0.07125283777713776,
0.15137700736522675,
-0.055369243025779724,
0.03746557608246803,
0.024941448122262955,
-0.008653165772557259,
-0.007389203179627657,
0.18131870031356812,
-0.15497173368930817,
-0.08146640658378601,
0.07079149782657623,
0.045570798218250275,
0.0279285479336977,
-0.02490938827395439,
-0.044837623834609985,
-0.024692485108971596,
0.11673478782176971,
0.060520850121974945,
0.017303144559264183,
0.01101232785731554,
-0.03516261652112007,
-0.06656059622764587,
0.05641518160700798,
-0.11883386224508286,
0.012917296029627323,
0.03060435876250267,
-0.06189412623643875,
0.06979115307331085,
0.039862774312496185,
-0.01782812550663948,
-0.10481429845094681,
0.05269285663962364,
-0.06679856032133102,
-0.06625694781541824,
-0.07753890007734299,
-0.1549237072467804,
0.022342927753925323,
-0.066416896879673,
0.014425678178668022,
-0.12518468499183655,
-0.13732728362083435,
0.021411007270216942,
0.06851102411746979,
-0.08764450997114182,
-0.056063901633024216,
0.04744787514209747,
-0.09240344166755676,
0.017084602266550064,
-0.041049063205718994,
0.14495956897735596,
-0.073738232254982,
0.08330769836902618,
0.11253278702497482,
0.048124149441719055,
-0.059937816113233566,
0.045328445732593536,
-0.017923850566148758,
0.01717313751578331,
-0.11141405999660492,
0.06119236350059509,
-0.14045646786689758,
-0.0031745871528983116,
-0.12012851238250732,
-0.07327909767627716,
0.08669557422399521,
-0.008701371029019356,
0.09483049064874649,
0.0859794020652771,
-0.13246792554855347,
-0.04282353073358536,
0.15114504098892212,
-0.06279831379652023,
-0.13919803500175476,
0.14329209923744202,
0.01892322488129139,
-0.08919989317655563,
0.08648928999900818,
0.05067367106676102,
0.060766540467739105,
-0.18850411474704742,
-0.0405043289065361,
0.06408417969942093,
0.1003865897655487,
-0.033497605472803116,
0.14091122150421143,
0.0369708426296711,
0.009295548312366009,
0.034250613301992416,
-0.08908335119485855,
0.034811634570360184,
-0.07237932085990906,
-0.07804042845964432,
-0.037969596683979034,
-0.07550351321697235,
-0.014427636750042439,
0.030389025807380676,
0.03373507410287857,
-0.03489154204726219,
-0.14251568913459778,
-0.05726976692676544,
0.0815320611000061,
-0.056799232959747314,
-0.005329611245542765,
-0.11393377929925919,
0.15389399230480194,
-0.0868666023015976,
0.029274102300405502,
-0.17863525450229645,
-0.03396883234381676,
0.06121620535850525,
-0.044503502547740936,
-0.027509262785315514,
0.031892985105514526,
0.03292242810130119,
0.07036643475294113,
-0.01541405450552702,
0.008192104287445545,
-0.09520208835601807,
-0.026905398815870285,
-0.09705513715744019,
-0.19556757807731628,
-0.008684361353516579,
-0.04483530670404434,
0.07369746267795563,
-0.0835549533367157,
-0.009483100846409798,
-0.006094804499298334,
0.07871425151824951,
-0.02206573449075222,
0.010723374783992767,
-0.04515891522169113,
0.04867442324757576,
-0.038731880486011505,
-0.03440176323056221,
0.01714428886771202,
0.008261961862444878,
-0.027453294023871422,
-0.011369681917130947,
-0.09695552289485931,
0.09812725335359573,
0.1130557730793953,
0.01761813461780548,
-0.03746887668967247,
0.10370426625013351,
-0.026744060218334198,
-0.040837362408638,
-0.02387993596494198,
-0.04956552013754845,
0.20916762948036194,
0.033815037459135056,
0.13287495076656342,
-0.07332964986562729,
-0.018248455598950386,
0.01764984428882599,
-0.01333057601004839,
0.05161317437887192,
0.14388388395309448,
0.04599061235785484,
-0.008924474939703941,
0.07575397193431854,
0.06887976825237274,
-0.15044216811656952,
0.05919014289975166,
-0.051416054368019104,
-0.10383903980255127,
-0.015181544236838818,
-0.01992945186793804,
0.028253791853785515,
0.18412531912326813,
-0.14189903438091278,
0.011214427649974823,
0.0574832446873188,
-0.00660528801381588,
0.047583941370248795,
-0.11307866871356964,
0.053275421261787415,
-0.015760455280542374,
-0.0738784596323967,
-0.046297743916511536,
-0.01687324047088623,
0.029905669391155243,
0.11683197319507599,
0.03882448002696037,
0.002230202779173851,
-0.04236932098865509,
-0.030536232516169548,
-0.07784948498010635,
0.24763011932373047,
-0.08744427561759949,
-0.14634348452091217,
-0.09725337475538254,
-0.043932151049375534,
-0.03863071650266647,
-0.016627337783575058,
0.013313285075128078,
-0.05845935270190239,
-0.09099748730659485,
-0.020852310582995415,
0.13169421255588531,
-0.01730353757739067,
-0.0073641641065478325,
-0.0003649864811450243,
-0.012321239337325096,
0.054576657712459564,
-0.12554089725017548,
0.03626334294676781,
-0.02722228690981865,
-0.07661933451890945,
0.011890423484146595,
0.03180103749036789,
0.11084636300802231,
0.09615027904510498,
-0.009484379552304745,
-0.007806801237165928,
0.006118339486420155,
0.2613220810890198,
-0.036779649555683136,
-0.024648655205965042,
0.1501363217830658,
-0.01669110544025898,
0.0077950311824679375,
0.10616818070411682,
0.021163135766983032,
-0.056268900632858276,
-0.036502037197351456,
0.01847299374639988,
-0.02298673428595066,
-0.28020524978637695,
-0.07449059188365936,
-0.05153408646583557,
0.03244677558541298,
0.032433509826660156,
0.03512509539723396,
-0.06068100407719612,
0.06098293140530586,
-0.020188404247164726,
0.1105932965874672,
-0.08070992678403854,
0.04987916722893715,
0.12095741927623749,
0.028556227684020996,
0.11967097222805023,
-0.03674284368753433,
-0.006047471426427364,
0.08227390795946121,
0.0936044380068779,
0.1807091236114502,
-0.0642978623509407,
0.13716791570186615,
0.03839651867747307,
0.16066548228263855,
0.02002532407641411,
0.09296149760484695,
-0.02149934321641922,
0.025584766641259193,
-0.009489556774497032,
-0.042428869754076004,
-0.10825060307979584,
-0.0072104125283658504,
-0.0029730908572673798,
0.04089045152068138,
-0.05893445760011673,
0.016855226829648018,
0.026947742328047752,
0.28922250866889954,
0.020343953743577003,
-0.22845610976219177,
-0.09667244553565979,
0.05922674760222435,
-0.026484379544854164,
-0.09342405945062637,
0.014645390212535858,
0.06229707598686218,
-0.08512162417173386,
0.01171424612402916,
-0.06625960022211075,
0.1316782683134079,
-0.06852804869413376,
-0.013129507191479206,
0.06583414226770401,
0.11039803922176361,
0.0597941018640995,
0.09050250798463821,
-0.2567419409751892,
0.22077520191669464,
0.036305613815784454,
0.11249589174985886,
-0.03948241099715233,
0.08346884697675705,
-0.004379263613373041,
0.04738006740808487,
0.08949527144432068,
-0.02337849885225296,
0.06940025836229324,
-0.10848800837993622,
-0.16451959311962128,
0.09688334912061691,
0.028589174151420593,
-0.006284572184085846,
0.10163863748311996,
-0.031236961483955383,
0.03250866383314133,
-0.012668145820498466,
-0.022046124562621117,
-0.13491696119308472,
-0.10571281611919403,
0.05306808277964592,
-0.08629029244184494,
-0.04981662333011627,
-0.05012824013829231,
-0.10136110335588455,
-0.06715381145477295,
0.14807969331741333,
-0.20487715303897858,
-0.07760823518037796,
-0.11313747614622116,
0.10002908855676651,
0.11529509723186493,
-0.11790351569652557,
-0.002616911893710494,
-0.08349482715129852,
0.07265152782201767,
0.028599834069609642,
-0.10396300256252289,
0.07819638401269913,
-0.10806529223918915,
-0.1542862057685852,
-0.03149142116308212,
0.07223937660455704,
0.03776438161730766,
0.058809030801057816,
-0.0009097559377551079,
0.02986803464591503,
-0.09711501002311707,
-0.12646807730197906,
-0.01471357699483633,
0.04251724109053612,
0.032782044261693954,
0.06618262082338333,
-0.08945570141077042,
-0.08439934998750687,
-0.028677094727754593,
0.02471296302974224,
0.1289072036743164,
0.15720166265964508,
-0.08725083619356155,
0.005978093016892672,
0.15237362682819366,
-0.06058608368039131,
-0.2361457794904709,
-0.042359042912721634,
0.10763116925954819,
0.06026727706193924,
-0.04154867306351662,
-0.18420423567295074,
0.12361905723810196,
0.09401912987232208,
-0.0375257171690464,
0.08478996157646179,
-0.1943414807319641,
-0.1033610850572586,
0.12732963263988495,
0.09909883141517639,
0.001295861671678722,
-0.2430572658777237,
-0.04987608268857002,
-0.04899907484650612,
-0.20433436334133148,
0.08339443057775497,
-0.11522769927978516,
0.1006644070148468,
-0.03888765349984169,
0.05328163132071495,
-0.014652298763394356,
-0.05305423587560654,
0.15975552797317505,
-0.04109741002321243,
-0.0425601601600647,
-0.012780135497450829,
0.04732641577720642,
0.10824057459831238,
-0.030289223417639732,
0.052033234387636185,
-0.06668862700462341,
0.09922569245100021,
-0.1630774736404419,
-0.0017728118691593409,
-0.06497132033109665,
0.039482735097408295,
-0.06973537057638168,
-0.028804173693060875,
-0.02008788473904133,
0.004482193384319544,
0.036905501037836075,
-0.0460621602833271,
0.0845821276307106,
0.006968288216739893,
0.10695861279964447,
0.11349562555551529,
0.08448698371648788,
-0.014733529649674892,
-0.08473312109708786,
0.022890927270054817,
-0.014896595850586891,
0.08769062906503677,
-0.14233429729938507,
0.03926271200180054,
0.15886346995830536,
0.005796590354293585,
0.05754560977220535,
0.026660747826099396,
-0.088035449385643,
0.031210966408252716,
0.03070145659148693,
-0.19268225133419037,
-0.12266401201486588,
0.03593982756137848,
-0.00850751157850027,
-0.13231943547725677,
0.04043280705809593,
0.1419716626405716,
-0.03389931470155716,
-0.036928143352270126,
-0.0006872119847685099,
0.020184766501188278,
-0.050151485949754715,
0.2309400588274002,
0.07329818606376648,
0.06472968310117722,
-0.1181127279996872,
0.06336838752031326,
0.04948342964053154,
-0.07324744015932083,
0.014483579434454441,
0.09676595777273178,
-0.07197444885969162,
-0.04229053109884262,
0.03653692826628685,
0.13523833453655243,
-0.033165041357278824,
-0.054697658866643906,
-0.10613162815570831,
-0.12499995529651642,
0.05505639687180519,
0.11632729321718216,
0.0553584098815918,
0.0031164062675088644,
-0.05773714557290077,
0.025707153603434563,
-0.07225539535284042,
0.14810985326766968,
0.04515498876571655,
0.04458451271057129,
-0.10046745091676712,
0.02999928779900074,
0.007067045662552118,
0.09678877890110016,
-0.05001848191022873,
0.045529793947935104,
-0.10479376465082169,
0.006262059789150953,
-0.2875460684299469,
0.04898344352841377,
0.005748729221522808,
-0.009163186885416508,
-0.04707697406411171,
-0.08097288757562637,
-0.07265495508909225,
0.0431678332388401,
-0.07087672501802444,
-0.031148558482527733,
-0.04346724972128868,
0.007449804339557886,
-0.14064525067806244,
-0.003631230676546693,
0.0427849255502224,
-0.07865201681852341,
0.1083720400929451,
0.07795927673578262,
0.018378013744950294,
0.07896579056978226,
-0.07303804159164429,
-0.0005253332201391459,
-0.00460504787042737,
0.07643932849168777,
0.0865369364619255,
-0.10881278663873672,
0.03928544744849205,
0.009151611477136612,
0.04917126148939133,
0.0016586438287049532,
0.12789319455623627,
-0.10714590549468994,
-0.02007654309272766,
-0.022889837622642517,
-0.05861362814903259,
-0.036194246262311935,
0.05323709920048714,
0.08772173523902893,
0.09274691343307495,
0.12317094206809998,
-0.0721554234623909,
0.04156414419412613,
-0.15659071505069733,
0.013082724064588547,
0.013233971782028675,
-0.05296832323074341,
-0.14185182750225067,
-0.0005296666640788317,
0.10214769840240479,
-0.05174611508846283,
0.1797855943441391,
0.011873286217451096,
0.13406473398208618,
0.03896503895521164,
-0.04652711749076843,
0.04919223487377167,
0.03969558700919151,
0.1553148478269577,
0.053765878081321716,
0.026415003463625908,
-0.0011698072776198387,
0.007368690799921751,
0.009376449510455132,
0.019357603043317795,
0.15203416347503662,
0.1546715646982193,
0.07215549051761627,
0.1059999018907547,
0.09460444748401642,
-0.08200671523809433,
-0.08681429177522659,
-0.03551173210144043,
-0.06401106715202332,
0.10008352249860764,
-0.013851447030901909,
0.12417082488536835,
0.07343374937772751,
-0.12593534588813782,
0.024634677916765213,
-0.07352887839078903,
-0.10590057820081711,
-0.14617227017879486,
0.08946947008371353,
-0.07763165980577469,
-0.08397839218378067,
-0.007439234759658575,
-0.16178487241268158,
-0.0034378315322101116,
0.10716124624013901,
0.03515951707959175,
-0.013385437428951263,
0.08593171089887619,
0.0712440088391304,
-0.0416535958647728,
0.06134498864412308,
0.04863874241709709,
0.051399730145931244,
0.048368312418460846,
0.0019229913596063852,
0.01128530316054821,
0.05783115699887276,
0.08920193463563919,
0.01608881913125515,
-0.09876143932342529,
0.0929209291934967,
-0.06247483566403389,
-0.07528463006019592,
0.005873832385987043,
0.03084813989698887,
0.0026051762979477644,
0.1688949018716812,
0.013905368745326996,
0.004219177644699812,
0.008278715424239635,
0.2091343253850937,
-0.07947336882352829,
-0.05062117055058479,
-0.17854243516921997,
0.15142059326171875,
-0.03331288322806358,
0.027971232309937477,
0.0538458451628685,
-0.04196779057383537,
-0.09106934815645218,
0.2275560349225998,
0.14793922007083893,
-0.10552533715963364,
-0.04519978538155556,
0.03347117826342583,
-0.0010429304093122482,
-0.025036482140421867,
0.1073789894580841,
0.1370815485715866,
0.21885168552398682,
-0.09491336345672607,
-0.05230886861681938,
-0.036457695066928864,
-0.01980164274573326,
-0.10397756099700928,
0.0440385527908802,
0.026472164317965508,
-0.013457375578582287,
-0.05526610463857651,
0.05588650703430176,
-0.016157569363713264,
-0.14533910155296326,
-0.03999585658311844,
-0.14721576869487762,
-0.20230717957019806,
-0.04442347586154938,
0.09649699181318283,
0.01845400780439377,
0.035608936101198196,
-0.003938908688724041,
0.017944708466529846,
0.17493201792240143,
-0.02802513912320137,
-0.06382986158132553,
-0.06762714684009552,
0.13824927806854248,
-0.028087619692087173,
0.1485537439584732,
-0.005930950399488211,
0.0638880506157875,
0.13544541597366333,
0.01983729749917984,
-0.101019948720932,
0.03920775279402733,
0.060483548790216446,
-0.15426209568977356,
0.06089991703629494,
0.10515042394399643,
-0.015597197227180004,
-0.0020727505907416344,
0.06631634384393692,
0.009412148036062717,
-0.028031446039676666,
-0.08919889479875565,
-0.05179767310619354,
-0.07536371052265167,
0.03479978069663048,
-0.10744572430849075,
0.10573017597198486,
0.205346018075943,
-0.03491142764687538,
-0.014769951812922955,
-0.10533741861581802,
0.03189359977841377,
0.02794777788221836,
0.11753363162279129,
-0.0011080129770562053,
-0.24540358781814575,
-0.0018837302923202515,
0.043461188673973083,
0.037443485110998154,
-0.23147673904895782,
-0.0014728560345247388,
-0.02656707540154457,
0.0015624454244971275,
-0.047910261899232864,
0.11029449105262756,
0.04017546400427818,
0.03903815895318985,
-0.054002661257982254,
-0.13074548542499542,
-0.060743238776922226,
0.09879643470048904,
-0.16287335753440857,
-0.10723985731601715
] |
null | null | transformers |
# ELECTRA-base for QA
## Overview
**Language model:** electra-base </br>
**Language:** English </br>
**Downstream-task:** Extractive QA </br>
**Training data:** SQuAD 2.0 </br>
**Eval data:** SQuAD 2.0 </br>
**Code:** <TBD> </br>
## Env Information
`transformers` version: 4.9.1 </br>
Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>
Python version: 3.7.11 </br>
PyTorch version (GPU?): 1.9.0+cu102 (False)</br>
Tensorflow version (GPU?): 2.5.0 (False)</br>
## Hyperparameters
```
max_seq_len=386
doc_stride=128
n_best_size=20
max_answer_length=30
min_null_score=7.0
batch_size=8
n_epochs=2
base_LM_model = "google/electra-base-discriminator"
learning_rate=1.5e-5
adam_epsilon=1e-5
adam_beta1=0.95
adam_beta2=0.999
warmup_steps=100
weight_decay=0.01
optimizer=AdamW
lr_scheduler="polynomial"
```
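The full training script is not yet published (Code: <TBD>). As a rough, hedged sketch, the hyperparameters above would map onto the Hugging Face `Trainer` configuration below; the output directory and the tokenization step are assumptions, not the authors' actual code.
```python
# Illustrative sketch only: the original training code is not published (Code: <TBD>).
# It maps the listed hyperparameters onto the Hugging Face Trainer API; output_dir and
# data preprocessing are assumptions.
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, TrainingArguments

model_name = "google/electra-base-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

training_args = TrainingArguments(
    output_dir="./electra-base-squad2",   # assumed output path
    num_train_epochs=2,                   # n_epochs=2
    per_device_train_batch_size=8,        # batch_size=8
    learning_rate=1.5e-5,
    adam_beta1=0.95,
    adam_beta2=0.999,
    adam_epsilon=1e-5,
    warmup_steps=100,
    weight_decay=0.01,
    lr_scheduler_type="polynomial",
)
# SQuAD 2.0 features would be built with max_length=386 and doc_stride=128 at tokenization time.
```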
##### A special threshold value CLS_threshold=-3 is used to more accurately identify no-answer cases. [Logic will be available in the GitHub repo: TBD]
## Performance
```
"exact": 79.331256
"f1": 83.232347\t
"total": 11873
"HasAns_exact": 76.501350
"HasAns_f1": 84.314719
"HasAns_total": 5928
"NoAns_exact": 82.153070
"NoAns_f1": 82.153070
"NoAns_total": 5945
```
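For reference, the same fields can be reproduced with the `squad_v2` metric from the `datasets` library. This is only a hedged illustration of the metric interface, not the evaluation script used for the numbers above; the example id, prediction and reference are placeholders.
```python
# Hedged illustration of the SQuAD 2.0 metric, not the authors' evaluation code.
# The example id, prediction text and reference span are placeholders.
from datasets import load_metric

metric = load_metric("squad_v2")

predictions = [
    {"id": "example-0", "prediction_text": "Amazonia", "no_answer_probability": 0.0}
]
references = [
    {"id": "example-0", "answers": {"text": ["Amazonia"], "answer_start": [201]}}
]

print(metric.compute(predictions=predictions, references=references))
# Returns the "exact" / "f1" style fields reported above.
```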
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "PremalMatalia/electra-base-best-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Which name is also used to describe the Amazon rainforest in English?',
'context': 'The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain "Amazonas" in their names. The Amazon represents over half of the planet\'s remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species.'
}
res = nlp(QA_input)
print(res)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Premal Matalia | {"datasets": ["squad_v2"]} | question-answering | PremalMatalia/electra-base-best-squad2 | [
"transformers",
"pytorch",
"electra",
"question-answering",
"dataset:squad_v2",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #electra #question-answering #dataset-squad_v2 #endpoints_compatible #region-us
|
# ELECTRA-base for QA
## Overview
Language model: electra-base </br>
Language: English </br>
Downstream-task: Extractive QA </br>
Training data: SQuAD 2.0 </br>
Eval data: SQuAD 2.0 </br>
Code: <TBD> </br>
## Env Information
'transformers' version: 4.9.1 </br>
Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>
Python version: 3.7.11 </br>
PyTorch version (GPU?): 1.9.0+cu102 (False)</br>
Tensorflow version (GPU?): 2.5.0 (False)</br>
## Hyperparameters
##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]
## Performance
## Usage
### In Transformers
## Authors
Premal Matalia | [
"# ELECTRA-base for QA",
"## Overview\nLanguage model: electra-base </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>",
"## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>",
"## Hyperparameters",
"##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]",
"## Performance",
"## Usage",
"### In Transformers",
"## Authors\nPremal Matalia"
] | [
"TAGS\n#transformers #pytorch #electra #question-answering #dataset-squad_v2 #endpoints_compatible #region-us \n",
"# ELECTRA-base for QA",
"## Overview\nLanguage model: electra-base </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>",
"## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>",
"## Hyperparameters",
"##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]",
"## Performance",
"## Usage",
"### In Transformers",
"## Authors\nPremal Matalia"
] | [
39,
9,
69,
95,
5,
43,
2,
3,
6,
7
] | [
"passage: TAGS\n#transformers #pytorch #electra #question-answering #dataset-squad_v2 #endpoints_compatible #region-us \n# ELECTRA-base for QA## Overview\nLanguage model: electra-base </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>## Hyperparameters##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]## Performance## Usage### In Transformers## Authors\nPremal Matalia"
] | [
-0.07675384730100632,
0.08471833169460297,
-0.004917416255921125,
0.03368711844086647,
0.137785404920578,
0.02013879455626011,
0.08184420317411423,
0.09049884229898453,
-0.03964062035083771,
0.14363938570022583,
0.08134817332029343,
0.11544782668352127,
0.07487007230520248,
0.11373087018728256,
-0.04066787287592888,
-0.16640792787075043,
0.036858562380075455,
-0.004942987579852343,
-0.05777604877948761,
0.10732650756835938,
0.10631260275840759,
-0.07858018577098846,
0.061284251511096954,
-0.023800022900104523,
-0.07769700884819031,
0.05081166699528694,
0.016461113467812538,
-0.07451927661895752,
0.11248188465833664,
0.05277122184634209,
0.07077839970588684,
-0.021359318867325783,
0.06297674775123596,
-0.20201192796230316,
0.00009769076859811321,
0.09790608286857605,
-0.009383056312799454,
0.06962451338768005,
0.029837481677532196,
-0.044443171471357346,
0.08798052370548248,
-0.07398801296949387,
0.07134348154067993,
0.022024711593985558,
-0.09615891426801682,
-0.20196282863616943,
-0.08346965909004211,
0.07515711337327957,
0.05335640907287598,
0.0701565369963646,
-0.011923030018806458,
0.18557202816009521,
-0.05089926719665527,
0.1080329418182373,
0.22377093136310577,
-0.285143107175827,
-0.05865262448787689,
-0.011005880311131477,
0.11585511267185211,
0.07217103242874146,
-0.03918660059571266,
0.0061989580281078815,
0.048315733671188354,
0.07416269928216934,
0.04504820331931114,
-0.01765293814241886,
-0.06907328963279724,
0.022362254559993744,
-0.1525033414363861,
-0.049948565661907196,
0.14490142464637756,
-0.013447852805256844,
-0.04554912447929382,
-0.006901699583977461,
-0.11717608571052551,
-0.05766243487596512,
0.005754395853728056,
-0.042541585862636566,
-0.00014766801905352622,
-0.03827052190899849,
-0.019941123202443123,
-0.001725356443785131,
-0.07082396745681763,
-0.10464171320199966,
-0.058092813938856125,
0.2020639330148697,
0.07307568192481995,
0.07311124354600906,
-0.011475425213575363,
0.06804263591766357,
-0.0635843351483345,
-0.1326565146446228,
-0.017085323110222816,
-0.007006136234849691,
-0.05738573521375656,
-0.004483221098780632,
-0.013203652575612068,
0.0057079666294157505,
0.06667190790176392,
0.2181268334388733,
-0.08858469128608704,
0.017170706763863564,
0.04069502651691437,
-0.007434237282723188,
0.02523951232433319,
0.08977799862623215,
-0.17167368531227112,
-0.01711517572402954,
0.0569961816072464,
0.015428872779011726,
0.03565819561481476,
-0.0438222736120224,
-0.02068394422531128,
-0.03896937891840935,
0.13084737956523895,
0.06997180730104446,
0.03623383492231369,
0.030124740675091743,
-0.02584662288427353,
-0.039492037147283554,
0.03916242718696594,
-0.14331716299057007,
0.0034512458369135857,
0.007967554964125156,
-0.039202068001031876,
0.01274005975574255,
0.02908102050423622,
-0.010700357146561146,
-0.1143769696354866,
0.07847025990486145,
-0.07805982232093811,
-0.04385931044816971,
-0.06468982249498367,
-0.14914944767951965,
0.02468668296933174,
-0.0559689924120903,
-0.00588394096121192,
-0.1084594801068306,
-0.14892086386680603,
-0.018975598737597466,
0.016990691423416138,
-0.07039850205183029,
-0.04014161601662636,
0.07166095077991486,
-0.08845004439353943,
0.02878445014357567,
-0.05017387494444847,
0.0578182190656662,
-0.06536255776882172,
0.09015113860368729,
0.12337926030158997,
0.06073576956987381,
-0.012725727632641792,
0.017287205904722214,
-0.059978075325489044,
0.0019897930324077606,
-0.18174278736114502,
0.045197680592536926,
-0.10970047861337662,
0.019201483577489853,
-0.13695493340492249,
-0.0814642533659935,
0.053282491862773895,
-0.009063677862286568,
0.10047944635152817,
0.06895913928747177,
-0.12766653299331665,
-0.03567756712436676,
0.1834104210138321,
-0.11145646125078201,
-0.10727038979530334,
0.1123182401061058,
0.018952516838908195,
-0.04559293016791344,
0.0627199336886406,
0.09614327549934387,
0.0323927141726017,
-0.177292138338089,
-0.07324038445949554,
0.013956214301288128,
0.019057393074035645,
0.03172237053513527,
0.10429893434047699,
0.008939106948673725,
0.06796550750732422,
0.004844270646572113,
-0.0606163926422596,
-0.021502790972590446,
-0.07270517200231552,
-0.06627271324396133,
-0.05104926601052284,
-0.03093492053449154,
-0.029788007959723473,
0.012539315037429333,
-0.0033374663908034563,
-0.04494454711675644,
-0.11592549830675125,
0.01894497498869896,
0.09085782617330551,
-0.0532061792910099,
0.017096178606152534,
-0.1136588528752327,
0.10665586590766907,
-0.06624297797679901,
0.017104916274547577,
-0.21565856039524078,
-0.013070883229374886,
0.04853808879852295,
-0.05081576481461525,
0.02902413159608841,
-0.01856403425335884,
0.024783264845609665,
0.04293651878833771,
0.00509747164323926,
0.020198838785290718,
-0.07202233374118805,
-0.013789638876914978,
-0.07003604620695114,
-0.2061142921447754,
-0.015928933396935463,
-0.0285346582531929,
0.03077174909412861,
-0.11791128665208817,
0.014033839106559753,
0.05341951549053192,
0.12226597219705582,
-0.04741581901907921,
-0.012183848768472672,
-0.04831486940383911,
0.04650657996535301,
-0.0562998466193676,
-0.04058082401752472,
-0.008656546473503113,
0.008794750086963177,
-0.03080616146326065,
-0.010931861586868763,
-0.15337827801704407,
0.07151629775762558,
0.11754189431667328,
-0.0033689248375594616,
-0.09485229849815369,
0.09181945025920868,
-0.02727486565709114,
-0.027574313804507256,
-0.037097539752721786,
-0.05555988848209381,
0.15915697813034058,
0.033572252839803696,
0.09260513633489609,
-0.0862564891576767,
-0.03134975582361221,
0.022495577111840248,
-0.039499759674072266,
0.04940887540578842,
0.10638397186994553,
0.07604911178350449,
-0.11479110270738602,
0.07919397205114365,
0.01999811828136444,
-0.12043307721614838,
0.05345948413014412,
-0.03409053012728691,
-0.08148617297410965,
-0.03735105320811272,
0.004000125918537378,
0.033717334270477295,
0.12826338410377502,
-0.0896594375371933,
0.030672311782836914,
0.03575706109404564,
0.008997713215649128,
0.037837617099285126,
-0.12337929755449295,
0.0526614747941494,
-0.002831512363627553,
-0.06385007500648499,
-0.05977204069495201,
-0.015554274432361126,
0.026938481256365776,
0.10569439828395844,
0.028577987104654312,
0.02943241596221924,
-0.036042287945747375,
-0.02584335207939148,
-0.1180415153503418,
0.2450709342956543,
-0.09553173929452896,
-0.14414435625076294,
-0.09600105881690979,
0.005728160962462425,
-0.02400665543973446,
-0.007717760279774666,
0.03652370348572731,
-0.05394972860813141,
-0.10265196114778519,
-0.025132833048701286,
0.0885673388838768,
-0.034939397126436234,
-0.018976155668497086,
0.0011143117444589734,
0.0016433601267635822,
0.06434432417154312,
-0.14893974363803864,
0.02197735197842121,
-0.028574861586093903,
-0.0939459279179573,
0.02519790083169937,
0.054084889590740204,
0.09744326770305634,
0.0999094769358635,
-0.024212896823883057,
-0.013912977650761604,
0.008753369562327862,
0.25335216522216797,
-0.0544726587831974,
-0.022669164463877678,
0.19105368852615356,
0.01596885174512863,
0.04580074921250343,
0.13278429210186005,
0.024170322343707085,
-0.06762296706438065,
-0.0009314247872680426,
0.0384368859231472,
-0.04391200467944145,
-0.2759617567062378,
-0.06453818827867508,
-0.043572913855314255,
-0.00960405170917511,
0.05417478084564209,
0.04945492371916771,
-0.013328035362064838,
0.06566013395786285,
-0.06407758593559265,
0.06004492938518524,
-0.06421318650245667,
0.08317717909812927,
0.16190391778945923,
0.03642266243696213,
0.12051260471343994,
-0.02842377871274948,
-0.010037626139819622,
0.06698819994926453,
0.07939893752336502,
0.1500689536333084,
-0.06899701058864594,
0.14145049452781677,
0.07220163941383362,
0.14432299137115479,
0.001426807837560773,
0.08115490525960922,
-0.016615716740489006,
0.03450954332947731,
0.008844176307320595,
-0.06541404873132706,
-0.07523718476295471,
0.003985715564340353,
-0.014585561119019985,
0.02035742998123169,
-0.026689467951655388,
0.015172723680734634,
0.05436346307396889,
0.21929678320884705,
0.0026393085718154907,
-0.24686850607395172,
-0.0987100899219513,
0.02968234196305275,
-0.025709453970193863,
-0.08168448507785797,
0.006650471594184637,
0.06309940665960312,
-0.10086849331855774,
0.04563256725668907,
-0.01986525021493435,
0.10818809270858765,
-0.04986109212040901,
0.0030572821851819754,
0.03777177631855011,
0.11659952998161316,
0.0289240013808012,
0.09146467596292496,
-0.21895012259483337,
0.1773020625114441,
0.03271660581231117,
0.10681626945734024,
-0.03110472671687603,
0.08913996070623398,
-0.03959784284234047,
0.03182649612426758,
0.07881790399551392,
-0.012062025256454945,
0.021194156259298325,
-0.11518856137990952,
-0.10208208113908768,
0.051621224731206894,
0.07404312491416931,
-0.02056334726512432,
0.11392144113779068,
-0.033981725573539734,
0.04087699204683304,
0.001918033347465098,
-0.060111112892627716,
-0.14562836289405823,
-0.10606361925601959,
0.06562945991754532,
-0.06043212115764618,
-0.03741302341222763,
-0.05433274805545807,
-0.07755640149116516,
-0.018696576356887817,
0.1706443578004837,
-0.19853001832962036,
-0.08156038820743561,
-0.12194636464118958,
0.09244377166032791,
0.14061328768730164,
-0.10628369450569153,
0.0023528540041297674,
-0.052078988403081894,
0.0787000060081482,
0.016928579658269882,
-0.08882997930049896,
0.10040821135044098,
-0.09567835181951523,
-0.1614675372838974,
-0.02901015616953373,
0.08271060883998871,
0.020500438287854195,
0.03019814006984234,
0.006111629772931337,
0.037150729447603226,
-0.07608193904161453,
-0.13895739614963531,
0.002323043067008257,
0.013091836124658585,
0.06434369087219238,
0.08758420497179031,
-0.06567472964525223,
-0.07030128687620163,
-0.04516878351569176,
-0.010506970807909966,
0.10639439523220062,
0.18523874878883362,
-0.08246186375617981,
0.0077504366636276245,
0.17647959291934967,
-0.05658433213829994,
-0.22755156457424164,
-0.0411117821931839,
0.08412408083677292,
0.016073737293481827,
-0.03893161192536354,
-0.1694185882806778,
0.1255119889974594,
0.06836304813623428,
-0.026931028813123703,
0.04988677799701691,
-0.2142912894487381,
-0.13964596390724182,
0.0953584760427475,
0.055743854492902756,
-0.026184869930148125,
-0.2255968451499939,
-0.05589418113231659,
-0.05677960440516472,
-0.16372543573379517,
0.10024953633546829,
-0.10598216205835342,
0.09295262396335602,
-0.007985654287040234,
0.08850062638521194,
-0.01348482072353363,
-0.05585635453462601,
0.14117500185966492,
-0.011464681476354599,
-0.008495727553963661,
-0.03409455716609955,
0.023758070543408394,
0.11516605317592621,
-0.01872531697154045,
0.06654726713895798,
-0.01158470194786787,
0.09252610802650452,
-0.12938790023326874,
-0.01336522027850151,
-0.06795188039541245,
0.07837454974651337,
-0.06151780113577843,
-0.05412745475769043,
-0.06066654250025749,
-0.003513573668897152,
0.04035945609211922,
-0.06380778551101685,
0.04378027841448784,
-0.011396443471312523,
0.15328457951545715,
0.1464318335056305,
0.07206224650144577,
-0.01626761630177498,
-0.12121392786502838,
0.043177783489227295,
-0.013851404190063477,
0.09585734456777573,
-0.12667600810527802,
0.047085508704185486,
0.136754110455513,
0.0598565936088562,
0.05815717205405235,
0.04368934780359268,
-0.11476346850395203,
-0.00779670150950551,
0.04689706116914749,
-0.18714942038059235,
-0.07267335057258606,
0.06573975831270218,
0.01622616872191429,
-0.11782218515872955,
0.06402816623449326,
0.14926180243492126,
-0.025806589052081108,
-0.054631464183330536,
0.006886382587254047,
0.04785820469260216,
-0.043537601828575134,
0.22033558785915375,
0.07553192973136902,
0.07495258003473282,
-0.11527715623378754,
0.05730597674846649,
0.06979285925626755,
-0.07931684702634811,
0.051506999880075455,
0.09230466187000275,
-0.07913682609796524,
-0.02667229436337948,
0.0142058115452528,
0.05426632612943649,
-0.11883619427680969,
-0.05473615229129791,
-0.09001260250806808,
-0.10737340152263641,
0.050572846084833145,
0.18734721839427948,
0.035939544439315796,
0.004676608368754387,
-0.020655447617173195,
0.02268209494650364,
-0.08654490113258362,
0.10614325106143951,
0.02276020124554634,
0.040595848113298416,
-0.12794974446296692,
0.10002346336841583,
0.009321163408458233,
0.06438255310058594,
-0.03400449454784393,
0.03996523842215538,
-0.12130504101514816,
0.005706633906811476,
-0.15095330774784088,
0.03671193867921829,
-0.018680330365896225,
-0.00793851725757122,
-0.0010581625392660499,
-0.07272141426801682,
-0.061665747314691544,
0.05367150902748108,
-0.059274911880493164,
-0.050597742199897766,
-0.03892131149768829,
0.011915739625692368,
-0.19399350881576538,
0.00923331268131733,
0.06488676369190216,
-0.07868793606758118,
0.09759603440761566,
0.11342377215623856,
0.016050046309828758,
0.05997924134135246,
-0.08035098761320114,
-0.005460603628307581,
0.030506828799843788,
0.07487660646438599,
0.046521201729774475,
-0.09576155990362167,
0.02864677645266056,
0.006827207747846842,
0.0390566922724247,
0.014196866191923618,
0.0908428207039833,
-0.1283990740776062,
-0.0010058271000161767,
-0.0274435393512249,
-0.05155019834637642,
-0.05781145393848419,
0.043054040521383286,
0.10239669680595398,
0.0796620324254036,
0.10696220397949219,
-0.08395162224769592,
0.018971780315041542,
-0.17853286862373352,
0.01150165218859911,
-0.0007808629889041185,
-0.0251802708953619,
-0.07437494397163391,
0.04052712023258209,
0.10331010818481445,
-0.0410284623503685,
0.19037562608718872,
-0.006946666166186333,
0.12179175764322281,
0.045345451682806015,
-0.03141283243894577,
0.004865831229835749,
0.03745686262845993,
0.1341579258441925,
0.045180756598711014,
0.01208268478512764,
-0.022084463387727737,
-0.009118936955928802,
0.009083887562155724,
-0.03224096819758415,
0.13274118304252625,
0.1201859563589096,
0.08255840092897415,
0.05664386227726936,
0.0964973121881485,
-0.10268620401620865,
-0.07211940735578537,
-0.027951935306191444,
-0.05288848280906677,
0.05867329612374306,
-0.007745076436549425,
0.07933424413204193,
0.15663103759288788,
-0.09853522479534149,
0.03873057663440704,
-0.08039543032646179,
-0.11196789145469666,
-0.15671025216579437,
0.05620218813419342,
-0.06798669695854187,
-0.0924493595957756,
-0.0015898460987955332,
-0.1622055470943451,
0.033085037022829056,
0.09870227426290512,
0.02987680770456791,
0.015915730968117714,
0.09019731730222702,
0.0728040412068367,
-0.037829119712114334,
-0.007468029856681824,
0.036924391984939575,
0.01953790709376335,
0.08241641521453857,
-0.002373752184212208,
0.037705615162849426,
0.06342091411352158,
0.11379890888929367,
0.031864434480667114,
-0.05102403461933136,
0.08761746436357498,
-0.062839075922966,
-0.053765833377838135,
-0.002847280353307724,
0.02740672044456005,
-0.012249822728335857,
0.14759916067123413,
0.024117641150951385,
-0.01327691599726677,
-0.009131710976362228,
0.24788638949394226,
-0.08726055175065994,
-0.09695622324943542,
-0.153906911611557,
0.20906688272953033,
0.005667115096002817,
0.04697674885392189,
0.021369576454162598,
-0.09238512068986893,
-0.07087414711713791,
0.14417611062526703,
0.12672090530395508,
-0.04920356720685959,
-0.013335954397916794,
0.016340672969818115,
-0.0056578475050628185,
-0.024561507627367973,
0.04564701393246651,
0.1351311057806015,
0.25963205099105835,
-0.08480080217123032,
0.005234889220446348,
-0.021021416410803795,
0.004063161090016365,
-0.12429080903530121,
0.012890655547380447,
0.005937288980931044,
0.01849227212369442,
-0.05127612128853798,
0.07758104801177979,
0.0019876547157764435,
-0.19664928317070007,
-0.04817874729633331,
-0.12152038514614105,
-0.13755394518375397,
-0.025501804426312447,
0.07431545853614807,
-0.021939270198345184,
0.0433262437582016,
-0.0018011569045484066,
-0.0003748993622139096,
0.07583841681480408,
0.008075849153101444,
-0.07779401540756226,
-0.051079556345939636,
0.1360211968421936,
0.006339434999972582,
0.12780070304870605,
-0.032309725880622864,
0.1177452802658081,
0.14079229533672333,
-0.002647717949002981,
-0.09497984498739243,
0.054288022220134735,
0.05087382346391678,
-0.13134342432022095,
0.04032508283853531,
0.11478134244680405,
-0.0064190588891506195,
0.0010855626314878464,
0.060348790138959885,
-0.06718370318412781,
-0.0263579860329628,
-0.03763562813401222,
-0.002799814334139228,
-0.10191582888364792,
0.01447771955281496,
-0.11051540076732635,
0.11671308428049088,
0.16030094027519226,
-0.052135251462459564,
0.0083536421880126,
-0.09581249207258224,
0.015042013488709927,
0.008674598298966885,
0.0808105319738388,
-0.005809529684484005,
-0.24857217073440552,
0.04721435531973839,
0.02299610711634159,
0.06006569415330887,
-0.22313539683818817,
-0.024714060127735138,
0.04624880850315094,
-0.003058183006942272,
-0.05128013342618942,
0.10633603483438492,
0.03224199637770653,
0.060724060982465744,
-0.04482937231659889,
-0.15383665263652802,
-0.04510221630334854,
0.1403970718383789,
-0.15364451706409454,
-0.11284846812486649
] |
null | null | transformers |
# RoBERTa-base for QA
## Overview
**Language model:** 'roberta-base' </br>
**Language:** English </br>
**Downstream-task:** Extractive QA </br>
**Training data:** SQuAD 2.0 </br>
**Eval data:** SQuAD 2.0 </br>
**Code:** <TBD> </br>
## Env Information
`transformers` version: 4.9.1 </br>
Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>
Python version: 3.7.11 </br>
PyTorch version (GPU?): 1.9.0+cu102 (False)</br>
Tensorflow version (GPU?): 2.5.0 (False)</br>
## Hyperparameters
```
max_seq_len=386
doc_stride=128
n_best_size=20
max_answer_length=30
min_null_score=7.0
batch_size=8
n_epochs=6
base_LM_model = "roberta-base"
learning_rate=1.5e-5
adam_epsilon=1e-5
adam_beta1=0.95
adam_beta2=0.999
warmup_steps=100
weight_decay=0.01
optimizer=AdamW
lr_scheduler="polynomial"
```
##### A special threshold value CLS_threshold=-3 is used to more accurately identify no-answer cases. [Logic will be available in the GitHub repo: TBD]
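The released thresholding code is still TBD, so the sketch below is only an assumption of how such a threshold is typically applied in SQuAD 2.0 post-processing: compare the null ([CLS]) score against the best span score and return an empty answer when the difference clears the threshold.
```python
# Assumed sketch of the no-answer decision (the released logic is still TBD).
# Variable names and the exact comparison are illustrative, not the published code.
CLS_THRESHOLD = -3.0

def select_answer(best_span_score: float, null_score: float, best_span_text: str) -> str:
    """Return the predicted span, or an empty string to signal 'no answer'."""
    if null_score - best_span_score > CLS_THRESHOLD:
        return ""          # question judged unanswerable
    return best_span_text
```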
## Performance
```
"exact": 81.192622
"f1": 83.95408
"total": 11873
"HasAns_exact": 74.190283
"HasAns_f1": 79.721119
"HasAns_total": 5928
"NoAns_exact": 88.174937
"NoAns_f1": 88.174937
"NoAns_total": 5945
```
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "PremalMatalia/roberta-base-best-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Which name is also used to describe the Amazon rainforest in English?',
'context': 'The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain "Amazonas" in their names. The Amazon represents over half of the planet\'s remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species.'
}
res = nlp(QA_input)
print(res)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
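Because the model is trained on SQuAD 2.0, which contains unanswerable questions, the snippet above can optionally ask the pipeline to return empty answers. This uses a standard `question-answering` pipeline flag and is a usage hint rather than part of the original example:
```python
# Optional continuation of the snippet above: let the pipeline return an empty
# answer when the question is unanswerable in the given context.
res = nlp(QA_input, handle_impossible_answer=True)
print(res)
```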
## Authors
Premal Matalia | {"datasets": ["squad_v2"]} | question-answering | PremalMatalia/roberta-base-best-squad2 | [
"transformers",
"pytorch",
"roberta",
"question-answering",
"dataset:squad_v2",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #roberta #question-answering #dataset-squad_v2 #endpoints_compatible #region-us
|
# RoBERTa-base for QA
## Overview
Language model: 'roberta-base' </br>
Language: English </br>
Downstream-task: Extractive QA </br>
Training data: SQuAD 2.0 </br>
Eval data: SQuAD 2.0 </br>
Code: <TBD> </br>
## Env Information
'transformers' version: 4.9.1 </br>
Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>
Python version: 3.7.11 </br>
PyTorch version (GPU?): 1.9.0+cu102 (False)</br>
Tensorflow version (GPU?): 2.5.0 (False)</br>
## Hyperparameters
##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]
## Performance
## Usage
### In Transformers
## Authors
Premal Matalia | [
"# RoBERTa-base for QA",
"## Overview\nLanguage model: 'roberta-base' </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>",
"## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>",
"## Hyperparameters",
"##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]",
"## Performance",
"## Usage",
"### In Transformers",
"## Authors\nPremal Matalia"
] | [
"TAGS\n#transformers #pytorch #roberta #question-answering #dataset-squad_v2 #endpoints_compatible #region-us \n",
"# RoBERTa-base for QA",
"## Overview\nLanguage model: 'roberta-base' </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>",
"## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>",
"## Hyperparameters",
"##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]",
"## Performance",
"## Usage",
"### In Transformers",
"## Authors\nPremal Matalia"
] | [
39,
9,
71,
95,
5,
43,
2,
3,
6,
7
] | [
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #dataset-squad_v2 #endpoints_compatible #region-us \n# RoBERTa-base for QA## Overview\nLanguage model: 'roberta-base' </br>\nLanguage: English </br>\nDownstream-task: Extractive QA </br>\nTraining data: SQuAD 2.0 </br>\nEval data: SQuAD 2.0 </br>\nCode: <TBD> </br>## Env Information\n'transformers' version: 4.9.1 </br>\nPlatform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic </br>\nPython version: 3.7.11 </br>\nPyTorch version (GPU?): 1.9.0+cu102 (False)</br>\nTensorflow version (GPU?): 2.5.0 (False)</br>## Hyperparameters##### There is a special threshold value CLS_threshold=-3 used to more accurately identify no answers [Logic will be available in GitHub Repo [TBD]## Performance## Usage### In Transformers## Authors\nPremal Matalia"
] | [
-0.0818173810839653,
0.0927204117178917,
-0.0048794616013765335,
0.03822052478790283,
0.13921362161636353,
0.021977929398417473,
0.07681652903556824,
0.0970633327960968,
-0.0588899664580822,
0.13562360405921936,
0.08875064551830292,
0.11131785064935684,
0.07189217954874039,
0.09237556904554367,
-0.03700891509652138,
-0.1684766262769699,
0.04009279981255531,
-0.01987900212407112,
-0.06948747485876083,
0.11351101100444794,
0.10287779569625854,
-0.06166400387883186,
0.06093519181013107,
-0.01806844212114811,
-0.09010342508554459,
0.042324502021074295,
0.025555869564414024,
-0.07093115895986557,
0.1027357280254364,
0.049631595611572266,
0.0679662823677063,
-0.02114749699831009,
0.05325160548090935,
-0.18641626834869385,
-0.00007079464558046311,
0.10321065783500671,
-0.0019671900663524866,
0.07278238236904144,
0.02674158662557602,
-0.05644993111491203,
0.08693467825651169,
-0.06423544138669968,
0.051848024129867554,
0.030121132731437683,
-0.09108047187328339,
-0.19742310047149658,
-0.09449058026075363,
0.07662637531757355,
0.03756781294941902,
0.06722110509872437,
-0.010894889011979103,
0.181767076253891,
-0.069495789706707,
0.10862668603658676,
0.2191234976053238,
-0.2918354272842407,
-0.0682142898440361,
-0.0005517815588973463,
0.10380978137254715,
0.06263430416584015,
-0.03888922557234764,
0.011343054473400116,
0.04399174451828003,
0.062447451055049896,
0.022472316399216652,
-0.019918400794267654,
-0.08203905075788498,
0.023339932784438133,
-0.13849161565303802,
-0.05369339510798454,
0.13226613402366638,
-0.005411301273852587,
-0.05836348608136177,
0.0007818606682121754,
-0.1102655902504921,
-0.049968525767326355,
-0.0007542182574979961,
-0.034825537353754044,
0.0013881081249564886,
-0.037314604967832565,
-0.056065309792757034,
-0.013064014725387096,
-0.06558379530906677,
-0.10818234086036682,
-0.045662857592105865,
0.1925266683101654,
0.07439432293176651,
0.0762704387307167,
-0.0090672317892313,
0.07604136317968369,
-0.06668165326118469,
-0.1289997696876526,
-0.028782423585653305,
-0.01727788895368576,
-0.05361936241388321,
0.006785193923860788,
-0.013393295928835869,
0.013949797488749027,
0.07394669204950333,
0.21957997977733612,
-0.07222145050764084,
0.02253822050988674,
0.036569662392139435,
-0.010984267108142376,
0.0032846792601048946,
0.11104915291070938,
-0.18299497663974762,
-0.03512747958302498,
0.07119683176279068,
0.014228376559913158,
0.02841767854988575,
-0.03469114750623703,
-0.023684455081820488,
-0.04791122302412987,
0.1402614861726761,
0.0696537047624588,
0.03659556806087494,
0.03905263915657997,
-0.019409112632274628,
-0.034035444259643555,
0.03954794630408287,
-0.1339249163866043,
-0.005336955655366182,
0.0014743979554623365,
-0.04568133130669594,
0.023532837629318237,
0.043958935886621475,
-0.021729007363319397,
-0.11635280400514603,
0.07257598638534546,
-0.07921168953180313,
-0.042870406061410904,
-0.06422626227140427,
-0.14362038671970367,
0.014184541068971157,
-0.06603839993476868,
-0.005553161259740591,
-0.10916270315647125,
-0.1449868232011795,
-0.019017396494746208,
0.02415168471634388,
-0.07088662683963776,
-0.03487076982855797,
0.06126020848751068,
-0.10317656397819519,
0.025290649384260178,
-0.045190103352069855,
0.08613034337759018,
-0.06310061365365982,
0.10000856220722198,
0.12009890377521515,
0.056773778051137924,
-0.012867887504398823,
0.020788991823792458,
-0.06297583132982254,
-0.00017667628708295524,
-0.17888209223747253,
0.06280501931905746,
-0.12021571397781372,
0.027252446860074997,
-0.13290925323963165,
-0.08211851119995117,
0.054858893156051636,
-0.008973430842161179,
0.09810927510261536,
0.07878787815570831,
-0.12050152570009232,
-0.03509155288338661,
0.20735611021518707,
-0.09241499751806259,
-0.0943363681435585,
0.11190081387758255,
0.017640598118305206,
-0.03510655090212822,
0.0701204165816307,
0.10347236692905426,
0.06755904853343964,
-0.17649652063846588,
-0.0648985505104065,
0.029653532430529594,
0.01548310462385416,
0.016134312376379967,
0.0947021096944809,
0.007209067698568106,
0.058537352830171585,
0.0242904219776392,
-0.07176750898361206,
-0.01467912644147873,
-0.06506139039993286,
-0.06161678582429886,
-0.05551094189286232,
-0.04148632287979126,
-0.04184716194868088,
0.014211631380021572,
0.014182631857693195,
-0.052332017570734024,
-0.1310369223356247,
-0.002219682326540351,
0.09588152170181274,
-0.04536942392587662,
0.0166954193264246,
-0.10848639160394669,
0.11672031879425049,
-0.06623158603906631,
0.0189592856913805,
-0.21621564030647278,
-0.005107161123305559,
0.039502616971731186,
-0.02931910566985607,
0.036079682409763336,
-0.012339945882558823,
0.03569037839770317,
0.030729204416275024,
0.006231416016817093,
0.027267182245850563,
-0.05254160985350609,
-0.02319532446563244,
-0.05702423304319382,
-0.19192096590995789,
-0.013289785012602806,
-0.027103640139102936,
0.056439708918333054,
-0.0973147377371788,
0.007100417744368315,
0.05674927681684494,
0.12089842557907104,
-0.046840038150548935,
-0.026455890387296677,
-0.04085681214928627,
0.035437315702438354,
-0.04887313023209572,
-0.048843733966350555,
-0.010254234075546265,
0.01046954095363617,
-0.021802840754389763,
-0.005356626585125923,
-0.12567758560180664,
0.05916411429643631,
0.1210796982049942,
0.02067638747394085,
-0.10176737606525421,
0.10840371996164322,
-0.030470384284853935,
-0.02412935346364975,
-0.020918644964694977,
-0.05149849131703377,
0.15708106756210327,
0.03617063909769058,
0.09678395837545395,
-0.07987271994352341,
-0.03045997954905033,
0.021702466532588005,
-0.04139614850282669,
0.049890562891960144,
0.1125897765159607,
0.04014415293931961,
-0.11071176081895828,
0.08338310569524765,
0.0008398541831411421,
-0.1147807165980339,
0.05168070271611214,
-0.03695718199014664,
-0.08403042703866959,
-0.03176276758313179,
0.018722858279943466,
0.03891634941101074,
0.12799103558063507,
-0.09249631315469742,
0.025391295552253723,
0.035104405134916306,
0.009111979976296425,
0.039720285683870316,
-0.12235067784786224,
0.04790284484624863,
-0.01340730581432581,
-0.0734613835811615,
-0.05969083681702614,
-0.009458694607019424,
0.028455734252929688,
0.10330858081579208,
0.0389845035970211,
0.033096086233854294,
-0.03334660828113556,
-0.02629842609167099,
-0.10104072093963623,
0.2380082756280899,
-0.08943740278482437,
-0.1329527497291565,
-0.1032104641199112,
-0.02357478253543377,
-0.022282814607024193,
-0.016544636338949203,
0.029880579560995102,
-0.0704386830329895,
-0.09906486421823502,
-0.012758822180330753,
0.08436939120292664,
-0.04064617305994034,
-0.02311079017817974,
-0.015510490164160728,
0.008849544450640678,
0.07225072383880615,
-0.145132914185524,
0.025306006893515587,
-0.035560492426157,
-0.11210032552480698,
0.026417741551995277,
0.04285455122590065,
0.09650681167840958,
0.0806771069765091,
-0.0197228342294693,
-0.011727862060070038,
0.007664114702492952,
0.24817496538162231,
-0.050109732896089554,
-0.028076132759451866,
0.17418169975280762,
0.021967042237520218,
0.046365611255168915,
0.11651298403739929,
0.015085048973560333,
-0.06586949527263641,
-0.0098221804946661,
0.03716607764363289,
-0.04598718509078026,
-0.2697233259677887,
-0.0514804944396019,
-0.04351653903722763,
-0.013983812183141708,
0.05065103992819786,
0.045425355434417725,
-0.022235065698623657,
0.07030937820672989,
-0.06212903931736946,
0.07252377271652222,
-0.06037142500281334,
0.07491662353277206,
0.1407681107521057,
0.03273621201515198,
0.11879072338342667,
-0.0357864648103714,
-0.008525943383574486,
0.07304058223962784,
0.07980554550886154,
0.16169047355651855,
-0.08799585700035095,
0.1384902149438858,
0.06320197135210037,
0.16925668716430664,
-0.00025153884780593216,
0.07870566099882126,
-0.018196891993284225,
0.04068874195218086,
0.007992792874574661,
-0.06222206726670265,
-0.09003345668315887,
-0.0030018382240086794,
-0.009533340111374855,
0.016105668619275093,
-0.019811149686574936,
0.023433363065123558,
0.047486934810876846,
0.23538362979888916,
0.017264369875192642,
-0.2380889654159546,
-0.10875051468610764,
0.02993217669427395,
-0.02533995546400547,
-0.08087319880723953,
0.006995494477450848,
0.05912170931696892,
-0.10449662804603577,
0.036293789744377136,
-0.02398507483303547,
0.10953183472156525,
-0.04461372643709183,
-0.0024729017168283463,
0.034159574657678604,
0.12942159175872803,
0.038556862622499466,
0.09264775365591049,
-0.21682274341583252,
0.17851541936397552,
0.029663484543561935,
0.11415118724107742,
-0.03114616498351097,
0.0899236649274826,
-0.02399471588432789,
0.003905430668964982,
0.09300562739372253,
-0.010322553105652332,
0.050953079015016556,
-0.09409327059984207,
-0.12194905430078506,
0.05491889640688896,
0.07976152002811432,
-0.011138121597468853,
0.11139769852161407,
-0.043897390365600586,
0.03026452288031578,
-0.0019410212989896536,
-0.07078118622303009,
-0.127387136220932,
-0.10759299993515015,
0.05834955722093582,
-0.07645492255687714,
-0.04616275802254677,
-0.05705359950661659,
-0.07041056454181671,
-0.01666751503944397,
0.19801916182041168,
-0.17378486692905426,
-0.0816313773393631,
-0.11731738597154617,
0.09891457110643387,
0.1410095989704132,
-0.10937681794166565,
-0.007886828854680061,
-0.0720166340470314,
0.06140019744634628,
0.030408639460802078,
-0.09783171117305756,
0.08902639150619507,
-0.08192721754312515,
-0.14490453898906708,
-0.026487432420253754,
0.08547910302877426,
0.010099540464580059,
0.02881317213177681,
-0.0017308933893218637,
0.035742513835430145,
-0.07809016108512878,
-0.13707906007766724,
-0.013570394366979599,
0.003351598745211959,
0.04729694500565529,
0.09434233605861664,
-0.06750306487083435,
-0.07156987488269806,
-0.04999591410160065,
0.007834876887500286,
0.10310213267803192,
0.17454376816749573,
-0.08923710882663727,
0.01471653487533331,
0.16056984663009644,
-0.03699349984526634,
-0.24164743721485138,
-0.03742150962352753,
0.08469514548778534,
0.019696181640028954,
-0.060524631291627884,
-0.1558428704738617,
0.11761914938688278,
0.07275952398777008,
-0.026221396401524544,
0.06431438773870468,
-0.18575435876846313,
-0.13559795916080475,
0.10458836704492569,
0.06056463345885277,
0.006164821796119213,
-0.22467415034770966,
-0.05953392758965492,
-0.05190564692020416,
-0.1678641438484192,
0.08473853766918182,
-0.10522114485502243,
0.10311631113290787,
-0.0118443313986063,
0.10047924518585205,
-0.013168218545615673,
-0.06382899731397629,
0.12546749413013458,
-0.01652214489877224,
-0.017018770799040794,
-0.036506135016679764,
0.0254213847219944,
0.13118243217468262,
-0.022777564823627472,
0.060534436255693436,
-0.006471918895840645,
0.10344589501619339,
-0.12306603789329529,
-0.005177802871912718,
-0.07756944745779037,
0.06770247966051102,
-0.06450692564249039,
-0.04682648926973343,
-0.0586852952837944,
0.0032339536119252443,
0.026962142437696457,
-0.06175808608531952,
0.034335315227508545,
-0.012217815965414047,
0.1401374489068985,
0.13719260692596436,
0.058689184486866,
-0.004642452113330364,
-0.1032179594039917,
0.04194474220275879,
-0.01591438800096512,
0.08839850127696991,
-0.14203497767448425,
0.037596940994262695,
0.12247935682535172,
0.06429150700569153,
0.05135120451450348,
0.03461620584130287,
-0.12075432389974594,
0.0012394270161166787,
0.04556654766201973,
-0.18727698922157288,
-0.08113232254981995,
0.0639386773109436,
-0.033159490674734116,
-0.1190858781337738,
0.05411330983042717,
0.15062329173088074,
-0.026448670774698257,
-0.05266765132546425,
0.003975065890699625,
0.04333215206861496,
-0.05388695374131203,
0.21789099276065826,
0.08392558246850967,
0.07070696353912354,
-0.12101063877344131,
0.05849014222621918,
0.059518154710531235,
-0.051786355674266815,
0.04766129329800606,
0.0812930166721344,
-0.08785417675971985,
-0.02452988177537918,
-0.00348901329562068,
0.04815922677516937,
-0.09157812595367432,
-0.03926505148410797,
-0.08152651786804199,
-0.10013236850500107,
0.046113740652799606,
0.16338790953159332,
0.042067091912031174,
0.016803033649921417,
-0.023327229544520378,
0.0205882228910923,
-0.09339724481105804,
0.09718050062656403,
0.035511620342731476,
0.038702622056007385,
-0.12365184724330902,
0.08350908011198044,
0.0014519782271236181,
0.06570281088352203,
-0.032541099935770035,
0.03423934429883957,
-0.12926550209522247,
0.0013515816535800695,
-0.18883387744426727,
0.034997906535863876,
-0.014055220410227776,
-0.0014689259696751833,
-0.00448935991153121,
-0.0798417329788208,
-0.05798262357711792,
0.05570317804813385,
-0.05917107313871384,
-0.05488254129886627,
-0.04104731231927872,
-0.0001025731980917044,
-0.1934470534324646,
0.017637982964515686,
0.06720166653394699,
-0.08046964555978775,
0.09489230811595917,
0.1363363415002823,
0.016216807067394257,
0.0689985454082489,
-0.081087626516819,
-0.025239447131752968,
0.019897976890206337,
0.07156692445278168,
0.0593111552298069,
-0.08019418269395828,
0.03590870648622513,
0.0045770881697535515,
0.06197965517640114,
0.015859853476285934,
0.10650595277547836,
-0.1239306852221489,
0.0016208417946472764,
-0.025184663012623787,
-0.04129044711589813,
-0.06091674417257309,
0.05108412727713585,
0.11044878512620926,
0.0893092155456543,
0.10695862770080566,
-0.08488260954618454,
0.018273556604981422,
-0.1783696860074997,
0.007688272278755903,
-0.005029173567891121,
-0.03769254311919212,
-0.08103789389133453,
0.043191540986299515,
0.10127156227827072,
-0.038000114262104034,
0.17358340322971344,
-0.009976152330636978,
0.11664973199367523,
0.04286038875579834,
-0.03502603620290756,
0.014232618734240532,
0.03176732733845711,
0.11932609975337982,
0.059492867439985275,
0.011666009202599525,
-0.01305813156068325,
-0.011998974718153477,
0.005304983351379633,
-0.05565170571208,
0.12489239871501923,
0.13580076396465302,
0.06501182913780212,
0.06105152145028114,
0.09470406919717789,
-0.08573312312364578,
-0.05730713903903961,
-0.02310926653444767,
-0.06418118625879288,
0.05725674331188202,
-0.014494899660348892,
0.07870255410671234,
0.1444789469242096,
-0.10316137969493866,
0.039623621851205826,
-0.08002382516860962,
-0.11283937841653824,
-0.156117245554924,
0.05585417151451111,
-0.07114061713218689,
-0.08415590971708298,
0.008171919733285904,
-0.14815384149551392,
0.033229976892471313,
0.09562480449676514,
0.03615841642022133,
0.016856160014867783,
0.0934905931353569,
0.059528253972530365,
-0.043903786689043045,
0.010438261553645134,
0.031110167503356934,
0.008724816143512726,
0.0966644361615181,
0.00043845910113304853,
0.03773203864693642,
0.069059818983078,
0.12246985733509064,
0.03308352828025818,
-0.045806411653757095,
0.08782431483268738,
-0.07539420574903488,
-0.05927662551403046,
-0.005811202805489302,
0.04865056648850441,
-0.010015925392508507,
0.14775103330612183,
0.02563825063407421,
-0.013370667584240437,
-0.00894423108547926,
0.25987136363983154,
-0.09184300899505615,
-0.10233163833618164,
-0.16150611639022827,
0.21995234489440918,
-0.010193081572651863,
0.03325209394097328,
0.021163925528526306,
-0.08434584736824036,
-0.05558500066399574,
0.18299993872642517,
0.15334604680538177,
-0.056446194648742676,
-0.017316142097115517,
0.014625678770244122,
-0.007601883262395859,
-0.023902054876089096,
0.05955452844500542,
0.13768456876277924,
0.24704165756702423,
-0.07862246036529541,
-0.007730577606707811,
0.00007482037472072989,
-0.008892664685845375,
-0.11922343075275421,
0.023466400802135468,
-0.006909739691764116,
0.010823764838278294,
-0.058560777455568314,
0.07058243453502655,
-0.009414124302566051,
-0.19788989424705505,
-0.03446192666888237,
-0.11810097843408585,
-0.15760537981987,
-0.023062076419591904,
0.06448681652545929,
-0.015819678083062172,
0.054329149425029755,
-0.00010456545714987442,
0.0040215118788182735,
0.08727536350488663,
0.0014514672802761197,
-0.08439099788665771,
-0.054505717009305954,
0.13005971908569336,
0.024679698050022125,
0.12873350083827972,
-0.029004260897636414,
0.09178992360830307,
0.14428871870040894,
0.00299518508836627,
-0.10550789535045624,
0.04668563976883888,
0.049346696585416794,
-0.15255466103553772,
0.0342600978910923,
0.11289907991886139,
-0.0166659913957119,
-0.011382331140339375,
0.056476008147001266,
-0.05416998267173767,
-0.02972375974059105,
-0.039384111762046814,
0.0019232080085203052,
-0.11203973740339279,
0.0070448885671794415,
-0.11147512495517731,
0.1189604252576828,
0.18149258196353912,
-0.05117795988917351,
0.01608230359852314,
-0.10810964554548264,
0.010892324149608612,
0.008768810890614986,
0.06786693632602692,
-0.0096191531047225,
-0.23976145684719086,
0.023658664897084236,
0.03399932384490967,
0.04832994565367699,
-0.2443999946117401,
-0.013674989342689514,
0.04331757500767708,
0.004814959596842527,
-0.04208076000213623,
0.12503929436206818,
0.025514202192425728,
0.07306589931249619,
-0.03313549980521202,
-0.11919765174388885,
-0.04701976850628853,
0.13800030946731567,
-0.15628431737422943,
-0.11025596410036087
] |
null | null | null | https://github.com/Prim9000/Thai_TTS | {} | null | Prim9000/try | [
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#region-us
| URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] | [
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null | transformers |
# BART-Squad2
## Model description
BART for extractive (span-based) question answering, trained on Squad 2.0.
F1 score of 87.4.
## Intended uses & limitations
Unfortunately, the Hugging Face auto-inference API won't run this model, so if you try it through the input box above and it complains, don't be discouraged!
#### How to use
Here's a quick way to get question answering running locally:
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
tokenizer = AutoTokenizer.from_pretrained("Primer/bart-squad2")
model = AutoModelForQuestionAnswering.from_pretrained("Primer/bart-squad2")
model.to('cuda'); model.eval()
def answer(question, text):
    # Build the BART QA input: <s> question </s> </s> context </s>
    seq = '<s>' + question + ' </s> </s> ' + text + ' </s>'
    tokens = tokenizer.encode_plus(seq, return_tensors='pt', padding='max_length', max_length=1024)
    input_ids = tokens['input_ids'].to('cuda')
    attention_mask = tokens['attention_mask'].to('cuda')
    # Assumes a transformers version where the model call returns a tuple of
    # (start_logits, end_logits, ...); on newer releases use the output object's
    # .start_logits / .end_logits instead.
    start, end, _ = model(input_ids, attention_mask=attention_mask)
    start_idx = int(start.argmax().int())
    end_idx = int(end.argmax().int())
    print(tokenizer.decode(input_ids[0, start_idx:end_idx]).strip())
    # ^^ it will be an empty string if the model decided "unanswerable"
>>> question = "Where does Tom live?"
>>> context = "Tom is an engineer in San Francisco."
>>> answer(question, context)
San Francisco
```
(Just drop the `.to('cuda')` stuff if running on CPU).
#### Limitations and bias
Unknown; no further evaluation has been performed. One practical limitation is sheer size: the checkpoint is about 1.6 GB 😬
## Training procedure
`run_squad.py` with:
|param|value|
|---|---|
|batch size|8|
|max_seq_length|1024|
|learning rate|1e-5|
|epochs|2|
Modified to freeze shared parameters and encoder embeddings.
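
As a rough illustration of that modification, the snippet below freezes the shared embedding table and the encoder embeddings before fine-tuning. It is a hedged sketch rather than the actual training script: the module names assume the standard BART layout in `transformers`, and `facebook/bart-large` stands in for whichever base checkpoint was actually used.

```python
from transformers import AutoModelForQuestionAnswering

# Sketch only: freeze the shared token embeddings and the encoder embeddings,
# so that fine-tuning updates the rest of the network. Module paths assume the
# standard BartForQuestionAnswering layout in transformers.
model = AutoModelForQuestionAnswering.from_pretrained("facebook/bart-large")

modules_to_freeze = [
    model.model.shared,                   # token embeddings shared by encoder and decoder
    model.model.encoder.embed_tokens,     # encoder input embeddings
    model.model.encoder.embed_positions,  # encoder positional embeddings
]
for module in modules_to_freeze:
    for param in module.parameters():
        param.requires_grad = False

# Only parameters that still require gradients are updated by the optimizer.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable:,}")
```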
| {"language": "en"} | question-answering | primer-ai/bart-squad2 | [
"transformers",
"pytorch",
"bart",
"question-answering",
"en",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #bart #question-answering #en #endpoints_compatible #region-us
| BART-Squad2
===========
Model description
-----------------
BART for extractive (span-based) question answering, trained on Squad 2.0.
F1 score of 87.4.
Intended uses & limitations
---------------------------
Unfortunately, the Huggingface auto-inference API won't run this model, so if you're attempting to try it through the input box above and it complains, don't be discouraged!
#### How to use
Here's a quick way to get question answering running locally:
(Just drop the '.to('cuda')' stuff if running on CPU).
#### Limitations and bias
Unknown, no further evaluation has been performed. In a technical sense one big limitation is that it's 1.6G
Training procedure
------------------
'run\_squad.py' with:
Modified to freeze shared parameters and encoder embeddings.
| [
"#### How to use\n\n\nHere's a quick way to get question answering running locally:\n\n\n(Just drop the '.to('cuda')' stuff if running on CPU).",
"#### Limitations and bias\n\n\nUnknown, no further evaluation has been performed. In a technical sense one big limitation is that it's 1.6G\n\n\nTraining procedure\n------------------\n\n\n'run\\_squad.py' with:\n\n\n\nModified to freeze shared parameters and encoder embeddings."
] | [
"TAGS\n#transformers #pytorch #bart #question-answering #en #endpoints_compatible #region-us \n",
"#### How to use\n\n\nHere's a quick way to get question answering running locally:\n\n\n(Just drop the '.to('cuda')' stuff if running on CPU).",
"#### Limitations and bias\n\n\nUnknown, no further evaluation has been performed. In a technical sense one big limitation is that it's 1.6G\n\n\nTraining procedure\n------------------\n\n\n'run\\_squad.py' with:\n\n\n\nModified to freeze shared parameters and encoder embeddings."
] | [
31,
40,
65
] | [
"passage: TAGS\n#transformers #pytorch #bart #question-answering #en #endpoints_compatible #region-us \n#### How to use\n\n\nHere's a quick way to get question answering running locally:\n\n\n(Just drop the '.to('cuda')' stuff if running on CPU).#### Limitations and bias\n\n\nUnknown, no further evaluation has been performed. In a technical sense one big limitation is that it's 1.6G\n\n\nTraining procedure\n------------------\n\n\n'run\\_squad.py' with:\n\n\n\nModified to freeze shared parameters and encoder embeddings."
] | [
-0.11500798910856247,
-0.021485159173607826,
-0.00000867232938617235,
0.05784141644835472,
0.15567155182361603,
0.05451212823390961,
0.050994277000427246,
0.13618217408657074,
0.14049752056598663,
0.11914633214473724,
0.10946166515350342,
0.10758081823587418,
0.036494962871074677,
0.0954536572098732,
-0.0815303698182106,
-0.06515146791934967,
0.07530002295970917,
-0.027006737887859344,
0.03624869883060455,
0.08946331590414047,
0.023783927783370018,
-0.04925261065363884,
0.03487260639667511,
-0.07163111120462418,
-0.09376219660043716,
-0.02478327788412571,
-0.01824580878019333,
-0.010500449687242508,
0.1300584226846695,
0.002081619342789054,
0.08298369497060776,
-0.006071427837014198,
-0.09617317467927933,
-0.18669883906841278,
0.06266410648822784,
0.02902608923614025,
-0.025708947330713272,
0.02720062807202339,
0.015586108900606632,
0.027286924421787262,
0.0033951792865991592,
0.03115490823984146,
-0.022589175030589104,
0.05313213914632797,
-0.042955346405506134,
-0.014322374947369099,
-0.003242948092520237,
-0.09740227460861206,
0.09419850260019302,
0.09004654735326767,
0.009540839120745659,
0.18898510932922363,
-0.18041794002056122,
0.060527507215738297,
0.08435600250959396,
-0.24976523220539093,
-0.006706944666802883,
0.11531718075275421,
0.11622589826583862,
0.04710642620921135,
-0.06112352013587952,
-0.04243810102343559,
0.004552038852125406,
0.05648797005414963,
-0.19062760472297668,
-0.04545745253562927,
-0.05157608166337013,
-0.014227433130145073,
-0.09656988084316254,
-0.10870148241519928,
0.13947737216949463,
-0.024899525567889214,
-0.006959201768040657,
-0.041563406586647034,
-0.12290668487548828,
-0.0396491177380085,
0.009145532734692097,
0.003233076771721244,
-0.0892745777964592,
0.0708145871758461,
-0.03261645510792732,
-0.04039901867508888,
-0.08878309279680252,
-0.201254740357399,
-0.09785649925470352,
0.23749923706054688,
0.08008383959531784,
0.07292482256889343,
-0.12627334892749786,
0.135255366563797,
0.09285863488912582,
-0.012784008868038654,
-0.030164159834384918,
-0.07320523262023926,
-0.10255062580108643,
0.025559797883033752,
-0.08713109791278839,
0.04746518284082413,
0.142597034573555,
0.1723097413778305,
0.010509509593248367,
0.04181630164384842,
0.06959854811429977,
0.048528775572776794,
-0.016171183437108994,
0.05825473368167877,
-0.08183103054761887,
0.0010094419121742249,
0.039062030613422394,
0.029408935457468033,
-0.03392934426665306,
-0.06249357759952545,
-0.0958842784166336,
-0.11047787964344025,
0.09587524831295013,
0.14623920619487762,
0.11803720891475677,
0.05899535492062569,
-0.03624569997191429,
-0.05591568350791931,
0.018022911623120308,
-0.06219462677836418,
-0.03335323929786682,
0.05980207398533821,
-0.04173636436462402,
-0.03665115311741829,
-0.06756249815225601,
-0.0912351906299591,
-0.13910461962223053,
-0.007475827354937792,
-0.08315256983041763,
-0.016364583745598793,
-0.10224449634552002,
-0.04488138481974602,
0.042645230889320374,
-0.18679487705230713,
-0.019859524443745613,
-0.11587867885828018,
-0.060803890228271484,
0.0041426788084208965,
-0.03194824233651161,
-0.01797052100300789,
0.03688093274831772,
-0.04500982537865639,
-0.06540415436029434,
-0.015650447458028793,
-0.014115347526967525,
0.020566508173942566,
-0.0833105817437172,
0.1067926436662674,
0.00646450649946928,
0.08492711931467056,
-0.10866022855043411,
0.015905823558568954,
-0.10819533467292786,
0.01307686697691679,
0.030204912647604942,
0.05352969095110893,
-0.06636340916156769,
0.06261289119720459,
-0.005820373073220253,
-0.04532171040773392,
-0.04848340526223183,
-0.035964254289865494,
0.08263082057237625,
0.1660425215959549,
-0.16025453805923462,
0.039040785282850266,
0.1280515491962433,
-0.0848613828420639,
-0.1409561187028885,
0.1378220021724701,
0.028047727420926094,
-0.028130045160651207,
0.028180981054902077,
0.17716999351978302,
0.05047500506043434,
-0.11716075986623764,
-0.015031958930194378,
0.05592849850654602,
-0.20911167562007904,
-0.12260650843381882,
-0.0010108400601893663,
0.015819625928997993,
-0.05836837738752365,
0.04712153598666191,
-0.03560004383325577,
0.07376039028167725,
-0.0798870325088501,
-0.018237898126244545,
-0.027872353792190552,
-0.050609249621629715,
-0.0639766976237297,
0.02367301657795906,
-0.012809802778065205,
-0.021880680695176125,
0.027705127373337746,
-0.1054648607969284,
0.09513109922409058,
0.045927781611680984,
0.033376239240169525,
-0.146103173494339,
0.22068914771080017,
-0.09413203597068787,
0.05658893659710884,
-0.19012971222400665,
-0.09743791818618774,
0.06155012547969818,
0.0712563768029213,
-0.00899328850209713,
0.0710856094956398,
0.031970296055078506,
-0.05175362154841423,
0.07733649015426636,
-0.010967816226184368,
-0.0026224739849567413,
-0.038187358528375626,
-0.08168882876634598,
-0.014031454920768738,
-0.005538128316402435,
-0.045672208070755005,
0.028048286214470863,
0.08076364547014236,
0.004697070922702551,
0.10428670048713684,
-0.007097069639712572,
-0.05488387867808342,
-0.001631610793992877,
0.008408657275140285,
0.03202483430504799,
-0.014764310792088509,
-0.014317890629172325,
0.0930076539516449,
0.006526143755763769,
-0.09446782618761063,
0.0004945282707922161,
-0.1080477386713028,
-0.02294686809182167,
0.09532257169485092,
-0.021754298359155655,
0.03257150575518608,
0.008400270715355873,
-0.052268341183662415,
-0.03883348032832146,
0.02056020312011242,
-0.0116528095677495,
0.13689671456813812,
0.09429468214511871,
0.05478665605187416,
-0.038525622338056564,
0.06836988776922226,
-0.01459921058267355,
0.02026807889342308,
0.02817860245704651,
0.04537999629974365,
0.16420923173427582,
-0.06250891834497452,
0.0533616840839386,
0.015028965659439564,
-0.036905355751514435,
0.08791946619749069,
0.012211409397423267,
-0.10914744436740875,
0.006074873264878988,
0.108834408223629,
0.013644778169691563,
0.07609564810991287,
-0.10531017929315567,
0.05652632191777229,
0.04170599952340126,
0.04483930394053459,
0.10125551372766495,
-0.09818390756845474,
-0.025905989110469818,
-0.09267671406269073,
-0.08112252503633499,
-0.034380748867988586,
0.08384048938751221,
0.04343806579709053,
0.055433668196201324,
0.06415319442749023,
-0.08658846467733383,
0.06866300851106644,
-0.05789666250348091,
0.008041664958000183,
0.15186026692390442,
-0.08028317987918854,
-0.23920023441314697,
-0.10719311237335205,
0.02183372527360916,
-0.05815006047487259,
0.026858502998948097,
0.07824982702732086,
-0.06193016469478607,
-0.002910714130848646,
0.01875757798552513,
-0.09943052381277084,
0.015877149999141693,
-0.030795225873589516,
-0.12006206810474396,
0.014992053620517254,
0.015303465537726879,
-0.1207008883357048,
0.012996293604373932,
-0.06869542598724365,
-0.10457339882850647,
0.14297930896282196,
-0.034498583525419235,
0.07887081056833267,
0.08529243618249893,
0.05957501381635666,
0.008297271095216274,
-0.002811219310387969,
0.1811632215976715,
-0.041992057114839554,
-0.02937733195722103,
0.09649845957756042,
0.011657074093818665,
0.05930238589644432,
0.11054521799087524,
-0.04461783170700073,
-0.13146241009235382,
0.06604015082120895,
0.011243022046983242,
-0.07477323710918427,
-0.20034575462341309,
-0.08902844786643982,
-0.0963672399520874,
0.06371991336345673,
0.008475033566355705,
0.09446679800748825,
-0.0514896921813488,
0.07296255230903625,
-0.0013744739117100835,
-0.09021120518445969,
-0.10059230029582977,
0.07682130485773087,
0.10336726903915405,
-0.0032552056945860386,
0.04812261089682579,
-0.06400694698095322,
-0.04024171456694603,
0.1060803011059761,
0.11982707679271698,
0.1916469782590866,
-0.0023945034481585026,
0.1698615550994873,
0.044610388576984406,
0.1617707461118698,
0.06714766472578049,
0.10368268191814423,
-0.000043801537685794756,
0.004296264611184597,
-0.022119645029306412,
0.047706108540296555,
-0.14838947355747223,
0.009979558177292347,
0.10447800159454346,
-0.10854458063840866,
-0.031996142119169235,
0.10040726512670517,
0.11264196783304214,
0.21207045018672943,
-0.01683611050248146,
-0.11848929524421692,
-0.051697518676519394,
-0.04046844318509102,
-0.10140863060951233,
-0.0729353204369545,
0.0654650330543518,
0.20754511654376984,
-0.07515265792608261,
-0.1462564468383789,
-0.02119383215904236,
0.12451358884572983,
-0.0005990979261696339,
0.015356779098510742,
-0.07909682393074036,
0.011266770772635937,
0.033459171652793884,
0.09305258840322495,
-0.17938143014907837,
0.07134445756673813,
0.005338247399777174,
0.04390644282102585,
-0.0651114359498024,
-0.05421477556228638,
0.020356586202979088,
0.037605904042720795,
0.040763117372989655,
-0.013540009036660194,
0.07335375994443893,
-0.09558355063199997,
-0.15179017186164856,
0.09769778698682785,
0.08327305316925049,
0.044360850006341934,
0.005225997883826494,
-0.06232108920812607,
0.06988268345594406,
-0.02981642261147499,
-0.04787345230579376,
0.09975099563598633,
-0.053887270390987396,
0.05477476865053177,
0.041787419468164444,
0.02572237141430378,
-0.04540520906448364,
-0.0024609193205833435,
0.09323375672101974,
0.09163020551204681,
-0.14328017830848694,
-0.012647653929889202,
-0.08033231645822525,
-0.032490458339452744,
0.09274447709321976,
-0.0832296758890152,
0.015652421861886978,
-0.0758822038769722,
-0.03983232006430626,
0.024827152490615845,
-0.04712051898241043,
0.08113047480583191,
-0.06923523545265198,
-0.053149882704019547,
-0.012660851702094078,
0.07023005932569504,
-0.013135282322764397,
0.016231374815106392,
0.0364992655813694,
-0.06703746318817139,
-0.14482633769512177,
-0.17092904448509216,
-0.03992689773440361,
-0.010853872634470463,
0.002571140881627798,
0.06553132086992264,
-0.056041743606328964,
0.058299969881772995,
-0.02463584765791893,
0.0698988139629364,
0.2132606953382492,
0.2384410798549652,
-0.05486062914133072,
0.01728183962404728,
0.19509920477867126,
0.04792693257331848,
-0.270496129989624,
-0.09252671152353287,
-0.03832123801112175,
-0.01807505078613758,
-0.08692001551389694,
-0.06763241440057755,
0.04635174572467804,
0.061010926961898804,
-0.022517433390021324,
0.23519673943519592,
-0.15948757529258728,
-0.03508242592215538,
0.07077862322330475,
0.06479188799858093,
0.2893385887145996,
-0.1790332943201065,
-0.05164656043052673,
0.055134937167167664,
-0.1097145527601242,
0.06936849653720856,
-0.0062975757755339146,
0.1688603013753891,
-0.0578058585524559,
0.09907928854227066,
0.018027950078248978,
-0.11999738961458206,
0.1139305979013443,
0.017220111563801765,
0.04079892113804817,
-0.06866167485713959,
-0.08447486907243729,
-0.012817459180951118,
-0.03226336091756821,
0.06535734981298447,
-0.07408630102872849,
0.06264255940914154,
-0.12071467936038971,
-0.02170141227543354,
-0.08869344741106033,
0.0720553994178772,
0.03502859175205231,
-0.018490353599190712,
-0.03708377853035927,
0.02061363123357296,
-0.03137115016579628,
0.050183191895484924,
-0.0008399622747674584,
-0.05841364338994026,
0.0234163086861372,
0.10317455232143402,
0.048800449818372726,
-0.16907289624214172,
0.003623438300564885,
0.022159328684210777,
0.01656241901218891,
0.12793521583080292,
0.001647531520575285,
0.0521482415497303,
0.13406851887702942,
-0.0058442819863557816,
0.043127935379743576,
0.07906022667884827,
-0.006491885054856539,
0.010272789746522903,
0.0804612785577774,
-0.1500052511692047,
-0.12006190419197083,
0.06374106556177139,
0.011256673373281956,
-0.026455745100975037,
0.039645273238420486,
-0.0030459710396826267,
0.12742562592029572,
-0.016044223681092262,
0.010039175860583782,
0.02458181418478489,
-0.0023158604744821787,
0.08166292309761047,
0.0554034560918808,
0.049925148487091064,
-0.07822363823652267,
0.08098141849040985,
-0.02547404170036316,
-0.1674821674823761,
0.005014773458242416,
0.07121668010950089,
-0.16840513050556183,
-0.09715565294027328,
-0.08899907022714615,
0.037194181233644485,
0.01999213732779026,
-0.04060767590999603,
-0.10828426480293274,
0.0333673469722271,
0.02908097766339779,
0.04242889583110809,
0.06841268390417099,
0.06800973415374756,
-0.028445282950997353,
0.01599355973303318,
-0.049842964857816696,
0.007881582714617252,
-0.0733146145939827,
0.0008260610047727823,
-0.03953486680984497,
0.018465552479028702,
0.014507480897009373,
0.11831415444612503,
-0.060838937759399414,
-0.0667838603258133,
-0.10811325162649155,
0.07435844838619232,
-0.19123898446559906,
0.033614758402109146,
-0.07390374690294266,
-0.01160221267491579,
0.0747034102678299,
-0.07091847062110901,
-0.04983911290764809,
0.009032187983393669,
-0.09067370742559433,
-0.03186049684882164,
-0.028114723041653633,
-0.016148097813129425,
-0.08299863338470459,
-0.030618181452155113,
0.06690831482410431,
-0.05546269938349724,
0.13029876351356506,
0.16875170171260834,
-0.06646686047315598,
-0.008180388249456882,
-0.054267268627882004,
-0.10355114936828613,
-0.005309390369802713,
0.0832882896065712,
0.04775730147957802,
0.014931444078683853,
0.04360847920179367,
0.04893144220113754,
0.019097421318292618,
0.006596181076020002,
0.1254977434873581,
-0.13747155666351318,
0.0450347438454628,
-0.018641116097569466,
-0.10321569442749023,
-0.06110169366002083,
-0.10134164243936539,
0.12349164485931396,
0.15618321299552917,
0.10491291433572769,
-0.0008465366554446518,
0.0860632061958313,
0.02457457408308983,
0.004617075901478529,
0.0004047604452352971,
-0.008927986025810242,
0.06266740709543228,
-0.0460909940302372,
0.039370402693748474,
-0.02619921788573265,
0.10310565680265427,
-0.13134326040744781,
0.044309813529253006,
-0.0036749558057636023,
0.038621898740530014,
0.022050311788916588,
-0.06379006803035736,
0.0906713455915451,
0.08314421772956848,
0.014769161120057106,
0.03809452801942825,
-0.013659053482115269,
0.01804080605506897,
-0.0412510484457016,
0.12582111358642578,
0.050269223749637604,
0.04593895375728607,
0.11718565970659256,
0.02336634136736393,
-0.039628542959690094,
-0.027027945965528488,
-0.062108658254146576,
-0.09970942884683609,
-0.05394190549850464,
-0.03436445817351341,
0.1376369744539261,
0.10536068677902222,
0.001186721259728074,
-0.007752329111099243,
-0.03854363411664963,
-0.05445312336087227,
-0.08669473230838776,
0.029295535758137703,
-0.04143887385725975,
-0.12008499354124069,
0.031564708799123764,
-0.03241119533777237,
-0.07866077870130539,
0.08048909157514572,
0.044716041535139084,
-0.07967782765626907,
0.24037407338619232,
-0.01377621665596962,
-0.01870458759367466,
-0.017573170363903046,
-0.03285279870033264,
-0.01430445071309805,
0.05087272822856903,
-0.06425917148590088,
-0.020339936017990112,
-0.0366496667265892,
0.09309090673923492,
0.03288524970412254,
-0.023718561977148056,
0.043622035533189774,
-0.09501668065786362,
-0.0067613995634019375,
-0.057432521134614944,
0.07428395748138428,
-0.0789208933711052,
0.2605215311050415,
0.04574970155954361,
0.0256717000156641,
0.04209422320127487,
0.1412583738565445,
-0.03747957944869995,
-0.09332642704248428,
-0.09064658731222153,
0.1119861751794815,
0.024328099563717842,
0.005885899532586336,
-0.04285832494497299,
-0.05581815540790558,
-0.0717604011297226,
0.24187828600406647,
0.09305954724550247,
-0.16537000238895416,
0.01131096389144659,
0.07846665382385254,
0.028386088088154793,
0.00921189971268177,
0.0884627178311348,
0.07554696500301361,
0.15834437310695648,
-0.03071434237062931,
-0.07724181562662125,
-0.05845135450363159,
-0.005408126395195723,
-0.03739576414227486,
-0.08877959102392197,
-0.04710201919078827,
-0.015914810821413994,
-0.09340687841176987,
0.03925834596157074,
-0.171228289604187,
-0.04185817390680313,
0.05753924325108528,
0.02347845770418644,
-0.05142385512590408,
-0.0012908554635941982,
0.04238496720790863,
-0.08480922877788544,
0.08545280992984772,
-0.06364892423152924,
0.06207273527979851,
-0.04280921816825867,
-0.026104548946022987,
-0.1254023015499115,
-0.0628858208656311,
0.09982342272996902,
0.0031497834715992212,
0.0918683409690857,
-0.004367628134787083,
0.0652507096529007,
0.10985397547483444,
0.10343325138092041,
-0.12427899241447449,
0.19044066965579987,
0.07136984914541245,
-0.1848587542772293,
-0.026939846575260162,
-0.018822165206074715,
-0.026259051635861397,
-0.02224792167544365,
0.01743820682168007,
-0.04199879243969917,
-0.009362732991576195,
-0.0072913444600999355,
0.0770631730556488,
-0.1652826964855194,
0.0327596589922905,
-0.04797869175672531,
0.10909827053546906,
0.06424222141504288,
-0.016172757372260094,
-0.03328465670347214,
-0.07862168550491333,
0.05318992957472801,
-0.03816269338130951,
-0.10810146480798721,
-0.1472659856081009,
-0.169836163520813,
0.055769022554159164,
-0.03790751099586487,
-0.007560225669294596,
-0.0788886770606041,
0.004328232258558273,
0.02935481071472168,
-0.017855456098914146,
-0.044670622795820236,
0.054456956684589386,
-0.041140031069517136,
0.026676148176193237,
-0.01186005026102066,
-0.1103154793381691,
-0.03910595551133156,
0.09028618037700653,
-0.17872866988182068,
-0.12874861061573029
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
#
This model is a fine-tuned version of [hf-test/xls-r-dummy](https://huggingface.co/hf-test/xls-r-dummy) on the COMMON_VOICE - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 248.1278
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a minimal `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2.0
- mixed_precision_training: Native AMP
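
For reference, here is a minimal sketch of how these values map onto the standard `TrainingArguments` API. The `output_dir` and any defaults not listed above are assumptions, not values taken from the original run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./xls-r-ab-test",       # hypothetical path, not from the original run
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,      # 16 * 2 = total train batch size of 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=2.0,
    fp16=True,                          # native AMP mixed-precision training
)
```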
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
| {"language": ["hi"], "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "", "results": []}]} | automatic-speech-recognition | Priyajay/xls-r-ab-test | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"hi",
"dataset:common_voice",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"hi"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hi #dataset-common_voice #endpoints_compatible #region-us
|
#
This model is a fine-tuned version of hf-test/xls-r-dummy on the COMMON_VOICE - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 248.1278
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
| [
"# \n\nThis model is a fine-tuned version of hf-test/xls-r-dummy on the COMMON_VOICE - HI dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 248.1278\n- Wer: 1.0",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.2.dev0\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hi #dataset-common_voice #endpoints_compatible #region-us \n",
"# \n\nThis model is a fine-tuned version of hf-test/xls-r-dummy on the COMMON_VOICE - HI dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 248.1278\n- Wer: 1.0",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.2.dev0\n- Tokenizers 0.11.0"
] | [
61,
57,
6,
12,
8,
3,
140,
4,
39
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hi #dataset-common_voice #endpoints_compatible #region-us \n# \n\nThis model is a fine-tuned version of hf-test/xls-r-dummy on the COMMON_VOICE - HI dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 248.1278\n- Wer: 1.0## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.2.dev0\n- Tokenizers 0.11.0"
] | [
-0.08341717720031738,
0.15325012803077698,
-0.003751954063773155,
0.02155189961194992,
0.11826971173286438,
0.03755778819322586,
0.08689276874065399,
0.14923758804798126,
-0.052193429321050644,
0.10104827582836151,
0.05751317739486694,
0.011545736342668533,
0.08768916130065918,
0.07173722982406616,
-0.00202296394854784,
-0.2662065029144287,
0.01727176457643509,
-0.019353995099663734,
-0.07043951749801636,
0.08312425762414932,
0.11053024977445602,
-0.08429910242557526,
0.026089953258633614,
0.03393318131566048,
-0.11486251652240753,
0.021400118246674538,
-0.054552312940359116,
-0.03302322328090668,
0.08367717266082764,
0.0205696988850832,
0.053998108953237534,
0.025069324299693108,
0.08459159731864929,
-0.2870316505432129,
0.0015407116152346134,
0.07434693723917007,
0.04867718741297722,
0.06716514378786087,
0.055236343294382095,
0.01665360853075981,
0.13586990535259247,
-0.13944464921951294,
0.06628817319869995,
0.05971914157271385,
-0.04303620010614395,
-0.1787620633840561,
-0.08757290244102478,
0.04768161103129387,
0.09658744931221008,
0.10525836050510406,
-0.03088713437318802,
0.13628995418548584,
-0.09924858063459396,
0.04911413788795471,
0.16562198102474213,
-0.23965011537075043,
-0.03881105035543442,
-0.0024659011978656054,
0.042478326708078384,
0.05546950176358223,
-0.10812120884656906,
-0.009498817846179008,
0.038480572402477264,
0.020919444039463997,
0.03603878989815712,
0.016073187813162804,
0.019234605133533478,
0.005603852681815624,
-0.1085682213306427,
-0.04684729501605034,
0.1734490990638733,
0.0918956845998764,
-0.04306669533252716,
-0.13838331401348114,
-0.003824569284915924,
-0.13974696397781372,
-0.018615465611219406,
-0.018086526542901993,
0.012124121189117432,
-0.03981413319706917,
-0.08009232580661774,
-0.002120538614690304,
-0.07511653751134872,
-0.05227818712592125,
0.05860809609293938,
0.10959053784608841,
0.046976011246442795,
-0.023639893159270287,
0.01914266124367714,
0.08631828427314758,
0.03173360601067543,
-0.13028211891651154,
-0.037954047322273254,
0.020636362954974174,
-0.13439956307411194,
-0.03364725410938263,
-0.05103980004787445,
-0.06463713943958282,
0.034950803965330124,
0.09004796296358109,
-0.022497566416859627,
0.09373534470796585,
0.014591461978852749,
-0.008544598706066608,
0.013733436353504658,
0.145758256316185,
-0.03654429316520691,
-0.06867317855358124,
-0.03557015210390091,
0.115688256919384,
0.015047401189804077,
-0.023351771757006645,
-0.0683608278632164,
-0.001398644526489079,
0.12262099236249924,
0.06799612194299698,
-0.05326540395617485,
-0.007671512197703123,
-0.05971253663301468,
-0.02391674555838108,
-0.0030551443342119455,
-0.12511302530765533,
0.04906206950545311,
0.022687966004014015,
-0.03898991271853447,
0.03011421300470829,
-0.015852203592658043,
0.011937733739614487,
-0.06722254306077957,
0.09363657981157303,
-0.0695171132683754,
-0.021113893017172813,
-0.052418630570173264,
-0.053018588572740555,
0.026437420397996902,
-0.021450722590088844,
-0.0020252817776054144,
-0.05163375660777092,
-0.0861436203122139,
-0.05519179254770279,
0.039105311036109924,
-0.0674687847495079,
-0.07363660633563995,
-0.04301553592085838,
-0.021934445947408676,
0.038325078785419464,
-0.00836192537099123,
0.1097891703248024,
-0.044908180832862854,
0.07302950322628021,
0.010867609642446041,
0.007813675329089165,
0.057913076132535934,
0.05138654634356499,
-0.04610159993171692,
0.04587901383638382,
-0.05177998170256615,
0.10358408093452454,
-0.09717971831560135,
0.02150888368487358,
-0.12735165655612946,
-0.11256293952465057,
-0.036907512694597244,
-0.016892438754439354,
0.07017479836940765,
0.1110243871808052,
-0.1729460060596466,
-0.06890708953142166,
0.1360553503036499,
-0.06350523233413696,
-0.0778893455862999,
0.1423318088054657,
-0.02582145854830742,
-0.01809069514274597,
0.04849744215607643,
0.14103595912456512,
0.11682181060314178,
-0.11833242326974869,
-0.04934287071228027,
-0.019015949219465256,
0.08793448656797409,
0.05966721847653389,
0.0838189572095871,
-0.04655557870864868,
-0.0028904718346893787,
-0.00023689528461545706,
-0.0063665262423455715,
0.009497893042862415,
-0.05752669274806976,
-0.07677921652793884,
-0.02763894572854042,
-0.06959283351898193,
0.008655651472508907,
0.03194214031100273,
0.016828950494527817,
-0.08353335410356522,
-0.15624858438968658,
0.05892438068985939,
0.1456356942653656,
-0.08029700070619583,
0.01528516598045826,
-0.09102347493171692,
0.018556147813796997,
-0.0344439372420311,
-0.010182023048400879,
-0.16343770921230316,
-0.02788931131362915,
0.0591932088136673,
-0.09787891805171967,
0.054183002561330795,
0.018193423748016357,
0.058158691972494125,
0.04027536138892174,
-0.02981354482471943,
-0.03236972913146019,
-0.09176740795373917,
0.008639943785965443,
-0.06301627308130264,
-0.16142107546329498,
-0.07019725441932678,
-0.01749889925122261,
0.25921186804771423,
-0.22216694056987762,
-0.018399221822619438,
0.03845192492008209,
0.14105580747127533,
0.010321887210011482,
-0.07317845523357391,
0.0009628571569919586,
0.042917679995298386,
-0.005131920333951712,
-0.07354513555765152,
0.01501677930355072,
0.012648972682654858,
-0.1292027086019516,
-0.043745215982198715,
-0.1446453332901001,
-0.037366777658462524,
0.0777953714132309,
0.1012890636920929,
-0.06090174615383148,
-0.06935086846351624,
-0.05944085121154785,
-0.052293021231889725,
-0.05663362890481949,
-0.03990127891302109,
0.1830863356590271,
0.044357117265462875,
0.10050856322050095,
-0.049463022500276566,
-0.06219712272286415,
0.01446547731757164,
0.032685574144124985,
-0.031938064843416214,
0.07736364752054214,
0.03071628324687481,
-0.11780132353305817,
0.05053659528493881,
0.041895925998687744,
-0.05334528535604477,
0.14803101122379303,
-0.06346380710601807,
-0.11682342737913132,
-0.029568282887339592,
0.009379358030855656,
0.01609351858496666,
0.10842225700616837,
-0.16211625933647156,
0.010352238081395626,
0.034576889127492905,
0.010392218828201294,
0.03911098092794418,
-0.1540081948041916,
0.020186930894851685,
0.04516521841287613,
-0.02712327428162098,
-0.044410932809114456,
-0.016933191567659378,
0.008700053207576275,
0.053609952330589294,
0.027911020442843437,
0.0012227559927850962,
-0.007779441773891449,
-0.04246826842427254,
-0.09116523712873459,
0.1487172544002533,
-0.100424624979496,
-0.1681789755821228,
-0.13847990334033966,
0.020350605249404907,
-0.0325905941426754,
-0.036706868559122086,
0.031909532845020294,
-0.1142163872718811,
-0.06548819690942764,
-0.0885014757514,
-0.025608466938138008,
-0.07449591159820557,
0.0066451714374125,
0.06809083372354507,
0.007194267585873604,
0.09225189685821533,
-0.1250930279493332,
0.015591476112604141,
0.0006088431109674275,
-0.030959393829107285,
-0.02246168814599514,
0.02331271581351757,
0.08674857020378113,
0.14227674901485443,
0.014524713158607483,
0.04336485639214516,
-0.015512277372181416,
0.21414373815059662,
-0.11505391448736191,
-0.0242707971483469,
0.0877644419670105,
0.004022752866148949,
0.04743681848049164,
0.10416707396507263,
0.033752817660570145,
-0.06778131425380707,
0.023411283269524574,
0.06242213025689125,
-0.006887841038405895,
-0.2368106245994568,
-0.052824728190898895,
-0.060631196945905685,
-0.09994146227836609,
0.11288104951381683,
0.05953288450837135,
0.00026669466751627624,
0.0231198500841856,
-0.03148074448108673,
0.032133638858795166,
-0.00993820559233427,
0.07304900139570236,
0.07501450926065445,
0.049851272255182266,
0.06848343461751938,
-0.02319224737584591,
-0.0371956005692482,
0.04796841740608215,
0.011691669933497906,
0.2261805683374405,
0.0032678854186087847,
0.17359313368797302,
0.02246876247227192,
0.1088283360004425,
-0.0031925735529512167,
0.03156011179089546,
0.012893671169877052,
-0.011536678299307823,
0.027372656390070915,
-0.07091265171766281,
-0.04078024625778198,
0.046320367604494095,
0.11645389348268509,
0.02273900993168354,
-0.08469385653734207,
0.027501212432980537,
0.012971495278179646,
0.2890952527523041,
0.09682842344045639,
-0.25391024351119995,
-0.06994055956602097,
0.02317040041089058,
-0.056038711220026016,
-0.0703231617808342,
0.0100575415417552,
0.09439200907945633,
-0.11858277767896652,
0.07472274452447891,
-0.04863070696592331,
0.09005410969257355,
-0.07488532364368439,
0.006773774977773428,
0.036709874868392944,
0.1110859215259552,
-0.007572571747004986,
0.09605230391025543,
-0.16461965441703796,
0.1810862421989441,
0.008529946208000183,
0.10370713472366333,
-0.07457102090120316,
0.04336649179458618,
0.012637598440051079,
0.00025511396233923733,
0.0865107998251915,
0.011552362702786922,
-0.06135019287467003,
-0.15404029190540314,
-0.08284568041563034,
0.046361349523067474,
0.12498157471418381,
-0.028513234108686447,
0.09723418205976486,
-0.056017421185970306,
-0.006452325731515884,
0.034325018525123596,
-0.03999411314725876,
-0.1633152812719345,
-0.18457864224910736,
0.05077317729592323,
0.04329749196767807,
0.060426775366067886,
-0.08511810004711151,
-0.08750276267528534,
-0.03140167146921158,
0.20863428711891174,
-0.009178063832223415,
-0.026677843183279037,
-0.13038325309753418,
0.0892273485660553,
0.15023943781852722,
-0.044710103422403336,
-0.0042004669085145,
0.02662285417318344,
0.16530992090702057,
0.01872149109840393,
-0.034512490034103394,
0.04001719504594803,
-0.044677622616291046,
-0.13851593434810638,
-0.04827362298965454,
0.1881929636001587,
0.033175982534885406,
0.07582759857177734,
-0.0007090390427038074,
0.004144923761487007,
0.005022156983613968,
-0.07612147182226181,
0.05041274055838585,
0.0450480617582798,
0.01520830113440752,
0.08137589693069458,
-0.027468370273709297,
0.04623853787779808,
-0.07392973452806473,
-0.041196707636117935,
0.15491247177124023,
0.2416054755449295,
-0.05922995135188103,
0.06884443014860153,
0.0657319501042366,
-0.044997308403253555,
-0.13147364556789398,
-0.0017491632606834173,
0.1292715072631836,
0.05197210609912872,
0.025616833940148354,
-0.18707576394081116,
0.06739030033349991,
0.11170508712530136,
-0.015178674831986427,
0.022772593423724174,
-0.2895640730857849,
-0.11081402748823166,
0.08189014345407486,
0.05915471538901329,
-0.03056742623448372,
-0.13283546268939972,
-0.06373261660337448,
-0.07827544957399368,
-0.134263813495636,
0.06521912664175034,
-0.029689250513911247,
0.11854086071252823,
0.004733546636998653,
0.06947986036539078,
0.045535482466220856,
-0.03975982591509819,
0.16141989827156067,
0.016272269189357758,
0.029376305639743805,
-0.007903408259153366,
0.05131326988339424,
0.020961860194802284,
-0.0669628456234932,
0.05382634699344635,
-0.09961254149675369,
0.04177216440439224,
-0.1466103196144104,
-0.02668238990008831,
-0.05769365653395653,
0.030664870515465736,
-0.04148014262318611,
-0.03879022225737572,
-0.03720245510339737,
0.05149112641811371,
0.08346892893314362,
-0.013472861610352993,
0.077018603682518,
-0.01441703736782074,
0.08626983314752579,
0.09057588130235672,
0.11217348277568817,
-0.026419861242175102,
-0.12576812505722046,
-0.010906117036938667,
-0.00910111889243126,
0.05178523808717728,
-0.08738202601671219,
0.041209857910871506,
0.11224447190761566,
0.0573376789689064,
0.172520712018013,
0.02457072213292122,
-0.11466223746538162,
0.007656881585717201,
0.057613059878349304,
-0.04892101511359215,
-0.20166902244091034,
-0.04739029332995415,
0.0366668775677681,
-0.15741243958473206,
-0.018300505355000496,
0.09801819920539856,
-0.03572836518287659,
-0.016324210911989212,
0.005490696523338556,
0.020054761320352554,
-0.036270786076784134,
0.1805972009897232,
0.000630925758741796,
0.09400636702775955,
-0.08255013078451157,
0.0896851122379303,
0.08803058415651321,
-0.10953676700592041,
0.06628573685884476,
0.05911342427134514,
-0.05424237623810768,
-0.010330227203667164,
0.0024412504862993956,
0.06367499381303787,
0.03562319278717041,
-0.053911082446575165,
-0.06700710952281952,
-0.12079624831676483,
0.03727732226252556,
-0.024853667244315147,
0.01727289706468582,
0.005312537308782339,
-0.03869234770536423,
0.028040271252393723,
-0.16496378183364868,
0.08502152562141418,
0.050423283129930496,
0.06014794856309891,
-0.13854707777500153,
0.06040617451071739,
0.010798093862831593,
0.02216316945850849,
0.0041765859350562096,
-0.031131992116570473,
-0.07119240611791611,
-0.008497939445078373,
-0.1484147012233734,
-0.02390504814684391,
-0.050654370337724686,
-0.0006398300174623728,
0.015011579729616642,
-0.022667499259114265,
-0.04761113226413727,
0.05387268215417862,
-0.07533666491508484,
-0.0890759751200676,
-0.0030543257016688585,
0.06981921195983887,
-0.09330201894044876,
0.02648976258933544,
0.046690016984939575,
-0.13378888368606567,
0.08518000692129135,
0.056135307997465134,
0.019755087792873383,
0.035909611731767654,
-0.06501419097185135,
-0.03857668489217758,
0.03798999637365341,
0.02701535075902939,
0.054798103868961334,
-0.1461436003446579,
-0.01547156646847725,
-0.011601549573242664,
0.017768435180187225,
0.0016389896627515554,
0.02101149968802929,
-0.10140635818243027,
-0.045524369925260544,
-0.07655005156993866,
-0.03841201215982437,
-0.06436989456415176,
0.04480545595288277,
0.07967469096183777,
0.04056229442358017,
0.16509315371513367,
-0.06931944191455841,
0.055667340755462646,
-0.2001916468143463,
-0.019646810367703438,
-0.010548489168286324,
0.002595669822767377,
-0.05017326772212982,
-0.02459867112338543,
0.07241303473711014,
-0.053621433675289154,
0.10888613015413284,
-0.0627489686012268,
0.08784759044647217,
0.041074156761169434,
-0.04242502897977829,
-0.019131727516651154,
-0.007180488668382168,
0.2140723317861557,
0.08320224285125732,
-0.021555982530117035,
0.07715719193220139,
-0.06460092216730118,
0.04975233972072601,
0.10553473979234695,
0.12325027585029602,
0.14054036140441895,
0.018390895798802376,
0.06145342439413071,
0.07084329426288605,
-0.10545121133327484,
-0.14643500745296478,
0.15106531977653503,
-0.027320491150021553,
0.1248762235045433,
-0.01696431264281273,
0.19646543264389038,
0.11989089846611023,
-0.1674308031797409,
0.054170846939086914,
-0.06284169107675552,
-0.11312450468540192,
-0.09728958457708359,
-0.08424203842878342,
-0.08437322080135345,
-0.14848074316978455,
0.03498433157801628,
-0.09804552048444748,
0.04136380925774574,
0.06936254352331161,
0.027410222217440605,
0.014167745597660542,
0.11808592081069946,
-0.016891546547412872,
-0.007800095248967409,
0.0851714089512825,
-0.009969866834580898,
-0.023695887997746468,
-0.015702856704592705,
-0.04415776953101158,
0.07771540433168411,
-0.009722116403281689,
0.1115569919347763,
-0.027992382645606995,
-0.025281134992837906,
0.052532121539115906,
0.00392823526635766,
-0.09399066120386124,
0.03290684148669243,
-0.016481535509228706,
0.03665830194950104,
0.07781032472848892,
0.05960186570882797,
-0.005204246379435062,
-0.0523533821105957,
0.2166403979063034,
-0.07451806962490082,
-0.06755126267671585,
-0.1471407413482666,
0.2073506861925125,
0.03548356890678406,
-0.001725930953398347,
0.063084177672863,
-0.12962056696414948,
-0.013094110414385796,
0.11266741156578064,
0.11899545788764954,
-0.03038598783314228,
-0.01490313932299614,
-0.009271190501749516,
-0.017740368843078613,
-0.08371885120868683,
0.09184269607067108,
0.08683104813098907,
0.033726271241903305,
-0.016075650230050087,
0.05055355653166771,
-0.020071472972631454,
-0.069420725107193,
-0.0614485889673233,
0.089080311357975,
0.028010226786136627,
-0.008125650696456432,
-0.012936306186020374,
0.0991349071264267,
-0.004748768173158169,
-0.18927989900112152,
0.009973659180104733,
-0.13392801582813263,
-0.18780185282230377,
-0.019894365221261978,
0.05973171442747116,
-0.0031047877855598927,
0.057737138122320175,
-0.017142046242952347,
-0.02013324573636055,
0.14508825540542603,
-0.0011949710315093398,
-0.04739506170153618,
-0.07217156887054443,
0.09027499705553055,
-0.05372103676199913,
0.1844908595085144,
0.013197698630392551,
0.06808653473854065,
0.10331448167562485,
0.04640164598822594,
-0.11645758152008057,
0.057254232466220856,
0.07903730869293213,
-0.11781343817710876,
0.050800811499357224,
0.22074462473392487,
-0.037459563463926315,
0.11654038727283478,
0.05762404203414917,
-0.1066465899348259,
-0.020257916301488876,
-0.08014465123414993,
0.003236524760723114,
-0.07830148935317993,
-0.007477761711925268,
-0.047279294580221176,
0.15683569014072418,
0.1954246610403061,
-0.052203401923179626,
-0.015625668689608574,
-0.059870585799217224,
0.01432496216148138,
0.04823783040046692,
0.14056162536144257,
-0.03556344658136368,
-0.2065318524837494,
0.020312193781137466,
-0.018731215968728065,
0.02608584426343441,
-0.2626911997795105,
-0.08554191142320633,
0.037492524832487106,
-0.05071967467665672,
-0.019778024405241013,
0.1280897855758667,
0.06769582629203796,
0.024298349395394325,
-0.05078626051545143,
-0.12260357290506363,
-0.028606124222278595,
0.14627951383590698,
-0.1714116930961609,
-0.04294608533382416
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
#
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the COMMON_VOICE - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 26.7866
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
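As a minimal usage sketch, the checkpoint can typically be loaded with the `transformers` automatic-speech-recognition pipeline; the repo id below is taken from this card's metadata, and the audio path is a placeholder for any 16 kHz mono recording:

```python
from transformers import pipeline

# Repo id from this card's metadata; the audio file name is a placeholder.
asr = pipeline("automatic-speech-recognition", model="Priyajay/xls-r-kn-test")

result = asr("hindi_utterance.wav")  # expects a 16 kHz mono audio file
print(result["text"])
```

Given the reported WER of 1.0, transcriptions from this checkpoint should be treated as a sanity check of the loading path rather than usable output.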
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2.0
- mixed_precision_training: Native AMP
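For reference, a hedged sketch of how the hyperparameters above would typically map onto `transformers.TrainingArguments`; the output directory is a placeholder, and this is an illustration rather than the exact training script:

```python
from transformers import TrainingArguments

# Mirrors the list above: effective train batch size is 16 * 2 = 32.
training_args = TrainingArguments(
    output_dir="./wav2vec2-xlsr-hi",   # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=2.0,
    fp16=True,                         # "Native AMP" mixed precision
)
```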
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
| {"language": ["hi"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "", "results": []}]} | automatic-speech-recognition | Priyajay/xls-r-kn-test | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"hi",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"hi"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hi #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
#
This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the COMMON_VOICE - HI dataset.
It achieves the following results on the evaluation set:
- Loss: 26.7866
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
| [
"# \n\nThis model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the COMMON_VOICE - HI dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 26.7866\n- Wer: 1.0",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.2.dev0\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hi #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"# \n\nThis model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the COMMON_VOICE - HI dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 26.7866\n- Wer: 1.0",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.2.dev0\n- Tokenizers 0.11.0"
] | [
69,
60,
6,
12,
8,
3,
140,
4,
39
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hi #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n# \n\nThis model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the COMMON_VOICE - HI dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 26.7866\n- Wer: 1.0## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.2.dev0\n- Tokenizers 0.11.0"
] | [
-0.10594671219587326,
0.1762852519750595,
-0.004268857650458813,
0.01738222874701023,
0.09781741350889206,
0.008929585106670856,
0.05614690110087395,
0.14507533609867096,
-0.011118589900434017,
0.10897397994995117,
0.06574402004480362,
-0.019276494160294533,
0.0933857411146164,
0.1386021375656128,
-0.008586511015892029,
-0.22303132712841034,
0.013912511989474297,
-0.049628548324108124,
-0.05958135798573494,
0.08651218563318253,
0.11064866930246353,
-0.06102324277162552,
0.0315234512090683,
0.023523341864347458,
-0.09697557985782623,
0.017338154837489128,
-0.052283111959695816,
-0.05219914764165878,
0.08688425272703171,
0.030721746385097504,
0.026800254359841347,
0.04400602728128433,
0.0966903567314148,
-0.2849043607711792,
0.0018454064847901464,
0.08038295060396194,
0.037706561386585236,
0.07004411518573761,
0.08219338208436966,
-0.015697088092565536,
0.10608858615159988,
-0.13818329572677612,
0.07212968170642853,
0.06953977793455124,
-0.057455431669950485,
-0.17361825704574585,
-0.09581474214792252,
0.09832713752985,
0.1336611658334732,
0.08392998576164246,
-0.026200029999017715,
0.07343161106109619,
-0.06987085938453674,
0.05313766747713089,
0.18519960343837738,
-0.2422497719526291,
-0.03587765619158745,
-0.013690733350813389,
0.05083347111940384,
0.014198405668139458,
-0.09545499086380005,
0.024007640779018402,
0.05101883038878441,
0.014470646157860756,
0.05008230730891228,
0.0009922542376443744,
0.009073036722838879,
-0.0166831836104393,
-0.1133350282907486,
-0.029442913830280304,
0.14206834137439728,
0.09715738147497177,
-0.059086672961711884,
-0.14359278976917267,
-0.00007644003198947757,
-0.10322742909193039,
0.00390221131965518,
-0.046657439321279526,
-0.0003496869176160544,
-0.030143357813358307,
-0.05640268698334694,
-0.03778132423758507,
-0.06747877597808838,
-0.06477437913417816,
0.06539211422204971,
0.09883258491754532,
0.028152989223599434,
-0.01664387620985508,
0.017347201704978943,
0.09156104922294617,
0.021318849176168442,
-0.10985162854194641,
-0.01359948143362999,
-0.0014005652628839016,
-0.1575213521718979,
-0.029033683240413666,
-0.02823757752776146,
-0.03731950372457504,
0.017127173021435738,
0.09877274185419083,
-0.0029205854516476393,
0.09417660534381866,
0.008853220380842686,
-0.003277356270700693,
0.03861788660287857,
0.12550005316734314,
-0.03431003540754318,
-0.11656338721513748,
-0.030855786055326462,
0.10639986395835876,
0.0010265076998621225,
-0.030290523543953896,
-0.06515979021787643,
-0.012168542481958866,
0.09492812305688858,
0.08527695387601852,
-0.008607638068497181,
-0.017310045659542084,
-0.08118384331464767,
-0.02712099254131317,
0.0291151013225317,
-0.12561842799186707,
0.05787103623151779,
0.02722199074923992,
-0.037022560834884644,
0.012997759506106377,
0.021145738661289215,
0.009526251815259457,
-0.06643828749656677,
0.07027190178632736,
-0.031309813261032104,
-0.03626612201333046,
-0.022315843030810356,
-0.02676943875849247,
0.028660092502832413,
-0.0510425791144371,
-0.028063939884305,
-0.0626855194568634,
-0.1052432656288147,
-0.06947940587997437,
0.04453040659427643,
-0.09229633212089539,
-0.0864739865064621,
-0.03789139911532402,
-0.014769143424928188,
0.04080205038189888,
-0.032277218997478485,
0.11583124101161957,
-0.035807348787784576,
0.058408934623003006,
-0.02226383052766323,
0.014367103576660156,
0.08727999031543732,
0.06021130457520485,
-0.027612553909420967,
0.05175016075372696,
-0.09173794090747833,
0.13368771970272064,
-0.10513516515493393,
-0.015415950678288937,
-0.1586826592683792,
-0.08530918508768082,
-0.01163433026522398,
-0.019478218629956245,
0.08295419812202454,
0.14092138409614563,
-0.18825757503509521,
-0.06697448343038559,
0.11645898222923279,
-0.08493629097938538,
-0.060853637754917145,
0.13860958814620972,
-0.008242168463766575,
-0.015988918021321297,
0.04862891882658005,
0.16189266741275787,
0.12077798694372177,
-0.1483187973499298,
-0.01550592016428709,
0.005685143172740936,
0.08169261366128922,
0.07232145965099335,
0.06485091149806976,
-0.07761973142623901,
-0.0018613296560943127,
-0.0022488206159323454,
-0.022464225068688393,
0.012216492556035519,
-0.057484015822410583,
-0.06875874847173691,
-0.01798989251255989,
-0.07853671908378601,
0.0401238352060318,
0.004748999141156673,
-0.0178043395280838,
-0.06720026582479477,
-0.14135655760765076,
0.014333656057715416,
0.13768719136714935,
-0.06332242488861084,
0.021288055926561356,
-0.0726957619190216,
0.026921743527054787,
-0.004219846799969673,
-0.010419238358736038,
-0.162804514169693,
-0.023407792672514915,
0.05456941947340965,
-0.10811042785644531,
0.043981585651636124,
0.013745589181780815,
0.04380618408322334,
0.03608810156583786,
-0.03112831339240074,
-0.03995868191123009,
-0.07972526550292969,
0.01036986242979765,
-0.058433517813682556,
-0.19034522771835327,
-0.08252625912427902,
-0.042387597262859344,
0.22993339598178864,
-0.19928134977817535,
-0.02291283942759037,
0.048888348042964935,
0.15597668290138245,
0.012118070386350155,
-0.07783439010381699,
0.03155842423439026,
0.036757875233888626,
0.02039450593292713,
-0.08335719257593155,
0.015547801740467548,
0.014518984593451023,
-0.14781956374645233,
-0.0015789136523380876,
-0.12140893936157227,
-0.028614720329642296,
0.0478907972574234,
0.11960289627313614,
-0.10027223080396652,
-0.07654928416013718,
-0.05223437398672104,
-0.050701577216386795,
-0.06611282378435135,
-0.030738692730665207,
0.2332187443971634,
0.06711939722299576,
0.08659500628709793,
-0.04536544531583786,
-0.06421792507171631,
0.017856886610388756,
0.018962174654006958,
-0.04503719136118889,
0.07743538171052933,
0.04158047214150429,
-0.1284555196762085,
0.03813064843416214,
0.0594324916601181,
0.014348622411489487,
0.13149850070476532,
-0.036619752645492554,
-0.0954236388206482,
-0.03135921061038971,
0.011414659209549427,
-0.01180332899093628,
0.12143837660551071,
-0.11788269132375717,
-0.004508639220148325,
0.03060908429324627,
0.012892757542431355,
0.03247307986021042,
-0.12665703892707825,
0.029582180082798004,
0.03419670835137367,
-0.0465969443321228,
-0.028997380286455154,
-0.02294234186410904,
0.014834577217698097,
0.058171480894088745,
0.03025161847472191,
-0.014615535736083984,
-0.006776740308851004,
-0.0431484691798687,
-0.10402890294790268,
0.14123976230621338,
-0.11155466735363007,
-0.2050706297159195,
-0.11792843043804169,
0.05911850929260254,
-0.02955031767487526,
-0.03473864495754242,
0.022975383326411247,
-0.11563262343406677,
-0.06070387735962868,
-0.07545119524002075,
0.02061145380139351,
-0.06789432466030121,
0.007235540077090263,
0.070454902946949,
0.004890501033514738,
0.10012999176979065,
-0.11477188020944595,
0.013120854273438454,
-0.00344772357493639,
-0.030602063983678818,
-0.03811941668391228,
0.046613797545433044,
0.06023170053958893,
0.12851589918136597,
0.03884543105959892,
0.03740116208791733,
-0.04628033563494682,
0.17767593264579773,
-0.12060929089784622,
0.022326605394482613,
0.1038174256682396,
-0.006933055352419615,
0.041662126779556274,
0.1046588197350502,
0.005323907360434532,
-0.08847752958536148,
0.024095458909869194,
0.05243779718875885,
-0.009621542878448963,
-0.256664901971817,
-0.06919435411691666,
-0.04183020070195198,
-0.05208573862910271,
0.1038610115647316,
0.05349329486489296,
-0.024592317640781403,
0.013290954753756523,
-0.03646238148212433,
-0.018908105790615082,
0.009698707610368729,
0.05781017243862152,
0.060858942568302155,
0.024041542783379555,
0.06919705867767334,
-0.012509047985076904,
0.013234632089734077,
0.0725037157535553,
0.017908737063407898,
0.20296993851661682,
0.0026516481302678585,
0.12142716348171234,
0.023258721455931664,
0.11939433217048645,
-0.027666592970490456,
0.011426233686506748,
0.025623589754104614,
0.008742555975914001,
0.013406331650912762,
-0.0569370836019516,
-0.05776898190379143,
0.04301034286618233,
0.1328487992286682,
-0.014004020020365715,
-0.10318509489297867,
0.04197729378938675,
0.015245339833199978,
0.32107630372047424,
0.09113902598619461,
-0.20644864439964294,
-0.06858542561531067,
0.013815918006002903,
-0.0589946024119854,
-0.06341593712568283,
0.020485233515501022,
0.11378762125968933,
-0.146101176738739,
0.11353372782468796,
-0.03472111374139786,
0.08877310901880264,
-0.07985413074493408,
-0.00003933286279789172,
0.02593960426747799,
0.0909295603632927,
0.008562454953789711,
0.07318700850009918,
-0.13943423330783844,
0.20240801572799683,
0.010027834214270115,
0.08921004086732864,
-0.05273234471678734,
0.058226194232702255,
-0.008126931264996529,
0.005026534199714661,
0.11722952872514725,
0.01895509473979473,
-0.06930282711982727,
-0.11094333976507187,
-0.11123618483543396,
0.020738132297992706,
0.12006692588329315,
-0.06590937077999115,
0.05804136395454407,
-0.041912514716386795,
-0.018345342949032784,
0.012689716182649136,
-0.055965524166822433,
-0.16942420601844788,
-0.1630261242389679,
0.03869178518652916,
0.017454346641898155,
0.05117907375097275,
-0.08054477721452713,
-0.07605185359716415,
-0.027716804295778275,
0.23208710551261902,
-0.03934682533144951,
-0.028732864186167717,
-0.14296476542949677,
0.06966260075569153,
0.1535664200782776,
-0.048456985503435135,
0.012898406945168972,
0.033916499465703964,
0.14190305769443512,
-0.0029992663767188787,
-0.053488727658987045,
0.05320635065436363,
-0.07950840890407562,
-0.1604112833738327,
-0.055495135486125946,
0.21464918553829193,
0.06284360587596893,
0.05747845768928528,
0.010033288039267063,
-0.003997565247118473,
0.0362551249563694,
-0.08095046877861023,
0.08932413160800934,
0.07532399892807007,
0.0015011688228696585,
0.0695396289229393,
-0.007131601218134165,
-0.006594696547836065,
-0.06999526172876358,
-0.054256729781627655,
0.1350764036178589,
0.2590327560901642,
-0.08483942598104477,
0.14854717254638672,
0.11451475322246552,
-0.06536206603050232,
-0.13156765699386597,
0.005534857511520386,
0.1418103277683258,
0.042097385972738266,
0.03124949336051941,
-0.19797010719776154,
0.04664970189332962,
0.08579614758491516,
-0.021933531388640404,
-0.008454212918877602,
-0.2846013307571411,
-0.11947927623987198,
0.0946110412478447,
0.012861869297921658,
-0.019341537728905678,
-0.1017414927482605,
-0.06398778408765793,
-0.05744832009077072,
-0.1173490509390831,
0.04424464330077171,
-0.02129717543721199,
0.09156867116689682,
0.024790672585368156,
0.060230642557144165,
0.043112318962812424,
-0.016672473400831223,
0.14258620142936707,
0.02917001210153103,
0.019106905907392502,
-0.009569771587848663,
0.0728072002530098,
-0.025814106687903404,
-0.06550046056509018,
0.05926760658621788,
-0.06565945595502853,
0.04388827458024025,
-0.14224499464035034,
-0.02201351523399353,
-0.05929983779788017,
0.061682332307100296,
-0.04778645932674408,
-0.03939393162727356,
-0.022786937654018402,
0.055040862411260605,
0.07237111777067184,
-0.0227641724050045,
0.005331246182322502,
-0.02872614748775959,
0.08756286650896072,
0.1341262012720108,
0.11018358170986176,
-0.0034586673136800528,
-0.14169012010097504,
-0.004915706813335419,
-0.033560611307621,
0.052152302116155624,
-0.056594058871269226,
0.03450537100434303,
0.09364195168018341,
0.050361357629299164,
0.15387657284736633,
-0.007457464933395386,
-0.11442960798740387,
0.003709750948473811,
0.03701339662075043,
-0.07242502272129059,
-0.17421720921993256,
-0.025450900197029114,
0.03357644006609917,
-0.11962513625621796,
-0.005475807934999466,
0.13328814506530762,
-0.020107578486204147,
-0.02823597379028797,
-0.008128700777888298,
0.0434536449611187,
-0.030930496752262115,
0.16973187029361725,
0.0006358000100590289,
0.09274119138717651,
-0.0712498351931572,
0.1355409175157547,
0.07850462943315506,
-0.1269637495279312,
0.1036284789443016,
0.048917632550001144,
-0.05503386631608009,
-0.013517465442419052,
0.01244449708610773,
0.06375771760940552,
0.040655288845300674,
-0.054301582276821136,
-0.03241679444909096,
-0.11979851871728897,
0.051227305084466934,
0.03050730749964714,
0.0031482160557061434,
-0.006616577040404081,
-0.028034372255206108,
0.020440978929400444,
-0.12155124545097351,
0.06871750205755234,
0.08167408406734467,
0.029280394315719604,
-0.1345287561416626,
0.0715688019990921,
0.020821355283260345,
0.0338212288916111,
-0.006068024318665266,
-0.027032753452658653,
-0.07313929498195648,
-0.0061072236858308315,
-0.12464309483766556,
-0.022705987095832825,
-0.04266175627708435,
0.007227946072816849,
-0.0036152193788439035,
-0.03938358277082443,
-0.025331715121865273,
0.05042039602994919,
-0.07172925025224686,
-0.07643420994281769,
-0.010306792333722115,
0.06890100985765457,
-0.1269676238298416,
0.0076461839489638805,
0.04088134691119194,
-0.12053937464952469,
0.10108251124620438,
0.06433489918708801,
0.018240157514810562,
0.007395678665488958,
-0.08398546278476715,
-0.031945664435625076,
0.029454203322529793,
0.015985844656825066,
0.03893057256937027,
-0.17545902729034424,
-0.015323041938245296,
-0.02133261039853096,
-0.0016773046227172017,
0.0026653013192117214,
0.055836793035268784,
-0.09863896667957306,
-0.05264865607023239,
-0.0635698214173317,
-0.041582632809877396,
-0.05855194851756096,
0.049554500728845596,
0.09264706820249557,
0.05486493930220604,
0.12023846060037613,
-0.0859907791018486,
0.07026716321706772,
-0.18151527643203735,
-0.021336769685149193,
-0.022227194160223007,
0.033385537564754486,
-0.03003115952014923,
-0.018084179610013962,
0.09169412404298782,
-0.032980553805828094,
0.09446006268262863,
-0.06083314120769501,
0.10614224523305893,
0.03705386444926262,
-0.09947694092988968,
-0.04279648885130882,
0.008455599658191204,
0.10259246081113815,
0.05637624114751816,
-0.00857520755380392,
0.06529399752616882,
-0.06273753941059113,
0.044850967824459076,
0.09962943941354752,
0.1354137808084488,
0.12772782146930695,
0.041468553245067596,
0.05910130962729454,
0.06562463194131851,
-0.14555445313453674,
-0.13969899713993073,
0.12201152741909027,
-0.050184883177280426,
0.1318105310201645,
-0.040032822638750076,
0.16192805767059326,
0.08425815403461456,
-0.17027433216571808,
0.067524753510952,
-0.045764606446027756,
-0.09661387652158737,
-0.10501538217067719,
-0.08081448078155518,
-0.06411314010620117,
-0.13533726334571838,
0.03288597613573074,
-0.08144073188304901,
0.04374798759818077,
0.011971999891102314,
0.023142213001847267,
0.022693445906043053,
0.1004515066742897,
-0.03800436109304428,
-0.029869558289647102,
0.10943183302879333,
0.01800399087369442,
-0.011650712229311466,
-0.03012067824602127,
-0.033563755452632904,
0.08212011307477951,
0.031173573806881905,
0.09485599398612976,
-0.014411653392016888,
-0.026913994923233986,
0.05884164944291115,
0.016361048445105553,
-0.08833218365907669,
0.0225836094468832,
-0.020345309749245644,
0.04284864664077759,
0.10873247683048248,
0.07638873904943466,
-0.0018171683186665177,
-0.050587113946676254,
0.2350982427597046,
-0.08671512454748154,
-0.0391090102493763,
-0.16060853004455566,
0.18495583534240723,
0.0140455923974514,
0.000019710920241777785,
0.05122583359479904,
-0.11740565299987793,
-0.02305184304714203,
0.09269771724939346,
0.1423298567533493,
-0.026237063109874725,
-0.007630969863384962,
-0.024476604536175728,
-0.01396619901061058,
-0.051589567214250565,
0.09255359321832657,
0.08305034786462784,
0.05090921372175217,
-0.02275293879210949,
0.06045018509030342,
-0.009018330834805965,
-0.08017738163471222,
-0.051206786185503006,
0.12372391670942307,
-0.006572796497493982,
0.0024921551812440157,
-0.016666831448674202,
0.11041457951068878,
0.0020103512797504663,
-0.25821468234062195,
-0.007593339774757624,
-0.15177126228809357,
-0.18444480001926422,
-0.016007760539650917,
0.06182697042822838,
0.021971236914396286,
0.0650668740272522,
-0.0011866447748616338,
-0.015010219067335129,
0.1758124977350235,
0.003609035164117813,
-0.03089791163802147,
-0.10629405081272125,
0.07760144770145416,
-0.067221999168396,
0.1803697943687439,
0.011511574499309063,
0.05059773474931717,
0.08205648511648178,
0.04019299894571304,
-0.13222895562648773,
0.011002453975379467,
0.0771327018737793,
-0.08238405734300613,
0.04929795116186142,
0.21930362284183502,
-0.04352535307407379,
0.15033845603466034,
0.06655136495828629,
-0.08699017763137817,
-0.022915052250027657,
-0.09459612518548965,
-0.0007226294837892056,
-0.0668693333864212,
0.031873635947704315,
-0.049985770136117935,
0.15413835644721985,
0.15803202986717224,
-0.06025196984410286,
-0.014707989990711212,
-0.07282142341136932,
0.019461510702967644,
0.02771635353565216,
0.12617139518260956,
-0.01551437471061945,
-0.17925482988357544,
0.029042664915323257,
-0.0037769575137645006,
0.05782261863350868,
-0.21187813580036163,
-0.09429250657558441,
0.041164521127939224,
-0.03684186562895775,
-0.03451872244477272,
0.12291349470615387,
0.012826063670217991,
0.015662044286727905,
-0.03838875889778137,
-0.10956410318613052,
-0.0077508785761892796,
0.1300090104341507,
-0.16280241310596466,
-0.021976271644234657
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-bne-finetuned-amazon_reviews_multi
This model is a fine-tuned version of [BSC-TeMU/roberta-base-bne](https://huggingface.co/BSC-TeMU/roberta-base-bne) on the amazon_reviews_multi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3011
- Accuracy: 0.9185
## Model description
More information needed
## Intended uses & limitations
More information needed
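As a minimal usage sketch, the checkpoint can typically be loaded with the `transformers` text-classification pipeline; the repo id below is taken from this card's metadata, and the example review is a placeholder:

```python
from transformers import pipeline

# Repo id from this card's metadata; the input sentence is a placeholder.
classifier = pipeline(
    "text-classification",
    model="Proggleb/roberta-base-bne-finetuned-amazon_reviews_multi",
)

print(classifier("El producto llegó a tiempo y funciona perfectamente."))
```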
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2427 | 1.0 | 125 | 0.2109 | 0.919 |
| 0.0986 | 2.0 | 250 | 0.3011 | 0.9185 |
### Framework versions
- Transformers 4.9.2
- Pytorch 1.9.0+cu102
- Datasets 1.11.0
- Tokenizers 0.10.3
| {"license": "cc-by-4.0", "tags": ["generated_from_trainer"], "datasets": ["amazon_reviews_multi"], "metrics": ["accuracy"], "model_index": [{"name": "roberta-base-bne-finetuned-amazon_reviews_multi", "results": [{"task": {"name": "Text Classification", "type": "text-classification"}, "dataset": {"name": "amazon_reviews_multi", "type": "amazon_reviews_multi", "args": "es"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.9185}}]}]} | text-classification | Proggleb/roberta-base-bne-finetuned-amazon_reviews_multi | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"text-classification",
"generated_from_trainer",
"dataset:amazon_reviews_multi",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us
| roberta-base-bne-finetuned-amazon\_reviews\_multi
=================================================
This model is a fine-tuned version of BSC-TeMU/roberta-base-bne on the amazon\_reviews\_multi dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3011
* Accuracy: 0.9185
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.9.2
* Pytorch 1.9.0+cu102
* Datasets 1.11.0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] | [
68,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.9.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] | [
-0.08839257806539536,
0.07182613760232925,
-0.0022761800792068243,
0.12354572117328644,
0.17461615800857544,
0.04664640501141548,
0.15001033246517181,
0.11295176297426224,
-0.0911908969283104,
0.0025587626732885838,
0.11353598535060883,
0.16285838186740875,
0.006882305257022381,
0.11342660337686539,
-0.06002412363886833,
-0.25545111298561096,
-0.014264321886003017,
0.04585016891360283,
-0.039443038403987885,
0.1466878354549408,
0.10804580897092819,
-0.13512179255485535,
0.09479020535945892,
0.007522543426603079,
-0.20025527477264404,
-0.01042194664478302,
0.02655612863600254,
-0.07337727397680283,
0.13639448583126068,
0.02944687008857727,
0.14236131310462952,
0.006021624431014061,
0.07734573632478714,
-0.18026873469352722,
0.01936332881450653,
0.035657964646816254,
0.0023094925563782454,
0.09061554074287415,
0.029501857236027718,
-0.01368747092783451,
0.13542525470256805,
-0.05290474370121956,
0.07828816026449203,
0.016513442620635033,
-0.11823423951864243,
-0.22217094898223877,
-0.08374817669391632,
0.0406045988202095,
0.05452033877372742,
0.09872910380363464,
-0.00793234258890152,
0.14786015450954437,
-0.0948602557182312,
0.08731918781995773,
0.23284532129764557,
-0.2814580202102661,
-0.07357364147901535,
0.030543820932507515,
0.029593121260404587,
0.0697181224822998,
-0.1090618371963501,
-0.03412232920527458,
0.05279349535703659,
0.053441669791936874,
0.12054288387298584,
-0.03979590907692909,
-0.10675445199012756,
0.01184848602861166,
-0.1420086920261383,
-0.023756710812449455,
0.19877062737941742,
0.038004450500011444,
-0.05136372894048691,
-0.04322843626141548,
-0.030508873984217644,
-0.14553587138652802,
-0.04248211160302162,
0.007587749976664782,
0.05101541429758072,
-0.06127399206161499,
-0.09615740925073624,
-0.0017856706399470568,
-0.1161680668592453,
-0.053532473742961884,
-0.07220099121332169,
0.1471410095691681,
0.03888067975640297,
0.020487187430262566,
-0.028637534007430077,
0.10526184737682343,
0.014859934337437153,
-0.10657451301813126,
0.00580232311040163,
0.0061612678691744804,
-0.007864749990403652,
-0.053524356335401535,
-0.061228446662425995,
-0.06704641878604889,
-0.00025649231974966824,
0.13949601352214813,
-0.027952156960964203,
0.026734979823231697,
0.0457291379570961,
0.05669664964079857,
-0.08014443516731262,
0.18844415247440338,
-0.02683238685131073,
-0.0008277795859612525,
-0.0005634169792756438,
0.050887711346149445,
0.025579452514648438,
-0.007942761294543743,
-0.13162744045257568,
0.005324666853994131,
0.073269784450531,
0.00650135288015008,
-0.08001315593719482,
0.05704151466488838,
-0.07255343347787857,
-0.04596323147416115,
0.005740107968449593,
-0.07239905744791031,
0.030041484162211418,
-0.007618363481014967,
-0.06846801191568375,
-0.01767161674797535,
0.020997317507863045,
0.021944206207990646,
-0.0005041505792178214,
0.15451928973197937,
-0.09127654135227203,
0.03210490942001343,
-0.08839957416057587,
-0.0920952633023262,
0.027586650103330612,
-0.08152618259191513,
0.0453820563852787,
-0.11449830234050751,
-0.17031645774841309,
-0.020580489188432693,
0.047850728034973145,
-0.02405575104057789,
-0.06681818515062332,
-0.03224606066942215,
-0.06519637256860733,
0.009773949161171913,
-0.014180172234773636,
0.14778795838356018,
-0.0733616054058075,
0.1075349748134613,
0.03468732163310051,
0.055587075650691986,
-0.047575678676366806,
0.044180043041706085,
-0.09771221876144409,
-0.01357980351895094,
-0.16308985650539398,
0.027985498309135437,
-0.04207569733262062,
0.05709175765514374,
-0.07737591117620468,
-0.11733967065811157,
0.011403121054172516,
0.008415933698415756,
0.04211335629224777,
0.08035637438297272,
-0.16377998888492584,
-0.07452040165662766,
0.13872405886650085,
-0.06883330643177032,
-0.12313717603683472,
0.12318543344736099,
-0.08148615807294846,
0.05949699878692627,
0.07269910722970963,
0.13917435705661774,
0.06339269131422043,
-0.07071439176797867,
0.015227603726089,
-0.014740873128175735,
0.03601967915892601,
-0.06450419872999191,
0.08131738007068634,
0.01597159542143345,
-0.007361931726336479,
0.028339942917227745,
-0.03421640396118164,
0.036996498703956604,
-0.09104132652282715,
-0.08855290710926056,
-0.038604896515607834,
-0.09525637328624725,
0.0626053586602211,
0.06858958303928375,
0.06863866001367569,
-0.11337462067604065,
-0.07577962428331375,
0.07028576731681824,
0.08421401679515839,
-0.04607420042157173,
0.01798451691865921,
-0.0526360459625721,
0.07684078812599182,
-0.03348326310515404,
-0.02640211209654808,
-0.1744549423456192,
-0.030968984588980675,
0.01694272831082344,
0.0011588186025619507,
0.03612899035215378,
0.01781797781586647,
0.052732884883880615,
0.043191660195589066,
-0.07270418852567673,
0.0006945625063963234,
-0.06212671101093292,
-0.008940636180341244,
-0.11370959877967834,
-0.20057447254657745,
-0.024644212797284126,
-0.01625540293753147,
0.1266949474811554,
-0.20963609218597412,
0.029023468494415283,
-0.02585691399872303,
0.0774892047047615,
0.0343538373708725,
-0.014041599817574024,
-0.025446679443120956,
0.08262673765420914,
-0.03468760848045349,
-0.031235182657837868,
0.06924164295196533,
0.010317608714103699,
-0.1047065407037735,
-0.011195463128387928,
-0.07097036391496658,
0.1770821213722229,
0.13403040170669556,
-0.10145552456378937,
-0.08873505145311356,
0.014581531286239624,
-0.05411935970187187,
-0.032035451382398605,
-0.08960950374603271,
0.037446752190589905,
0.1894320398569107,
-0.001424293965101242,
0.13721124827861786,
-0.08972306549549103,
-0.04917454347014427,
0.031534139066934586,
-0.03661871701478958,
0.03303784132003784,
0.13220469653606415,
0.13630959391593933,
-0.08093874901533127,
0.13200105726718903,
0.16006146371364594,
-0.08885109424591064,
0.14427129924297333,
-0.03932354226708412,
-0.06517161428928375,
-0.02269071713089943,
-0.042682647705078125,
-0.009029370732605457,
0.11519301682710648,
-0.13206270337104797,
0.001005477854050696,
0.03296366333961487,
0.00512603297829628,
0.007214987184852362,
-0.2274932861328125,
-0.04451654851436615,
0.03282438963651657,
-0.03509432449936867,
-0.014310409314930439,
0.006987050175666809,
0.017413591966032982,
0.10854659974575043,
0.004266231320798397,
-0.07451818138360977,
0.04589231312274933,
0.009620221331715584,
-0.08304597437381744,
0.21800756454467773,
-0.066200390458107,
-0.15829965472221375,
-0.13557088375091553,
-0.05871900916099548,
-0.04279159754514694,
-0.0002127130574081093,
0.06271582841873169,
-0.06532257795333862,
-0.03357158228754997,
-0.06688009947538376,
0.007139639463275671,
-0.008427230641245842,
0.02402581088244915,
-0.02371954545378685,
0.02462136000394821,
0.03626750782132149,
-0.10175847262144089,
-0.007257427554577589,
-0.05948714539408684,
-0.040552519261837006,
0.051101911813020706,
0.04928628355264664,
0.10782825946807861,
0.14850974082946777,
-0.024369776248931885,
-0.009392624720931053,
-0.03258702531456947,
0.21273957192897797,
-0.08677524328231812,
-0.04900782182812691,
0.13383613526821136,
-0.015798771753907204,
0.03315502405166626,
0.1263553649187088,
0.08046948909759521,
-0.08807558566331863,
0.015357403084635735,
0.024826738983392715,
-0.04052266851067543,
-0.26232755184173584,
-0.03683179244399071,
-0.05511616915464401,
-0.006190566346049309,
0.08051075786352158,
0.0241311676800251,
0.0021694828756153584,
0.07167551666498184,
0.041781045496463776,
0.07485855370759964,
-0.025474080815911293,
0.07378926873207092,
0.12447375059127808,
0.050249792635440826,
0.13252675533294678,
-0.04847250506281853,
-0.06300310045480728,
0.058082424104213715,
-0.01813790574669838,
0.2487623393535614,
0.008872135542333126,
0.13527928292751312,
0.0706343799829483,
0.12943999469280243,
0.02245587483048439,
0.054892994463443756,
0.023893367499113083,
-0.029607245698571205,
-0.021378204226493835,
-0.02561376243829727,
-0.027750996872782707,
0.027419930323958397,
-0.04982123151421547,
0.05130545422434807,
-0.1298668086528778,
-0.01679336652159691,
0.06025875732302666,
0.24364908039569855,
0.022235773503780365,
-0.3151286840438843,
-0.10580452531576157,
0.0057318564504384995,
-0.05056650564074516,
-0.004947665147483349,
0.02972867712378502,
0.07781769335269928,
-0.12178151309490204,
0.03902905806899071,
-0.07700997591018677,
0.09011957794427872,
-0.09081339836120605,
0.034281615167856216,
0.07659478485584259,
0.07316941022872925,
-0.0010426024673506618,
0.0765877440571785,
-0.2938985228538513,
0.27677902579307556,
-0.007408447097986937,
0.058628715574741364,
-0.06580454856157303,
-0.028945239260792732,
0.024182716384530067,
0.046879563480615616,
0.06133832409977913,
-0.0057052732445299625,
-0.0499798022210598,
-0.16795939207077026,
-0.038874950259923935,
0.021111713722348213,
0.07946537435054779,
-0.01819467358291149,
0.08480922877788544,
-0.02731863595545292,
0.0018539367010816932,
0.05661048740148544,
-0.024414394050836563,
-0.05033651366829872,
-0.09342178702354431,
-0.002592475851997733,
0.0223239716142416,
-0.05526675656437874,
-0.06214034929871559,
-0.1331361085176468,
-0.07076530903577805,
0.12564750015735626,
-0.02408377267420292,
-0.045267313718795776,
-0.09649199992418289,
0.07177935540676117,
0.08396106958389282,
-0.07546629011631012,
0.05155607685446739,
0.009259464219212532,
0.09000836312770844,
0.02413967438042164,
-0.0423247404396534,
0.09236237406730652,
-0.05351995304226875,
-0.19594109058380127,
-0.0695134773850441,
0.11535649001598358,
0.02461651712656021,
0.06681296974420547,
-0.022689655423164368,
0.009857146069407463,
-0.054549869149923325,
-0.09100735187530518,
0.018734870478510857,
-0.003288889303803444,
0.08391799032688141,
0.049050234258174896,
-0.0489177368581295,
0.007831727154552937,
-0.07117951661348343,
-0.056706368923187256,
0.18778429925441742,
0.21376314759254456,
-0.09329421073198318,
0.029958775267004967,
0.017582785338163376,
-0.06938750296831131,
-0.15946249663829803,
0.02635842002928257,
0.07167382538318634,
0.007606787607073784,
0.049025796353816986,
-0.14513596892356873,
0.12589283287525177,
0.10551708936691284,
-0.011555839329957962,
0.13003690540790558,
-0.3180287778377533,
-0.1359887421131134,
0.09173933416604996,
0.14565950632095337,
0.1344132423400879,
-0.12721838057041168,
-0.01620713621377945,
-0.03337448835372925,
-0.1434927135705948,
0.13366039097309113,
-0.08584384620189667,
0.1364767998456955,
-0.03698783740401268,
0.11113201826810837,
0.0016653684433549643,
-0.05280761793255806,
0.11385529488325119,
0.02592894807457924,
0.09923850744962692,
-0.05077749863266945,
-0.04492483660578728,
0.026064841076731682,
-0.02745537832379341,
0.0111097302287817,
-0.06845955550670624,
0.019015010446310043,
-0.08680183440446854,
-0.03610456734895706,
-0.07206900417804718,
0.03886820375919342,
-0.04073593392968178,
-0.05078645050525665,
-0.04444069415330887,
0.030555037781596184,
0.018375780433416367,
-0.02039724960923195,
0.1484956592321396,
0.01820829138159752,
0.1338973343372345,
0.06009873002767563,
0.09212946891784668,
-0.05526048317551613,
-0.1024155467748642,
-0.03391028940677643,
-0.027620552107691765,
0.047257598489522934,
-0.15652939677238464,
0.020483549684286118,
0.13507112860679626,
0.021976826712489128,
0.14905355870723724,
0.07828952372074127,
-0.029032956808805466,
0.016853313893079758,
0.06926754862070084,
-0.15163806080818176,
-0.0738796666264534,
-0.008418911136686802,
-0.07805860042572021,
-0.1190294697880745,
0.03821137174963951,
0.11247079074382782,
-0.07463440299034119,
-0.026338841766119003,
-0.006729671731591225,
0.004345289431512356,
-0.049187589436769485,
0.18215006589889526,
0.06892719864845276,
0.04812243953347206,
-0.10142429918050766,
0.0933762788772583,
0.061462368816137314,
-0.06827600300312042,
-0.0026664866600185633,
0.06169760227203369,
-0.09405539184808731,
-0.05358447879552841,
0.049001287668943405,
0.16789215803146362,
-0.08617469668388367,
-0.04443615302443504,
-0.143683061003685,
-0.12559135258197784,
0.08159831911325455,
0.15513025224208832,
0.11905217170715332,
0.014457736164331436,
-0.04335629194974899,
-0.009346491657197475,
-0.10294150561094284,
0.09983213990926743,
0.060858018696308136,
0.07419072091579437,
-0.14768460392951965,
0.10154380649328232,
0.027277423068881035,
0.04360850900411606,
-0.019121339544653893,
0.04284290596842766,
-0.10512181371450424,
0.013606070540845394,
-0.11018235236406326,
-0.0012156040174886584,
-0.022729715332388878,
0.018245328217744827,
-0.0013282198924571276,
-0.057650238275527954,
-0.0670996755361557,
0.01077970676124096,
-0.11572179198265076,
-0.01603565365076065,
0.04192126542329788,
0.07535851746797562,
-0.09438582509756088,
-0.037175629287958145,
0.03404359519481659,
-0.052279144525527954,
0.07377017289400101,
0.050573885440826416,
0.017306119203567505,
0.06209361553192139,
-0.12140700221061707,
0.02970919944345951,
0.05461839586496353,
0.017269639298319817,
0.05045250430703163,
-0.1242198497056961,
0.006522221490740776,
0.0000018370383259025402,
0.07131925970315933,
0.02513773925602436,
0.06461779028177261,
-0.1604229062795639,
-0.009764639660716057,
-0.003662751754745841,
-0.0821664109826088,
-0.055215656757354736,
0.012744735926389694,
0.0714007094502449,
0.027881667017936707,
0.21745003759860992,
-0.07637201249599457,
0.042003780603408813,
-0.19237551093101501,
0.006812606006860733,
-0.02344665303826332,
-0.1202234998345375,
-0.14560888707637787,
-0.07072530686855316,
0.05221022292971611,
-0.06840072572231293,
0.1745544672012329,
0.03515541926026344,
0.06575862318277359,
0.02939472906291485,
-0.003071068087592721,
-0.005330778658390045,
0.017776764929294586,
0.175225630402565,
0.02009904757142067,
-0.04377831146121025,
0.05904346704483032,
0.03875716030597687,
0.10548652708530426,
0.08948742598295212,
0.19184395670890808,
0.16831094026565552,
0.017398543655872345,
0.0857391357421875,
0.0444527305662632,
-0.030409757047891617,
-0.12576141953468323,
0.038116104900836945,
-0.014349998906254768,
0.10774760693311691,
-0.021594448015093803,
0.20390798151493073,
0.0668371319770813,
-0.16153213381767273,
0.039882175624370575,
-0.05661224201321602,
-0.08454542607069016,
-0.10485400259494781,
-0.043036311864852905,
-0.0969577506184578,
-0.14792025089263916,
0.012747039087116718,
-0.12602730095386505,
-0.0025759951677173376,
0.09347107261419296,
0.007022733800113201,
-0.03623494133353233,
0.11370430141687393,
0.013057163916528225,
0.013921258971095085,
0.09285126626491547,
0.015326525084674358,
-0.03665336221456528,
-0.09829901903867722,
-0.057606589049100876,
-0.03608478233218193,
-0.0286263395100832,
0.018853338435292244,
-0.05779462680220604,
-0.06483300775289536,
0.02068742923438549,
-0.018269112333655357,
-0.10196197777986526,
0.021494904533028603,
0.018392525613307953,
0.07963047176599503,
0.03440818935632706,
0.008001413196325302,
0.021576616913080215,
-0.00031698698876425624,
0.2469176948070526,
-0.06073731184005737,
-0.060195937752723694,
-0.11570291966199875,
0.22735150158405304,
0.040384162217378616,
-0.031827230006456375,
0.039677415043115616,
-0.06892390549182892,
0.007893211208283901,
0.2468440979719162,
0.21485458314418793,
-0.07445509731769562,
-0.013485186733305454,
0.024290332570672035,
-0.009282685816287994,
-0.007391860242933035,
0.11239375919103622,
0.1088830828666687,
0.031433798372745514,
-0.07557128369808197,
-0.041091497987508774,
-0.05275516211986542,
0.004576195031404495,
-0.015439008362591267,
0.06644943356513977,
0.04649226740002632,
-0.005012878682464361,
-0.04912477731704712,
0.07442872226238251,
-0.0883321613073349,
-0.12225606292486191,
0.06367437541484833,
-0.2119854986667633,
-0.17375734448432922,
-0.014436600729823112,
0.08413331210613251,
0.00803092960268259,
0.06735686957836151,
-0.025530150160193443,
-0.021362731233239174,
0.0744236633181572,
-0.012836866080760956,
-0.114098459482193,
-0.08732326328754425,
0.09466124325990677,
-0.08653917908668518,
0.1943945586681366,
-0.05068974196910858,
0.07110700756311417,
0.12006217241287231,
0.06953105330467224,
-0.07235819101333618,
0.06555058807134628,
0.04349931702017784,
-0.047000233083963394,
0.04627872630953789,
0.10372570902109146,
-0.03064076416194439,
0.07082284241914749,
0.05118507519364357,
-0.12489072233438492,
0.017214177176356316,
-0.0750071182847023,
-0.04503636062145233,
-0.06335513293743134,
-0.013619808480143547,
-0.07373028993606567,
0.12813083827495575,
0.23437751829624176,
-0.04066196456551552,
-0.017478231340646744,
-0.05611218884587288,
0.027619527652859688,
0.07173090428113937,
0.03138094022870064,
-0.05022638663649559,
-0.2193852812051773,
0.00860568042844534,
0.0810139924287796,
-0.013936986215412617,
-0.2563696801662445,
-0.07344834506511688,
0.00805171113461256,
-0.07237789034843445,
-0.07316054403781891,
0.08043372631072998,
0.07341767102479935,
0.045031316578388214,
-0.06273717433214188,
-0.05018014833331108,
-0.0692606270313263,
0.14781716465950012,
-0.15979093313217163,
-0.09752274304628372
] |
null | null | null |
# ***LegalNLP*** - Natural Language Processing Methods for the Brazilian Legal Language ⚖️
### The library of Natural Language Processing for Brazilian legal language, *LegalNLP*, was born in a partnership between Brazilian researchers and the legal tech [Tikal Tech](https://www.tikal.tech) based in São Paulo, Brazil. Besides containing pre-trained language models for the Brazilian legal language, ***LegalNLP*** provides functions that can facilitate the manipulation of legal texts in Portuguese and demonstration/tutorials to help people in their own work.
You can access our paper by clicking [**here**](https://arxiv.org/abs/2110.15709).
If you use our library in your academic work, please cite us in the following way
```
@article{polo2021legalnlp,
  title={LegalNLP--Natural Language Processing methods for the Brazilian Legal Language},
  author={Polo, Felipe Maia and Mendon{\c{c}}a, Gabriel Caiaffa Floriano and Parreira, Kau{\^e} Capellato J and Gianvechio, Lucka and Cordeiro, Peterson and Ferreira, Jonathan Batista and de Lima, Leticia Maria Paz and Maia, Ant{\^o}nio Carlos do Amaral and Vicente, Renato},
  journal={arXiv preprint arXiv:2110.15709},
  year={2021}
}
```
--------------
## Summary
0. [Accessing the Language Models](#0)
1. [ Introduction / Installing package](#1)
2. [ Language Models (Details / How to use)](#2)
1. [ Word2Vec/Doc2Vec ](#2.1)
3. [ Demonstrations / Tutorials](#3)
4. [ References](#4)
--------------
<a name="0"></a>
## 0\. Accessing the Language Models
All our models can be found [here](https://drive.google.com/drive/folders/1tCccOXPLSEAEUQtcWXvED3YaNJi3p7la?usp=sharing).
Please contact *[email protected]* if you have any problem accessing the language models.
--------------
<a name="1"></a>
## 1\. Introduction / Installing package
*LegalNLP* is promising given the scarcity of Natural Language Processing resources focused on the Brazilian legal language. It is worth mentioning that our library was made for Python, one of the most well-known programming languages for machine learning.
You first need to install the HuggingFaceHub library by running the following command in a terminal:
```sh
$ pip install huggingface_hub
```
Import `hf_hub_download`:
```python
from huggingface_hub import hf_hub_download
```
And then you can download our Word2Vec(SG)/Doc2Vec(DBOW) and Word2Vec(CBOW)/Doc2Vec(DM) by the following commands:
```python
w2v_sg_d2v_dbow = hf_hub_download(repo_id = "Projeto/LegalNLP", filename = "w2v_d2v_dbow_size_100_window_15_epochs_20")
w2v_cbow_d2v_dm = hf_hub_download(repo_id = "Projeto/LegalNLP", filename = "w2v_d2v_dm_size_100_window_15_epochs_20")
```
--------------
<a name="2"></a>
## 2\. Language Models (Details / How to use)
<a name="2.1"></a>
### 2.1\. Word2Vec/Doc2Vec
Our first models for generating vector representations of tokens and
texts (embeddings) are variations of the Word2Vec [1, 2] and Doc2Vec [3] methods.
In short, the Word2Vec methods generate embeddings for tokens that somehow capture
the meaning of the various textual elements, based on the contexts in which these
elements appear. The Doc2Vec methods are extensions/modifications of Word2Vec
for generating representations of whole texts.
Remember to at least make all letters lowercase. Please check our paper or [Gensim page](https://radimrehurek.com/gensim_3.8.3/models/doc2vec.html) for more details. Preferably use Gensim version 3.8.3.
Below we have a summary table with some important information about the trained models:
| Filenames | Doc2Vec | Word2Vec | Size | Window size |
|:-------------------:|:--------------:|:--------------:|:--------------:|:--------------:|
| ```w2v_d2v_dm*``` | Distributed Memory (DM) | Continuous Bag-of-Words (CBOW) | 100, 200, 300 | 15 |
| ```w2v_d2v_dbow*``` | Distributed Bag-of-Words (DBOW) | Skip-Gram (SG) | 100, 200, 300 | 15 |
Here we make available both models with size 100 and window 15.
#### Using *Word2Vec*
Installing Gensim
```python
!pip install gensim=='3.8.3'
```
Loading W2V:
```python
from gensim.models import KeyedVectors
#Loading a W2V model
w2v=KeyedVectors.load(w2v_cbow_d2v_dm)
w2v=w2v.wv
```
Viewing the first 10 entries of 'juiz' vector
```python
w2v['juiz'][:10]
```
array([ 6.570131 , -1.262787 , 5.156106 , -8.943866 , -5.884408 ,
-7.717058 , 1.8819941 , -8.02803 , -0.66901577, 6.7223144 ],
dtype=float32)
Viewing closest tokens to 'juiz'
```python
w2v.most_similar('juiz')
```
[('juíza', 0.8210258483886719),
('juiza', 0.7306275367736816),
('juíz', 0.691645085811615),
('juízo', 0.6605231165885925),
('magistrado', 0.6213295459747314),
('mmª_juíza', 0.5510469675064087),
('juizo', 0.5494943261146545),
('desembargador', 0.5313084721565247),
('mmjuiz', 0.5277603268623352),
('fabíola_melo_feijão_juíza', 0.5043971538543701)]
#### Using *Doc2Vec*
Installing Gensim
```python
!pip install gensim=='3.8.3'
```
Loading D2V
```python
from gensim.models import Doc2Vec
#Loading a D2V model
d2v=Doc2Vec.load(w2v_cbow_d2v_dm)
```
Inferring vector for a text
```python
txt='direito do consumidor origem : bangu regional xxix juizado especial civel ação : [processo] - - recte : fundo de investimento em direitos creditórios'
tokens=txt.split()
txt_vec=d2v.infer_vector(tokens, epochs=20)
txt_vec[:10]
```
array([ 0.02626514, -0.3876521 , -0.24873355, -0.0318402 , 0.3343679 ,
-0.21307918, 0.07193747, 0.02030687, 0.407305 , 0.20065512],
dtype=float32)
--------------
<a name="4"></a>
## 4\. Demonstrations
For a better understanding of the application of these models, below are the links to notebooks where we apply them to a legal dataset using various classification models such as Logistic Regression and CatBoost:
- **BERT notebook**: [Open in Colab](https://colab.research.google.com/github/felipemaiapolo/legalnlp/blob/main/demo/BERT/BERT_TUTORIAL.ipynb)
- **Word2Vec notebook**: [Open in Colab](https://colab.research.google.com/github/felipemaiapolo/legalnlp/blob/main/demo/Word2Vec/Word2Vec_TUTORIAL.ipynb)
- **Doc2Vec notebook**: [Open in Colab](https://colab.research.google.com/github/felipemaiapolo/legalnlp/blob/main/demo/Doc2Vec/Doc2Vec_TUTORIAL.ipynb)
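As a minimal, self-contained sketch of the kind of pipeline used in those notebooks, documents can be represented by inferred Doc2Vec vectors and fed to a scikit-learn classifier. The texts and labels below are placeholders, and `w2v_cbow_d2v_dm` is the model path obtained with `hf_hub_download` above:

```python
from gensim.models import Doc2Vec
from sklearn.linear_model import LogisticRegression

# Placeholder labelled corpus: lowercase legal snippets with binary labels.
texts = [
    "direito do consumidor origem bangu regional juizado especial civel",
    "execucao fiscal municipio cobranca de iptu exercicio de 2015",
]
labels = [0, 1]

d2v = Doc2Vec.load(w2v_cbow_d2v_dm)  # path returned by hf_hub_download above

# Represent each document by its inferred Doc2Vec vector.
X = [d2v.infer_vector(t.lower().split(), epochs=20) for t in texts]

clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X))
```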
--------------
<a name="5"></a>
## 5\. References
[1] Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., and Dean, J. (2013b).
Distributed representations of words and phrases and their compositionality.
In Advances in neural information processing systems, pages 3111–3119.
[2] Mikolov, T., Chen, K., Corrado, G., and Dean, J. (2013a). Efficient estimation of
word representations in vector space. arXiv preprint arXiv:1301.3781.
[3] Le, Q. and Mikolov, T. (2014). Distributed representations of sentences and
documents. In International conference on machine learning, pages 1188–1196.
PMLR.
[4] Bojanowski, P., Grave, E., Joulin, A., and Mikolov, T. (2017). Enriching
word vectors with subword information. Transactions of the Association for
Computational Linguistics, 5:135–146.
[5] Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2018). Bert: Pre-training
of deep bidirectional transformers for language understanding. arXiv preprint
arXiv:1810.04805.
[6] Souza, F., Nogueira, R., and Lotufo, R. (2020). BERTimbau: pretrained BERT
models for Brazilian Portuguese. In 9th Brazilian Conference on Intelligent
Systems, BRACIS, Rio Grande do Sul, Brazil, October 20-23
| {"language": "pt-br", "license": "mit", "tags": ["LegalNLP", "NLP", "legal field", "python", "word2vec", "doc2vec"]} | null | Projeto/LegalNLP | [
"LegalNLP",
"NLP",
"legal field",
"python",
"word2vec",
"doc2vec",
"arxiv:2110.15709",
"license:mit",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"2110.15709"
] | [
"pt-br"
] | TAGS
#LegalNLP #NLP #legal field #python #word2vec #doc2vec #arxiv-2110.15709 #license-mit #region-us
| *LegalNLP* - Natural Language Processing Methods for the Brazilian Legal Language ️
===================================================================================
### The library of Natural Language Processing for Brazilian legal language, *LegalNLP*, was born in a partnership between Brazilian researchers and the legal tech Tikal Tech based in São Paulo, Brazil. Besides containing pre-trained language models for the Brazilian legal language, *LegalNLP* provides functions that can facilitate the manipulation of legal texts in Portuguese and demonstration/tutorials to help people in their own work.
You can access our paper by clicking here.
If you use our library in your academic work, please cite us in the following way
```
@article{polo2021legalnlp,
title={LegalNLP--Natural Language Processing methods for the Brazilian Legal Language},
author={Polo, Felipe Maia and Mendon{\c{c}}a, Gabriel Caiaffa Floriano and Parreira, Kau{\^e} Capellato J and Gianvechio, Lucka and Cordeiro, Peterson and Ferreira, Jonathan Batista and de Lima, Leticia Maria Paz and Maia, Ant{\^o}nio Carlos do Amaral and Vicente, Renato},
journal={arXiv preprint arXiv:2110.15709},
year={2021}
}
```
---
Summary
-------
0. Accessing the Language Models
1. Introduction / Installing package
2. Language Models (Details / How to use)
1. Word2Vec/Doc2Vec
3. Demonstrations / Tutorials
4. References
---
0. Accessing the Language Models
--------------------------------
All our models can be found here.
Please contact *felipemaiapolo@URL* if you have any problem accessing the language models.
---
1. Introduction / Installing package
------------------------------------
*LegalNLP* is promising given the scarcity of Natural Language Processing resources focused on the Brazilian legal language. It is worth mentioning that our library was made for Python, one of the most well-known programming languages for machine learning.
You first need to install the HuggingFaceHub library running the following command on terminal
Import 'hf\_hub\_download':
And then you can download our Word2Vec(SG)/Doc2Vec(DBOW) and Word2Vec(CBOW)/Doc2Vec(DM) by the following commands:
---
2. Model Languages
------------------
### 3.2. Word2Vec/Doc2Vec
Our first models for generating vector representation for tokens and
texts (embeddings) are variations of the Word2Vec [1,
2] and Doc2Vec [3] methods. In short, the
Word2Vec methods generate embeddings for tokens5 and that somehow capture
the meaning of the various textual elements, based on the contexts in which these
elements appear. Doc2Vec methods are extensions/modifications of Word2Vec
for generating whole text representations.
Remember to at least make all letters lowercase. Please check our paper or Gensim page for more details. Preferably use Gensim version 3.8.3.
Below we have a summary table with some important information about the trained models:
Here we made available both models with 100 size and 15 window.
#### Using *Word2Vec*
Installing Gensim
Loading W2V:
Viewing the first 10 entries of 'juiz' vector
```
array([ 6.570131 , -1.262787 , 5.156106 , -8.943866 , -5.884408 ,
-7.717058 , 1.8819941 , -8.02803 , -0.66901577, 6.7223144 ],
dtype=float32)
```
Viewing closest tokens to 'juiz'
```
[('juíza', 0.8210258483886719),
('juiza', 0.7306275367736816),
('juíz', 0.691645085811615),
('juízo', 0.6605231165885925),
('magistrado', 0.6213295459747314),
('mmª_juíza', 0.5510469675064087),
('juizo', 0.5494943261146545),
('desembargador', 0.5313084721565247),
('mmjuiz', 0.5277603268623352),
('fabíola_melo_feijão_juíza', 0.5043971538543701)]
```
#### Using *Doc2Vec*
Installing Gensim
Loading D2V
Inferring vector for a text
```
array([ 0.02626514, -0.3876521 , -0.24873355, -0.0318402 , 0.3343679 ,
-0.21307918, 0.07193747, 0.02030687, 0.407305 , 0.20065512],
dtype=float32)
```
---
4. Demonstrations
-----------------
For a better understanding of the application of these models, below are the links to notebooks where we apply them to a legal dataset using various classification models such as Logistic Regression and CatBoost:
* BERT notebook :
.
Distributed representations of words and phrases and their compositionality.
In Advances in neural information processing systems, pages 3111–3119.
[2] Mikolov, T., Chen, K., Corrado, G., and Dean, J. (2013a). Efficient estimation of
word representations in vector space. arXiv preprint arXiv:1301.3781.
[3] Le, Q. and Mikolov, T. (2014). Distributed representations of sentences and
documents. In International conference on machine learning, pages 1188–1196.
PMLR.
[4] Bojanowski, P., Grave, E., Joulin, A., and Mikolov, T. (2017). Enriching
word vectors with subword information. Transactions of the Association for
Computational Linguistics, 5:135–146.
[5] Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2018). Bert: Pre-training
of deep bidirectional transformers for language understanding. arXiv preprint
arXiv:1810.04805.
[6] Souza, F., Nogueira, R., and Lotufo, R. (2020). BERTimbau: pretrained BERT
models for Brazilian Portuguese. In 9th Brazilian Conference on Intelligent
Systems, BRACIS, Rio Grande do Sul, Brazil, October 20-23
| [
"### The library of Natural Language Processing for Brazilian legal language, *LegalNLP*, was born in a partnership between Brazilian researchers and the legal tech Tikal Tech based in São Paulo, Brazil. Besides containing pre-trained language models for the Brazilian legal language, *LegalNLP* provides functions that can facilitate the manipulation of legal texts in Portuguese and demonstration/tutorials to help people in their own work.\n\n\nYou can access our paper by clicking here.\n\n\nIf you use our library in your academic work, please cite us in the following way\n\n\n\n```\n@article{polo2021legalnlp,\n title={LegalNLP--Natural Language Processing methods for the Brazilian Legal Language},\n author={Polo, Felipe Maia and Mendon{\\c{c}}a, Gabriel Caiaffa Floriano and Parreira, Kau{\\^e} Capellato J and Gianvechio, Lucka and Cordeiro, Peterson and Ferreira, Jonathan Batista and de Lima, Leticia Maria Paz and Maia, Ant{\\^o}nio Carlos do Amaral and Vicente, Renato},\n journal={arXiv preprint arXiv:2110.15709},\n year={2021}\n}\n\n```\n\n\n\n---\n\n\nSummary\n-------\n\n\n0. Accessing the Language Models\n1. Introduction / Installing package\n2. Language Models (Details / How to use)\n\t1. Word2Vec/Doc2Vec\n3. Demonstrations / Tutorials\n4. References\n\n\n\n\n---\n\n\n\n0. Accessing the Language Models\n--------------------------------\n\n\nAll our models can be found here.\n\n\nPlease contact *felipemaiapolo@URL* if you have any problem accessing the language models.\n\n\n\n\n---\n\n\n\n1. Introduction / Installing package\n------------------------------------\n\n\n*LegalNLP* is promising given the scarcity of Natural Language Processing resources focused on the Brazilian legal language. It is worth mentioning that our library was made for Python, one of the most well-known programming languages for machine learning.\n\n\nYou first need to install the HuggingFaceHub library running the following command on terminal\n\n\nImport 'hf\\_hub\\_download':\n\n\nAnd then you can download our Word2Vec(SG)/Doc2Vec(DBOW) and Word2Vec(CBOW)/Doc2Vec(DM) by the following commands:\n\n\n\n\n---\n\n\n\n2. Model Languages\n------------------",
"### 3.2. Word2Vec/Doc2Vec\n\n\nOur first models for generating vector representation for tokens and\ntexts (embeddings) are variations of the Word2Vec [1,\n2] and Doc2Vec [3] methods. In short, the\nWord2Vec methods generate embeddings for tokens5 and that somehow capture\nthe meaning of the various textual elements, based on the contexts in which these\nelements appear. Doc2Vec methods are extensions/modifications of Word2Vec\nfor generating whole text representations.\n\n\nRemember to at least make all letters lowercase. Please check our paper or Gensim page for more details. Preferably use Gensim version 3.8.3.\n\n\nBelow we have a summary table with some important information about the trained models:\n\n\n\nHere we made available both models with 100 size and 15 window.",
"#### Using *Word2Vec*\n\n\nInstalling Gensim\n\n\nLoading W2V:\n\n\nViewing the first 10 entries of 'juiz' vector\n\n\n\n```\narray([ 6.570131 , -1.262787 , 5.156106 , -8.943866 , -5.884408 ,\n -7.717058 , 1.8819941 , -8.02803 , -0.66901577, 6.7223144 ],\n dtype=float32)\n\n```\n\nViewing closest tokens to 'juiz'\n\n\n\n```\n[('juíza', 0.8210258483886719),\n ('juiza', 0.7306275367736816),\n ('juíz', 0.691645085811615),\n ('juízo', 0.6605231165885925),\n ('magistrado', 0.6213295459747314),\n ('mmª_juíza', 0.5510469675064087),\n ('juizo', 0.5494943261146545),\n ('desembargador', 0.5313084721565247),\n ('mmjuiz', 0.5277603268623352),\n ('fabíola_melo_feijão_juíza', 0.5043971538543701)]\n\n```",
"#### Using *Doc2Vec*\n\n\nInstalling Gensim\n\n\nLoading D2V\n\n\nInferring vector for a text\n\n\n\n```\narray([ 0.02626514, -0.3876521 , -0.24873355, -0.0318402 , 0.3343679 ,\n -0.21307918, 0.07193747, 0.02030687, 0.407305 , 0.20065512],\n dtype=float32)\n\n```\n\n\n\n---\n\n\n\n4. Demonstrations\n-----------------\n\n\nFor a better understanding of the application of these models, below are the links to notebooks where we apply them to a legal dataset using various classification models such as Logistic Regression and CatBoost:\n\n\n* BERT notebook :\n.\nDistributed representations of words and phrases and their compositionality.\nIn Advances in neural information processing systems, pages 3111–3119.\n\n\n[2] Mikolov, T., Chen, K., Corrado, G., and Dean, J. (2013a). Efficient estimation of\nword representations in vector space. arXiv preprint arXiv:1301.3781.\n\n\n[3] Le, Q. and Mikolov, T. (2014). Distributed representations of sentences and\ndocuments. In International conference on machine learning, pages 1188–1196.\nPMLR.\n\n\n[4] Bojanowski, P., Grave, E., Joulin, A., and Mikolov, T. (2017). Enriching\nword vectors with subword information. Transactions of the Association for\nComputational Linguistics, 5:135–146.\n\n\n[5] Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2018). Bert: Pre-training\nof deep bidirectional transformers for language understanding. arXiv preprint\narXiv:1810.04805.\n\n\n[6] Souza, F., Nogueira, R., and Lotufo, R. (2020). BERTimbau: pretrained BERT\nmodels for Brazilian Portuguese. In 9th Brazilian Conference on Intelligent\nSystems, BRACIS, Rio Grande do Sul, Brazil, October 20-23"
] | [
"TAGS\n#LegalNLP #NLP #legal field #python #word2vec #doc2vec #arxiv-2110.15709 #license-mit #region-us \n",
"### The library of Natural Language Processing for Brazilian legal language, *LegalNLP*, was born in a partnership between Brazilian researchers and the legal tech Tikal Tech based in São Paulo, Brazil. Besides containing pre-trained language models for the Brazilian legal language, *LegalNLP* provides functions that can facilitate the manipulation of legal texts in Portuguese and demonstration/tutorials to help people in their own work.\n\n\nYou can access our paper by clicking here.\n\n\nIf you use our library in your academic work, please cite us in the following way\n\n\n\n```\n@article{polo2021legalnlp,\n title={LegalNLP--Natural Language Processing methods for the Brazilian Legal Language},\n author={Polo, Felipe Maia and Mendon{\\c{c}}a, Gabriel Caiaffa Floriano and Parreira, Kau{\\^e} Capellato J and Gianvechio, Lucka and Cordeiro, Peterson and Ferreira, Jonathan Batista and de Lima, Leticia Maria Paz and Maia, Ant{\\^o}nio Carlos do Amaral and Vicente, Renato},\n journal={arXiv preprint arXiv:2110.15709},\n year={2021}\n}\n\n```\n\n\n\n---\n\n\nSummary\n-------\n\n\n0. Accessing the Language Models\n1. Introduction / Installing package\n2. Language Models (Details / How to use)\n\t1. Word2Vec/Doc2Vec\n3. Demonstrations / Tutorials\n4. References\n\n\n\n\n---\n\n\n\n0. Accessing the Language Models\n--------------------------------\n\n\nAll our models can be found here.\n\n\nPlease contact *felipemaiapolo@URL* if you have any problem accessing the language models.\n\n\n\n\n---\n\n\n\n1. Introduction / Installing package\n------------------------------------\n\n\n*LegalNLP* is promising given the scarcity of Natural Language Processing resources focused on the Brazilian legal language. It is worth mentioning that our library was made for Python, one of the most well-known programming languages for machine learning.\n\n\nYou first need to install the HuggingFaceHub library running the following command on terminal\n\n\nImport 'hf\\_hub\\_download':\n\n\nAnd then you can download our Word2Vec(SG)/Doc2Vec(DBOW) and Word2Vec(CBOW)/Doc2Vec(DM) by the following commands:\n\n\n\n\n---\n\n\n\n2. Model Languages\n------------------",
"### 3.2. Word2Vec/Doc2Vec\n\n\nOur first models for generating vector representation for tokens and\ntexts (embeddings) are variations of the Word2Vec [1,\n2] and Doc2Vec [3] methods. In short, the\nWord2Vec methods generate embeddings for tokens5 and that somehow capture\nthe meaning of the various textual elements, based on the contexts in which these\nelements appear. Doc2Vec methods are extensions/modifications of Word2Vec\nfor generating whole text representations.\n\n\nRemember to at least make all letters lowercase. Please check our paper or Gensim page for more details. Preferably use Gensim version 3.8.3.\n\n\nBelow we have a summary table with some important information about the trained models:\n\n\n\nHere we made available both models with 100 size and 15 window.",
"#### Using *Word2Vec*\n\n\nInstalling Gensim\n\n\nLoading W2V:\n\n\nViewing the first 10 entries of 'juiz' vector\n\n\n\n```\narray([ 6.570131 , -1.262787 , 5.156106 , -8.943866 , -5.884408 ,\n -7.717058 , 1.8819941 , -8.02803 , -0.66901577, 6.7223144 ],\n dtype=float32)\n\n```\n\nViewing closest tokens to 'juiz'\n\n\n\n```\n[('juíza', 0.8210258483886719),\n ('juiza', 0.7306275367736816),\n ('juíz', 0.691645085811615),\n ('juízo', 0.6605231165885925),\n ('magistrado', 0.6213295459747314),\n ('mmª_juíza', 0.5510469675064087),\n ('juizo', 0.5494943261146545),\n ('desembargador', 0.5313084721565247),\n ('mmjuiz', 0.5277603268623352),\n ('fabíola_melo_feijão_juíza', 0.5043971538543701)]\n\n```",
"#### Using *Doc2Vec*\n\n\nInstalling Gensim\n\n\nLoading D2V\n\n\nInferring vector for a text\n\n\n\n```\narray([ 0.02626514, -0.3876521 , -0.24873355, -0.0318402 , 0.3343679 ,\n -0.21307918, 0.07193747, 0.02030687, 0.407305 , 0.20065512],\n dtype=float32)\n\n```\n\n\n\n---\n\n\n\n4. Demonstrations\n-----------------\n\n\nFor a better understanding of the application of these models, below are the links to notebooks where we apply them to a legal dataset using various classification models such as Logistic Regression and CatBoost:\n\n\n* BERT notebook :\n.\nDistributed representations of words and phrases and their compositionality.\nIn Advances in neural information processing systems, pages 3111–3119.\n\n\n[2] Mikolov, T., Chen, K., Corrado, G., and Dean, J. (2013a). Efficient estimation of\nword representations in vector space. arXiv preprint arXiv:1301.3781.\n\n\n[3] Le, Q. and Mikolov, T. (2014). Distributed representations of sentences and\ndocuments. In International conference on machine learning, pages 1188–1196.\nPMLR.\n\n\n[4] Bojanowski, P., Grave, E., Joulin, A., and Mikolov, T. (2017). Enriching\nword vectors with subword information. Transactions of the Association for\nComputational Linguistics, 5:135–146.\n\n\n[5] Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2018). Bert: Pre-training\nof deep bidirectional transformers for language understanding. arXiv preprint\narXiv:1810.04805.\n\n\n[6] Souza, F., Nogueira, R., and Lotufo, R. (2020). BERTimbau: pretrained BERT\nmodels for Brazilian Portuguese. In 9th Brazilian Conference on Intelligent\nSystems, BRACIS, Rio Grande do Sul, Brazil, October 20-23"
] | [
42,
529,
184,
294,
547
] | [
"passage: TAGS\n#LegalNLP #NLP #legal field #python #word2vec #doc2vec #arxiv-2110.15709 #license-mit #region-us \n",
"passage: ### The library of Natural Language Processing for Brazilian legal language, *LegalNLP*, was born in a partnership between Brazilian researchers and the legal tech Tikal Tech based in São Paulo, Brazil. Besides containing pre-trained language models for the Brazilian legal language, *LegalNLP* provides functions that can facilitate the manipulation of legal texts in Portuguese and demonstration/tutorials to help people in their own work.\n\n\nYou can access our paper by clicking here.\n\n\nIf you use our library in your academic work, please cite us in the following way\n\n\n\n```\n@article{polo2021legalnlp,\n title={LegalNLP--Natural Language Processing methods for the Brazilian Legal Language},\n author={Polo, Felipe Maia and Mendon{\\c{c}}a, Gabriel Caiaffa Floriano and Parreira, Kau{\\^e} Capellato J and Gianvechio, Lucka and Cordeiro, Peterson and Ferreira, Jonathan Batista and de Lima, Leticia Maria Paz and Maia, Ant{\\^o}nio Carlos do Amaral and Vicente, Renato},\n journal={arXiv preprint arXiv:2110.15709},\n year={2021}\n}\n\n```\n\n\n\n---\n\n\nSummary\n-------\n\n\n0. Accessing the Language Models\n1. Introduction / Installing package\n2. Language Models (Details / How to use)\n\t1. Word2Vec/Doc2Vec\n3. Demonstrations / Tutorials\n4. References\n\n\n\n\n---\n\n\n\n0. Accessing the Language Models\n--------------------------------\n\n\nAll our models can be found here.\n\n\nPlease contact *felipemaiapolo@URL* if you have any problem accessing the language models.\n\n\n\n\n---\n\n\n\n1. Introduction / Installing package\n------------------------------------\n\n\n*LegalNLP* is promising given the scarcity of Natural Language Processing resources focused on the Brazilian legal language. It is worth mentioning that our library was made for Python, one of the most well-known programming languages for machine learning.\n\n\nYou first need to install the HuggingFaceHub library running the following command on terminal\n\n\nImport 'hf\\_hub\\_download':\n\n\nAnd then you can download our Word2Vec(SG)/Doc2Vec(DBOW) and Word2Vec(CBOW)/Doc2Vec(DM) by the following commands:\n\n\n\n\n---\n\n\n\n2. Model Languages\n------------------### 3.2. Word2Vec/Doc2Vec\n\n\nOur first models for generating vector representation for tokens and\ntexts (embeddings) are variations of the Word2Vec [1,\n2] and Doc2Vec [3] methods. In short, the\nWord2Vec methods generate embeddings for tokens5 and that somehow capture\nthe meaning of the various textual elements, based on the contexts in which these\nelements appear. Doc2Vec methods are extensions/modifications of Word2Vec\nfor generating whole text representations.\n\n\nRemember to at least make all letters lowercase. Please check our paper or Gensim page for more details. 
Preferably use Gensim version 3.8.3.\n\n\nBelow we have a summary table with some important information about the trained models:\n\n\n\nHere we made available both models with 100 size and 15 window.#### Using *Word2Vec*\n\n\nInstalling Gensim\n\n\nLoading W2V:\n\n\nViewing the first 10 entries of 'juiz' vector\n\n\n\n```\narray([ 6.570131 , -1.262787 , 5.156106 , -8.943866 , -5.884408 ,\n -7.717058 , 1.8819941 , -8.02803 , -0.66901577, 6.7223144 ],\n dtype=float32)\n\n```\n\nViewing closest tokens to 'juiz'\n\n\n\n```\n[('juíza', 0.8210258483886719),\n ('juiza', 0.7306275367736816),\n ('juíz', 0.691645085811615),\n ('juízo', 0.6605231165885925),\n ('magistrado', 0.6213295459747314),\n ('mmª_juíza', 0.5510469675064087),\n ('juizo', 0.5494943261146545),\n ('desembargador', 0.5313084721565247),\n ('mmjuiz', 0.5277603268623352),\n ('fabíola_melo_feijão_juíza', 0.5043971538543701)]\n\n```"
] | [
-0.005099405534565449,
0.17003193497657776,
-0.009411917999386787,
0.03977297246456146,
0.020594868808984756,
-0.023516014218330383,
0.05506087839603424,
0.10594905912876129,
0.06456539034843445,
0.025498123839497566,
0.10901442915201187,
0.08782448619604111,
0.057633958756923676,
0.009699517861008644,
0.004995729774236679,
-0.21352030336856842,
0.013287583366036415,
-0.05109848082065582,
-0.03922003507614136,
0.053693052381277084,
0.06388223171234131,
-0.027598954737186432,
0.05180767923593521,
0.01075826957821846,
0.007699638605117798,
0.025283781811594963,
-0.013518065214157104,
-0.07194115221500397,
0.02768521010875702,
0.06612725555896759,
0.07439917325973511,
0.04076690226793289,
0.023526130244135857,
-0.1992468535900116,
0.00013787485659122467,
-0.03268299996852875,
-0.05137401819229126,
0.02587944269180298,
0.111906498670578,
-0.023570284247398376,
0.21348220109939575,
-0.07182923704385757,
0.003729008138179779,
0.05479824170470238,
-0.17287775874137878,
-0.13526460528373718,
-0.06636346131563187,
0.02094092220067978,
0.05257444456219673,
0.08437766134738922,
0.020808011293411255,
0.10647519677877426,
-0.03346722945570946,
0.027205072343349457,
0.10758055746555328,
-0.23559126257896423,
-0.02862466126680374,
0.1023971363902092,
0.06730928272008896,
0.01239502802491188,
0.009247494861483574,
0.006621355190873146,
0.03866938501596451,
-0.035677406936883926,
-0.07469693571329117,
-0.07761598378419876,
0.03915289044380188,
0.03035365603864193,
-0.09983743727207184,
-0.07257373631000519,
0.16290730237960815,
-0.00275995209813118,
-0.038223035633563995,
-0.008138949051499367,
-0.001055101864039898,
0.05854262039065361,
-0.0027392031624913216,
0.04401617497205734,
0.028272388502955437,
0.04472844675183296,
0.14733576774597168,
-0.04486105963587761,
-0.09142262488603592,
0.026111837476491928,
-0.060819901525974274,
0.12409508973360062,
0.0016532037407159805,
0.024496588855981827,
-0.050872839987277985,
-0.010860791429877281,
-0.0253766980022192,
-0.07333891093730927,
-0.005651404615491629,
-0.03710780292749405,
0.0798657163977623,
0.03370828926563263,
0.019760161638259888,
0.04336398467421532,
0.10505661368370056,
0.0692368745803833,
-0.027928801253437996,
0.039685558527708054,
-0.04786495864391327,
0.08962398022413254,
-0.007229967974126339,
0.06262299418449402,
0.022701993584632874,
0.030651822686195374,
0.003544669598340988,
-0.10918629914522171,
0.04601765424013138,
-0.013107113540172577,
-0.10564221441745758,
0.06558747589588165,
-0.1665244847536087,
0.13573014736175537,
0.08714559674263,
-0.0429929718375206,
-0.07013066112995148,
-0.0060340892523527145,
0.05984951928257942,
-0.06427887082099915,
0.040335189551115036,
0.029591679573059082,
0.0005566524341702461,
-0.03247685730457306,
-0.012111721560359001,
0.0378599651157856,
-0.06660900264978409,
0.048089541494846344,
-0.05418359488248825,
0.008855581283569336,
-0.04490666091442108,
-0.04003822058439255,
0.097307950258255,
0.02996143512427807,
0.030742986127734184,
-0.09546973556280136,
-0.05047021433711052,
-0.06984888017177582,
0.025631753727793694,
-0.06795579940080643,
-0.005763202905654907,
-0.006136354990303516,
-0.011488202959299088,
-0.030418604612350464,
-0.012826330959796906,
-0.09105724096298218,
-0.07562439143657684,
0.06682080775499344,
-0.03254947066307068,
0.0386011004447937,
-0.09696625918149948,
-0.023166095837950706,
-0.1053587794303894,
0.04548612982034683,
-0.028798118233680725,
0.016461800783872604,
-0.166974276304245,
0.07579967379570007,
-0.08643071353435516,
-0.029831474646925926,
-0.024732572957873344,
0.01871599070727825,
-0.00038440432399511337,
0.13760840892791748,
-0.1085796058177948,
-0.042421821504831314,
0.18062299489974976,
-0.09287195652723312,
-0.054032884538173676,
0.03405477851629257,
0.006545926444232464,
0.08185535669326782,
0.046879082918167114,
0.2700639069080353,
0.06990807503461838,
-0.13800610601902008,
0.05372999981045723,
0.019688501954078674,
-0.039978623390197754,
0.014644809067249298,
0.11337923258543015,
-0.12843438982963562,
-0.05874380096793175,
-0.009437710046768188,
-0.09834467619657516,
0.011671612970530987,
-0.008191239088773727,
-0.0764404833316803,
0.018537800759077072,
0.03738018870353699,
-0.019684897735714912,
0.01219848170876503,
-0.03383294492959976,
-0.054091889411211014,
-0.021668897941708565,
0.022997424006462097,
0.0300491563975811,
0.04219609498977661,
-0.030053794384002686,
-0.03522852063179016,
0.04435222968459129,
0.05701390653848648,
-0.030971288681030273,
-0.05176045745611191,
-0.08400538563728333,
0.02189047820866108,
-0.06381157040596008,
0.1395360231399536,
0.01655874401330948,
0.028000235557556152,
0.008124979212880135,
0.021831661462783813,
-0.0046766395680606365,
0.014403718523681164,
0.014382886700332165,
0.036053258925676346,
-0.12015549838542938,
0.06979981064796448,
0.004445591941475868,
0.02440940961241722,
-0.11524638533592224,
-0.014416308142244816,
0.09751792252063751,
0.05518518388271332,
0.010247670114040375,
-0.0012807594612240791,
0.02231094241142273,
0.021221604198217392,
0.05409342795610428,
-0.03133438155055046,
0.07463894784450531,
0.003863525576889515,
-0.011532267555594444,
0.10688554495573044,
-0.06951206922531128,
0.047080397605895996,
0.11413917690515518,
-0.02987954393029213,
0.012911403551697731,
-0.06629469990730286,
-0.020256293937563896,
0.005574719049036503,
-0.0075877029448747635,
0.020527884364128113,
0.09962448477745056,
0.016766291111707687,
0.014774219132959843,
-0.09442061185836792,
-0.026107605546712875,
-0.010778078809380531,
-0.09229233860969543,
-0.05873753875494003,
0.09307831525802612,
0.14572687447071075,
-0.12282335758209229,
0.10187656432390213,
0.1651318520307541,
0.043373771011829376,
0.15062573552131653,
-0.04164023697376251,
-0.059178248047828674,
-0.06620719283819199,
0.08409996330738068,
0.014102820307016373,
0.05068875849246979,
-0.0857183039188385,
0.035264261066913605,
0.05049446225166321,
-0.028302013874053955,
0.05674159899353981,
-0.11053714156150818,
-0.07289335131645203,
-0.02467379719018936,
-0.023957349359989166,
-0.08634739369153976,
0.0037818122655153275,
-0.03556910529732704,
0.05323914438486099,
-0.0006798785179853439,
-0.06706809997558594,
0.022260744124650955,
-0.005562650505453348,
-0.08175771683454514,
0.09849873930215836,
-0.1299753487110138,
-0.25438737869262695,
-0.16413286328315735,
0.003615189343690872,
-0.01875719428062439,
0.007274135015904903,
0.05322807654738426,
-0.028737619519233704,
-0.08165793120861053,
0.017196742817759514,
-0.0005321521311998367,
-0.04650019109249115,
-0.1253059208393097,
-0.046724848449230194,
0.08809983730316162,
0.005963415373116732,
-0.12669086456298828,
-0.06298460811376572,
0.044581472873687744,
-0.010730128735303879,
0.037899620831012726,
-0.01949801668524742,
0.09157972037792206,
0.0812920555472374,
0.03426683694124222,
-0.013041683472692966,
-0.02372099459171295,
0.1997867375612259,
-0.08335071802139282,
-0.012540139257907867,
0.04609159380197525,
-0.0383983850479126,
0.057880353182554245,
0.1407654732465744,
0.07200311124324799,
-0.06681445240974426,
-0.009594819508492947,
-0.0018608737736940384,
-0.026128014549613,
-0.25743815302848816,
-0.10661283880472183,
-0.067494697868824,
0.10482681542634964,
0.07019181549549103,
0.06564483046531677,
-0.00915505550801754,
0.004459948744624853,
0.006342710927128792,
-0.062098320573568344,
0.09342962503433228,
0.09202190488576889,
0.2154863476753235,
-0.04020464047789574,
0.023202436044812202,
-0.06402276456356049,
-0.07288661599159241,
0.10432730615139008,
0.05507572367787361,
0.15586155652999878,
0.17793427407741547,
0.14827227592468262,
0.11255890130996704,
0.1284237504005432,
0.007106063887476921,
0.0010282034054398537,
0.043399304151535034,
0.025431616231799126,
-0.027892421931028366,
-0.07821178436279297,
0.027242440730333328,
0.04874162748456001,
0.04725710675120354,
-0.13500836491584778,
0.020672816783189774,
-0.0969400554895401,
0.04035593196749687,
0.03747743368148804,
0.055546365678310394,
-0.032560430467128754,
0.05294721573591232,
0.06555072963237762,
-0.010246763937175274,
-0.07473520934581757,
0.07762505114078522,
-0.03499140590429306,
-0.09889110922813416,
0.0700971782207489,
0.012291794642806053,
0.07991063594818115,
0.0053397491574287415,
0.012539975345134735,
-0.1110031008720398,
-0.0883478969335556,
0.03175334632396698,
0.12392181158065796,
-0.17397253215312958,
0.23087987303733826,
0.022009432315826416,
-0.015503648668527603,
-0.0432174988090992,
0.005138767883181572,
-0.031805772334337234,
0.005030401051044464,
0.1479986608028412,
0.053014662116765976,
-0.07773531228303909,
-0.08581739664077759,
-0.04794876277446747,
-0.003888689214363694,
0.06979704648256302,
0.007575381547212601,
0.0026743896305561066,
-0.007010187953710556,
-0.0035517867654561996,
-0.03471478074789047,
-0.02691841498017311,
-0.06803296506404877,
-0.13064609467983246,
0.07354103028774261,
-0.06730978190898895,
0.07441823929548264,
-0.05437283217906952,
0.011563885025680065,
-0.030527696013450623,
0.1866702139377594,
-0.07678058743476868,
-0.06459749490022659,
-0.07679766416549683,
-0.12586639821529388,
0.08113183081150055,
-0.039368852972984314,
0.025454331189393997,
-0.014649864286184311,
0.001677531749010086,
-0.08417564630508423,
-0.0332646369934082,
0.12157757580280304,
-0.08013461530208588,
-0.06993843615055084,
-0.02761179953813553,
0.16544459760189056,
-0.02067430689930916,
0.04636155441403389,
0.030056873336434364,
0.06721716374158859,
-0.031063608825206757,
-0.12550558149814606,
0.029921159148216248,
-0.048121463507413864,
-0.02808472141623497,
-0.009430687874555588,
-0.015033934265375137,
-0.03331892937421799,
-0.07858116924762726,
-0.05734753608703613,
0.1687810719013214,
0.22900864481925964,
-0.024858564138412476,
0.1956498622894287,
0.14284613728523254,
-0.109993115067482,
-0.15442869067192078,
-0.06907741725444794,
-0.02982243150472641,
-0.08672866225242615,
0.055735718458890915,
-0.15634039044380188,
0.015683922916650772,
0.14531970024108887,
-0.053557299077510834,
0.10268640518188477,
-0.1928645372390747,
-0.06544752418994904,
0.06966383010149002,
-0.017916597425937653,
0.04852835088968277,
-0.15352094173431396,
-0.12008219212293625,
-0.026972085237503052,
-0.20279844105243683,
0.17261584103107452,
-0.06415125727653503,
0.0008081982377916574,
0.04067234694957733,
0.027640873566269875,
0.03216028958559036,
-0.02137252315878868,
0.18985208868980408,
-0.06704156845808029,
0.01714615523815155,
-0.0513981357216835,
-0.12225935608148575,
0.045305635780096054,
0.02502169832587242,
0.07215677946805954,
0.006465816870331764,
-0.0026720818132162094,
-0.1554090529680252,
-0.029103346168994904,
-0.06262031197547913,
0.09370795637369156,
-0.053557462990283966,
-0.06121516227722168,
-0.05704280734062195,
0.01156824454665184,
0.0006180573254823685,
-0.00539415143430233,
0.1568908840417862,
-0.04906492680311203,
-0.010805461555719376,
0.16674518585205078,
0.08580896258354187,
-0.012845687568187714,
-0.033093176782131195,
-0.014573165215551853,
-0.04476568102836609,
0.0388629250228405,
-0.10115451365709305,
0.014074044302105904,
0.11423832923173904,
-0.010826610028743744,
0.062056899070739746,
-0.017281558364629745,
-0.1076817438006401,
-0.021363522857427597,
0.1150793582201004,
-0.03378582373261452,
-0.1337684839963913,
-0.03826643526554108,
0.09905174374580383,
0.05481138080358505,
-0.07906258851289749,
0.08337686210870743,
-0.00687803328037262,
-0.03615740314126015,
0.03772977739572525,
0.0328507125377655,
-0.053129710257053375,
0.002609001472592354,
0.07773299515247345,
-0.0011334093287587166,
-0.09006296098232269,
0.10608547180891037,
0.08128997683525085,
-0.05260637402534485,
-0.01840323582291603,
0.17464298009872437,
-0.03567717224359512,
-0.0623021237552166,
-0.07242881506681442,
0.05444222688674927,
-0.12485470622777939,
-0.051584288477897644,
0.021287737414240837,
-0.10773740708827972,
0.0032550767064094543,
0.06698212027549744,
-0.008060315623879433,
-0.018860353156924248,
0.07137417793273926,
-0.0049148667603731155,
0.058487918227910995,
0.03578431159257889,
-0.008175399154424667,
-0.026987861841917038,
-0.012104134075343609,
0.06443969160318375,
0.03241530805826187,
-0.016511261463165283,
0.0010125935077667236,
-0.040532223880290985,
-0.1644805371761322,
0.02365603670477867,
-0.05590331554412842,
0.05508624017238617,
-0.12263119965791702,
0.013128758408129215,
0.022525416687130928,
0.004256715998053551,
-0.010770287364721298,
-0.010875602252781391,
-0.10442018508911133,
-0.029519222676753998,
-0.028422774747014046,
0.10511213541030884,
-0.07074074447154999,
-0.04335973784327507,
0.09250781685113907,
0.01628188230097294,
0.0262922253459692,
0.010434441268444061,
0.02968214638531208,
0.09428131580352783,
-0.15704211592674255,
0.025923747569322586,
0.10169220715761185,
0.00868912972509861,
-0.008796574547886848,
-0.14010688662528992,
-0.012245383113622665,
0.03567666560411453,
-0.02448766864836216,
0.004022512584924698,
-0.07845035195350647,
-0.093350350856781,
-0.004318716935813427,
0.02315257117152214,
-0.18750369548797607,
0.01294579915702343,
-0.04212581366300583,
0.04934848099946976,
-0.05459718406200409,
0.11012063920497894,
-0.04329169541597366,
0.051395803689956665,
-0.03870176896452904,
0.008342907764017582,
-0.027451511472463608,
-0.042718254029750824,
-0.08152761310338974,
-0.0607123076915741,
0.006321819499135017,
-0.022843604907393456,
0.1619526594877243,
0.033588655292987823,
-0.07956409454345703,
0.02865149825811386,
0.013779321685433388,
-0.03654555603861809,
0.022896934300661087,
0.08168245106935501,
0.037047624588012695,
0.011937903240323067,
-0.12859191000461578,
-0.0005669482052326202,
-0.05598866939544678,
-0.07091055810451508,
0.05642852932214737,
0.09813537448644638,
0.0865606814622879,
0.017840877175331116,
0.08848484605550766,
-0.06345310062170029,
-0.048179492354393005,
-0.003272203728556633,
0.08962879329919815,
0.035498183220624924,
-0.04303974285721779,
0.09483283758163452,
0.18081200122833252,
-0.1273268759250641,
0.03680839389562607,
-0.021496478468179703,
-0.03821761906147003,
-0.10773833096027374,
-0.1356084644794464,
-0.011715446598827839,
-0.045173466205596924,
0.012633167207241058,
-0.06881736218929291,
0.1113395243883133,
-0.032753005623817444,
0.01658361591398716,
-0.04739692434668541,
-0.00008621253073215485,
-0.03347710520029068,
-0.12762890756130219,
-0.012044079601764679,
0.029514405876398087,
0.09095513075590134,
-0.11513020098209381,
0.012113181874155998,
-0.0909489095211029,
-0.04610627144575119,
-0.008195582777261734,
0.07234448194503784,
0.00785217247903347,
-0.052346035838127136,
-0.10566962510347366,
-0.047820281237363815,
0.005208486691117287,
0.04682377725839615,
0.02269895002245903,
0.25827470421791077,
0.02467627450823784,
0.006522195413708687,
0.06623164564371109,
0.11003920435905457,
-0.02773280069231987,
-0.11271674931049347,
-0.07215388119220734,
0.1339724212884903,
-0.017821690067648888,
0.0790528655052185,
-0.03430888056755066,
-0.05669694021344185,
0.03960246592760086,
0.1998756229877472,
0.17085622251033783,
0.054574154317379,
0.02895161136984825,
0.01162700355052948,
0.03632286190986633,
0.07188558578491211,
0.0594022274017334,
0.005155882798135281,
0.29093050956726074,
-0.08655232191085815,
0.002220394089818001,
-0.050583597272634506,
0.0225567277520895,
-0.12697967886924744,
0.055534012615680695,
0.018607638776302338,
-0.08419191837310791,
-0.013615669682621956,
0.13319818675518036,
-0.13263997435569763,
-0.042687952518463135,
0.044359657913446426,
-0.1040986180305481,
-0.04469674453139305,
-0.018222346901893616,
-0.03546087443828583,
0.05587571859359741,
0.09804583340883255,
-0.013582248240709305,
-0.11781869828701019,
0.05386345088481903,
0.0660468265414238,
-0.21754607558250427,
-0.07840895652770996,
0.050068922340869904,
0.026380106806755066,
0.1694614142179489,
0.02658621221780777,
0.11330794543027878,
0.05436944216489792,
0.04392721876502037,
-0.11083774268627167,
0.07549197971820831,
0.09645281732082367,
-0.05558648705482483,
-0.033761851489543915,
-0.09797537326812744,
-0.011822568252682686,
0.04235832020640373,
0.0940055325627327,
0.018530068919062614,
0.03439321368932724,
0.15398874878883362,
0.02280406281352043,
-0.04089941456913948,
0.08503872901201248,
-0.12342917174100876,
-0.012718063779175282,
0.08234699815511703,
-0.013947154395282269,
-0.04975561052560806,
-0.046862341463565826,
0.017960667610168457,
0.04523792862892151,
0.013919062912464142,
-0.01672133430838585,
-0.03885854035615921,
0.008026356808841228,
0.0451778843998909,
0.06318090856075287,
-0.13172093033790588,
-0.007172881159931421,
-0.031680259853601456,
0.04356594383716583,
-0.0798451155424118,
0.10469967126846313,
0.04719433933496475,
-0.049848802387714386,
0.026821523904800415,
-0.15672838687896729,
0.01753850094974041,
0.055227503180503845,
-0.020422613248229027,
0.025263259187340736
] |
null | null | transformers |
# Prompsit/paraphrase-bert-en
This model allows you to evaluate paraphrase candidates for a given phrase.
We fine-tuned it from the pretrained "bert-base-uncased" model.
The model was built under the TSI-100905-2019-4 project, co-financed by the Ministry of Economic Affairs and Digital Transformation of the Government of Spain.
# How to use it
The model answers the following question: is "phrase B" a paraphrase of "phrase A"?
Please note that we consider phrases rather than full sentences, so the model does not expect punctuation marks or long pieces of text.
Resulting probabilities correspond to classes:
* 0: Not a paraphrase
* 1: It's a paraphrase
So, considering the phrase "may be addressed" and a candidate paraphrase like "could be included", you can use the model like this:
```
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the fine-tuned tokenizer and classification model
tokenizer = AutoTokenizer.from_pretrained("Prompsit/paraphrase-bert-en")
model = AutoModelForSequenceClassification.from_pretrained("Prompsit/paraphrase-bert-en")

# Encode the phrase pair: ("phrase A", candidate paraphrase "phrase B")
inputs = tokenizer('may be addressed', 'could be included', return_tensors='pt')
logits = model(**inputs).logits

# Turn the logits into class probabilities: [not a paraphrase, it's a paraphrase]
soft = torch.nn.Softmax(dim=1)
print(soft(logits))
```
Code output is:
```
tensor([[0.1592, 0.8408]], grad_fn=<SoftmaxBackward>)
```
Since the probability of class 1 (it's a paraphrase) is 0.84 and the probability of class 0 (it is not a paraphrase) is 0.16, we can conclude, for our previous example, that "could be included" is a paraphrase of "may be addressed".
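For repeated use, the logic above can be wrapped in a small helper; this is only a sketch (the function `paraphrase_probability` is our own, not part of the released package):
```
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Prompsit/paraphrase-bert-en")
model = AutoModelForSequenceClassification.from_pretrained("Prompsit/paraphrase-bert-en")

def paraphrase_probability(phrase_a, phrase_b):
    """Probability that phrase_b is a paraphrase of phrase_a (class 1)."""
    inputs = tokenizer(phrase_a, phrase_b, return_tensors='pt')
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=1)[0, 1].item()

print(paraphrase_probability('may be addressed', 'could be included'))  # ~0.84
```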
# Evaluation results
We used a test dataset of 16,500 human-tagged phrase pairs.
The metrics obtained are:
```
metrics={
'test_loss': 0.5660144090652466,
'test_accuracy': 0.8170742794799527,
'test_precision': 0.7043977055449331,
'test_recall': 0.5978578383641675,
'test_f1': 0.6467696629213483,
'test_matthews_correlation': 0.5276716223607356,
'test_runtime': 19.3345,
'test_samples_per_second': 568.88,
'test_steps_per_second': 17.792
}
``` | {"language": "en", "tags": ["transformers"], "pipeline_tag": "text-classification", "inference": false} | text-classification | Prompsit/paraphrase-bert-en | [
"transformers",
"pytorch",
"bert",
"text-classification",
"en",
"autotrain_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #bert #text-classification #en #autotrain_compatible #region-us
|
# Prompsit/paraphrase-bert-en
This model allows to evaluate paraphrases for a given phrase.
We have fine-tuned this model from pretrained "bert-base-uncased".
Model built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.
# How to use it
The model answer the following question: Is "phrase B" a paraphrase of "phrase A".
Please note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.
Resulting probabilities correspond to classes:
* 0: Not a paraphrase
* 1: It's a paraphrase
So, considering the phrase "may be addressed" and a candidate paraphrase like "could be included", you can use the model like this:
Code output is:
As the probability of 1 (=It's a paraphrase) is 0.84 and the probability of 0 (=It is not a paraphrase) is 0.15, we can conclude, for our previous example, that "could be included" is a paraphrase of "may be addressed".
# Evaluation results
We have used as test dataset 16500 pairs of phrases human tagged.
Metrics obtained are:
| [
"# Prompsit/paraphrase-bert-en\n\nThis model allows to evaluate paraphrases for a given phrase. \nWe have fine-tuned this model from pretrained \"bert-base-uncased\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.",
"# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\n\n\nSo, considering the phrase \"may be addressed\" and a candidate paraphrase like \"could be included\", you can use the model like this:\n\n\n\nCode output is:\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.84 and the probability of 0 (=It is not a paraphrase) is 0.15, we can conclude, for our previous example, that \"could be included\" is a paraphrase of \"may be addressed\".",
"# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \n\nMetrics obtained are:"
] | [
"TAGS\n#transformers #pytorch #bert #text-classification #en #autotrain_compatible #region-us \n",
"# Prompsit/paraphrase-bert-en\n\nThis model allows to evaluate paraphrases for a given phrase. \nWe have fine-tuned this model from pretrained \"bert-base-uncased\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.",
"# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\n\n\nSo, considering the phrase \"may be addressed\" and a candidate paraphrase like \"could be included\", you can use the model like this:\n\n\n\nCode output is:\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.84 and the probability of 0 (=It is not a paraphrase) is 0.15, we can conclude, for our previous example, that \"could be included\" is a paraphrase of \"may be addressed\".",
"# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \n\nMetrics obtained are:"
] | [
30,
79,
197,
28
] | [
"passage: TAGS\n#transformers #pytorch #bert #text-classification #en #autotrain_compatible #region-us \n# Prompsit/paraphrase-bert-en\n\nThis model allows to evaluate paraphrases for a given phrase. \nWe have fine-tuned this model from pretrained \"bert-base-uncased\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\n\n\nSo, considering the phrase \"may be addressed\" and a candidate paraphrase like \"could be included\", you can use the model like this:\n\n\n\nCode output is:\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.84 and the probability of 0 (=It is not a paraphrase) is 0.15, we can conclude, for our previous example, that \"could be included\" is a paraphrase of \"may be addressed\".# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \n\nMetrics obtained are:"
] | [
0.01973036117851734,
-0.057359352707862854,
-0.0042261299677193165,
-0.022444017231464386,
0.10028834640979767,
-0.056979138404130936,
0.11581441015005112,
-0.007822954095900059,
-0.01870650053024292,
0.08454804867506027,
0.14428970217704773,
0.15620844066143036,
-0.009721344336867332,
-0.15869396924972534,
0.013926464132964611,
-0.2687259614467621,
0.05463309958577156,
-0.06790564954280853,
0.11621849983930588,
0.14714057743549347,
0.09914738684892654,
-0.0842730849981308,
0.04492112249135971,
0.052318718284368515,
0.07456787675619125,
0.07609286904335022,
-0.07103121280670166,
0.009038806892931461,
0.11273554712533951,
0.11155462265014648,
0.05384371802210808,
-0.0011599797289818525,
-0.01933499239385128,
-0.12717057764530182,
-0.016876351088285446,
0.04879095405340195,
0.007633061148226261,
-0.008838270790874958,
0.04555444419384003,
-0.0023466795682907104,
0.10408588498830795,
0.06791003048419952,
0.062212660908699036,
0.03599974140524864,
-0.16158360242843628,
-0.03962312638759613,
0.0010865972144529223,
-0.06209409609436989,
0.11273099482059479,
0.05076273903250694,
-0.10873842239379883,
0.21351072192192078,
-0.06357037276029587,
0.03746563568711281,
0.00984837207943201,
-0.2293129861354828,
-0.025788098573684692,
-0.08395563066005707,
0.10877374559640884,
0.147761270403862,
0.04122838005423546,
0.05158292502164841,
0.05780405178666115,
0.006118816789239645,
-0.05422019213438034,
-0.021869013085961342,
0.03668024390935898,
-0.019628671929240227,
-0.2111254334449768,
-0.10472900420427322,
0.25922930240631104,
-0.002405742881819606,
-0.0382368229329586,
-0.16005904972553253,
-0.0354313924908638,
0.14596407115459442,
0.020090840756893158,
-0.11375444382429123,
-0.009482711553573608,
0.02601625584065914,
0.1321881264448166,
-0.033682458102703094,
-0.10243260115385056,
-0.002834412967786193,
-0.09757129848003387,
0.19336669147014618,
-0.006832590326666832,
0.0222995076328516,
-0.03947995603084564,
0.06137852370738983,
-0.06749033182859421,
-0.004090354777872562,
0.02989041805267334,
-0.0777653381228447,
0.010629077441990376,
0.04434813931584358,
-0.1189306229352951,
-0.13237537443637848,
0.02211766503751278,
0.09274844080209732,
0.046488117426633835,
-0.01785123348236084,
-0.01479998417198658,
0.11762546747922897,
0.10365113615989685,
0.0370916910469532,
-0.16563650965690613,
-0.026537489145994186,
-0.06857151538133621,
-0.00898053403943777,
0.06738746166229248,
0.005344621371477842,
-0.10954835265874863,
0.051994290202856064,
0.047248683869838715,
0.005027928855270147,
0.05273032188415527,
0.13217714428901672,
-0.05079227685928345,
-0.0139505285769701,
0.12349870055913925,
-0.0023705854546278715,
-0.0782083123922348,
-0.006585817318409681,
-0.045407384634017944,
-0.01562824286520481,
0.047973860055208206,
0.07209920883178711,
-0.021849174052476883,
0.0003849874483421445,
-0.09054383635520935,
-0.04806794971227646,
-0.044293083250522614,
-0.15086573362350464,
-0.018818877637386322,
-0.009376772679388523,
-0.029568031430244446,
-0.06493202596902847,
0.023393267765641212,
-0.09052249789237976,
-0.022549906745553017,
-0.03262835368514061,
0.02773822471499443,
-0.03411547839641571,
0.06559501588344574,
-0.07772034406661987,
-0.023245709016919136,
-0.00887798611074686,
0.00255972845479846,
0.07680819928646088,
0.017483113333582878,
0.039134640246629715,
-0.10767066478729248,
-0.001505208434537053,
-0.18240995705127716,
0.016294755041599274,
-0.19566942751407623,
0.11112267524003983,
-0.026420874521136284,
-0.005322541110217571,
-0.12119411677122116,
-0.020737430080771446,
-0.056963928043842316,
0.10682477056980133,
-0.029323894530534744,
0.1322280615568161,
-0.1360889971256256,
-0.05269943177700043,
0.13386887311935425,
-0.1170491874217987,
-0.07676553726196289,
0.09761593490839005,
-0.028352277353405952,
0.10058946162462234,
0.06392526626586914,
0.1393202245235443,
-0.04373442754149437,
-0.11803102493286133,
0.021641930565238,
0.008197170682251453,
-0.03945739567279816,
0.18565252423286438,
0.07930049300193787,
-0.1196918934583664,
-0.0984843522310257,
0.030522368848323822,
-0.056726597249507904,
-0.10752306878566742,
-0.06546387076377869,
-0.0024643144570291042,
0.015307815745472908,
-0.013185901567339897,
0.06545896828174591,
0.0755390077829361,
-0.0394369401037693,
-0.040725890547037125,
-0.1299421489238739,
0.20519067347049713,
-0.012764262035489082,
-0.04110227897763252,
-0.04061153903603554,
-0.01800277642905712,
0.03366319462656975,
-0.07618039101362228,
-0.04320290684700012,
-0.20603591203689575,
-0.03250588849186897,
-0.05969831347465515,
0.030270451679825783,
0.1200701892375946,
0.2423897087574005,
-0.003340748604387045,
0.05151869356632233,
0.03127717226743698,
0.08017638325691223,
0.10768135637044907,
0.053229376673698425,
-0.08812423795461655,
-0.152711883187294,
-0.03361884504556656,
-0.033540237694978714,
0.25207817554473877,
-0.030864644795656204,
0.00007217522943392396,
-0.0016235769726336002,
0.028661908581852913,
-0.03512753173708916,
-0.04467315226793289,
0.024981789290905,
0.06072474271059036,
0.0005483875866048038,
0.05576033145189285,
0.11428723484277725,
-0.005858113057911396,
-0.11445207893848419,
0.03967027738690376,
-0.22042137384414673,
-0.21634091436862946,
0.10350571572780609,
-0.16035711765289307,
-0.10940323770046234,
-0.1381973922252655,
-0.07991354912519455,
0.0007137173088267446,
0.017472194507718086,
-0.1224997267127037,
0.21332812309265137,
0.03494592383503914,
0.11746056377887726,
-0.12861184775829315,
0.020090466365218163,
0.023396437987685204,
-0.11410777270793915,
-0.014433677308261395,
0.12803122401237488,
0.06800360977649689,
-0.21792733669281006,
0.043508145958185196,
-0.015560037456452847,
0.028655901551246643,
0.10598307102918625,
0.06994220614433289,
-0.09250534325838089,
-0.030882658436894417,
0.00008390058792429045,
-0.0003883753961417824,
-0.0486849844455719,
-0.29825258255004883,
-0.024353768676519394,
0.04955887049436569,
-0.04948936030268669,
0.028981592506170273,
-0.06628026068210602,
0.04639311879873276,
0.008717084303498268,
0.026318730786442757,
0.05517662689089775,
0.03929828852415085,
0.03328092023730278,
0.039353806525468826,
0.007507490459829569,
-0.07870887964963913,
-0.0030439167749136686,
-0.018070988357067108,
-0.14483457803726196,
0.13923047482967377,
-0.05071278288960457,
-0.2688838839530945,
-0.07110369950532913,
0.08120214939117432,
-0.06490438431501389,
0.006470358464866877,
0.10904273390769958,
-0.004225554410368204,
-0.03679187223315239,
-0.049276769161224365,
0.07123016566038132,
-0.06465526670217514,
-0.09702636301517487,
-0.1939672827720642,
-0.0027901018038392067,
0.057967010885477066,
-0.10620452463626862,
-0.01935357227921486,
-0.05753585323691368,
-0.0977962538599968,
0.006415216252207756,
-0.11668188124895096,
0.05544715374708176,
0.14100635051727295,
0.031146900728344917,
0.0005129485507495701,
-0.024873433634638786,
0.32412564754486084,
-0.05802030488848686,
-0.051627159118652344,
0.10872876644134521,
-0.054538313299417496,
0.06771709769964218,
0.12373841553926468,
0.04524817317724228,
-0.10602596402168274,
0.04543028399348259,
0.058132193982601166,
-0.05509645864367485,
-0.08227245509624481,
-0.1992802768945694,
-0.0010386897483840585,
-0.15312305092811584,
0.0546988770365715,
-0.044591717422008514,
-0.009312065318226814,
0.08662296086549759,
-0.09955616295337677,
0.005834261886775494,
-0.03365608677268028,
0.058230359107255936,
0.24009446799755096,
-0.07967358082532883,
0.1457941234111786,
-0.011689911596477032,
-0.1592700034379959,
0.032832611352205276,
-0.07823371887207031,
0.053402747958898544,
0.13172805309295654,
-0.01658235862851143,
0.191153421998024,
-0.009085195139050484,
-0.02543613128364086,
-0.033962685614824295,
-0.05833958089351654,
0.03327500447630882,
-0.1140352413058281,
-0.11470600217580795,
-0.057944439351558685,
0.09679344296455383,
0.0598367415368557,
-0.05705215036869049,
-0.07089590281248093,
-0.061806149780750275,
0.037053611129522324,
0.12366987019777298,
0.11562593281269073,
-0.17394980788230896,
-0.03066665679216385,
-0.012724489904940128,
-0.0692269429564476,
-0.08287794142961502,
-0.010580558329820633,
-0.054445188492536545,
-0.08105433732271194,
0.037024036049842834,
-0.02828914113342762,
0.0525314062833786,
-0.0779489055275917,
0.11215993016958237,
-0.0722147673368454,
-0.047164466232061386,
0.02473222091794014,
0.1188821867108345,
-0.10882823169231415,
0.28333529829978943,
0.0600193627178669,
-0.07597560435533524,
-0.153114914894104,
-0.04487146437168121,
-0.09082495421171188,
0.07011612504720688,
0.19851179420948029,
-0.023028554394841194,
0.3219630718231201,
-0.0122147211804986,
-0.05109196901321411,
0.019900817424058914,
0.1451626420021057,
-0.10319347679615021,
0.08959541469812393,
0.0069806999526917934,
0.054251790046691895,
-0.030783072113990784,
0.13435164093971252,
-0.061840660870075226,
-0.17871081829071045,
0.07067248225212097,
-0.06981737911701202,
0.007492148783057928,
0.00003088609446422197,
0.010717255063354969,
-0.05209944397211075,
0.11329896003007889,
-0.12927144765853882,
-0.09044284373521805,
-0.022040454670786858,
0.016261549666523933,
0.12334991991519928,
0.0029581161215901375,
-0.025959590449929237,
-0.05976112559437752,
0.04052814468741417,
-0.03246813639998436,
0.022225389257073402,
0.10462423413991928,
-0.09550304710865021,
-0.0486534908413887,
-0.11887305229902267,
0.029451237991452217,
0.04370032623410225,
0.03724293038249016,
0.07125773280858994,
0.07114185392856598,
-0.039781030267477036,
-0.08636733889579773,
-0.10224883258342743,
-0.028852758929133415,
0.0915256068110466,
-0.0008452086476609111,
-0.011249602772295475,
0.13602757453918457,
-0.12322209030389786,
-0.01956588216125965,
0.07824650406837463,
0.1103934571146965,
-0.08258979022502899,
0.07358957082033157,
0.24817025661468506,
-0.06651131063699722,
-0.18653692305088043,
-0.14789944887161255,
0.11063983291387558,
-0.027750762179493904,
0.07782986015081406,
-0.08914631605148315,
0.07195000350475311,
0.15086790919303894,
-0.019095299765467644,
-0.13102689385414124,
-0.3236634433269501,
-0.08496400713920593,
0.1525791585445404,
-0.05519050359725952,
0.06743238866329193,
-0.15378184616565704,
-0.13113228976726532,
-0.03293658047914505,
0.12053411453962326,
0.06735256314277649,
-0.09028628468513489,
0.10364949703216553,
0.028482792899012566,
0.05585213005542755,
0.05103873461484909,
0.0132063589990139,
0.16093075275421143,
-0.0555511899292469,
-0.000044781780161429197,
-0.04506038501858711,
-0.09068885445594788,
0.0024771615862846375,
0.002053159521892667,
0.08891972154378891,
0.037283528596162796,
0.041592929512262344,
-0.1420450359582901,
-0.07352638244628906,
-0.05801403522491455,
0.019288677722215652,
-0.04141810163855553,
-0.0725451335310936,
-0.034333642572164536,
-0.01428107637912035,
-0.018502065911889076,
-0.004108054097741842,
-0.02623133361339569,
-0.19176045060157776,
0.08350127190351486,
0.2401277720928192,
0.20484574139118195,
-0.00044945147237740457,
-0.13427035510540009,
0.03527269512414932,
0.010728723369538784,
0.1127399131655693,
-0.03848004341125488,
0.0062674726359546185,
0.0664057508111,
0.020752299576997757,
0.07243173569440842,
-0.015695951879024506,
-0.11233507841825485,
-0.0031859437003731728,
0.009744240902364254,
-0.14353837072849274,
-0.12897735834121704,
0.021112652495503426,
0.18212281167507172,
-0.06662002205848694,
-0.0075832814909517765,
0.1323406994342804,
-0.021088149398565292,
-0.026427175849676132,
0.010072451084852219,
-0.04228344187140465,
-0.008889184333384037,
0.030206214636564255,
0.05616584047675133,
0.06426682323217392,
-0.07000366598367691,
-0.00318254460580647,
0.047410737723112106,
-0.024079661816358566,
0.0574876107275486,
-0.10670696943998337,
-0.04421105235815048,
-0.06328155100345612,
-0.23870784044265747,
0.04723695293068886,
-0.07845485210418701,
-0.07309674471616745,
0.01245152484625578,
-0.038124967366456985,
-0.025385567918419838,
0.06813191622495651,
0.05825749784708023,
0.03365662693977356,
-0.0515688881278038,
-0.06259666383266449,
-0.06331294029951096,
-0.007281037513166666,
0.010717861354351044,
-0.07896522432565689,
0.054599661380052567,
0.0910584032535553,
0.025925813242793083,
-0.07304418832063675,
-0.044018782675266266,
-0.061920393258333206,
-0.10187537223100662,
0.03878827020525932,
-0.15658381581306458,
-0.018625210970640182,
-0.043466098606586456,
-0.0423089936375618,
0.11058380454778671,
0.008506975136697292,
0.003947811666876078,
-0.06570125371217728,
-0.0317782461643219,
0.06311889737844467,
0.059913765639066696,
0.12213774770498276,
-0.08479725569486618,
0.011539340019226074,
0.08239593356847763,
-0.030536798760294914,
-0.01663852296769619,
-0.005334522109478712,
-0.03494708612561226,
-0.005819081328809261,
-0.0032930299639701843,
0.07059996575117111,
0.06370096653699875,
0.056625593453645706,
0.01716608554124832,
0.058518603444099426,
-0.03786615654826164,
0.029000578448176384,
0.04964343085885048,
-0.007610333617776632,
0.10238546133041382,
-0.12378624826669693,
0.06332461535930634,
0.1696249097585678,
-0.07608366757631302,
-0.05952322110533714,
0.026150962337851524,
-0.011185760609805584,
-0.002611973090097308,
0.0998908281326294,
-0.09676031768321991,
0.04051652178168297,
-0.1071510761976242,
0.023989174515008926,
-0.02598588727414608,
0.03442491590976715,
0.013044812716543674,
-0.09201651066541672,
0.006152336485683918,
0.01486942172050476,
0.2908402979373932,
0.09555787593126297,
0.28636860847473145,
0.04744218289852142,
0.09511725604534149,
0.07585573941469193,
-0.04791668429970741,
0.00851571187376976,
0.1001034826040268,
0.04873359203338623,
-0.14440272748470306,
0.05974026024341583,
-0.050868213176727295,
-0.01683111861348152,
-0.04151195287704468,
0.06265882402658463,
-0.02039603516459465,
0.0396793894469738,
0.093210369348526,
0.03355355188250542,
0.002452076878398657,
-0.01867152936756611,
0.14987047016620636,
0.07799166440963745,
-0.018333792686462402,
0.07499351352453232,
0.1565171480178833,
-0.008917180821299553,
0.08654902130365372,
0.011437760666012764,
-0.0266091451048851,
-0.13666032254695892,
-0.0652720034122467,
-0.02991289272904396,
-0.06577663123607635,
-0.014658160507678986,
-0.07274787127971649,
0.010293335653841496,
0.04363396018743515,
0.03238416463136673,
-0.0019600405357778072,
0.04623713716864586,
0.04518460854887962,
-0.1321103274822235,
-0.03764355555176735,
-0.03709087148308754,
0.1458008736371994,
-0.10248296707868576,
0.01088160090148449,
0.07241452485322952,
0.061280257999897,
-0.02096705697476864,
0.08524946123361588,
-0.007430928759276867,
-0.08831162005662918,
-0.048366282135248184,
-0.09938731789588928,
-0.06549547612667084,
0.00839513260871172,
-0.02690994180738926,
0.23147264122962952,
-0.00808301568031311,
-0.05280182510614395,
-0.06170216575264931,
0.030984405428171158,
-0.07656779140233994,
-0.10444504767656326,
-0.06333217024803162,
0.4349483847618103,
-0.06648631393909454,
0.14316779375076294,
-0.029370903968811035,
-0.10000002384185791,
0.052726831287145615,
0.2729111313819885,
0.16881568729877472,
-0.022091064602136612,
-0.003952639177441597,
0.0791943222284317,
0.0031905672512948513,
0.021062109619379044,
0.005399586167186499,
-0.06404563039541245,
0.293118417263031,
-0.09957023710012436,
0.1070166677236557,
-0.056451909244060516,
0.04205295071005821,
-0.007157657295465469,
0.03079262562096119,
0.09017383307218552,
-0.025195647031068802,
-0.03752734884619713,
0.1675174981355667,
-0.052949465811252594,
-0.18475855886936188,
0.04706805571913719,
0.032147232443094254,
-0.0558655746281147,
-0.018628112971782684,
-0.050410039722919464,
0.056689754128456116,
0.15554150938987732,
-0.0006427268381230533,
-0.08592398464679718,
0.0846172645688057,
0.009267195127904415,
-0.1270630806684494,
-0.0961819589138031,
0.052062537521123886,
0.03950655460357666,
0.1221201941370964,
-0.006564663723111153,
0.14608514308929443,
0.07204338163137436,
0.013962808065116405,
-0.012921000830829144,
0.08703383058309555,
0.028775060549378395,
-0.0019185584969818592,
0.04773436114192009,
-0.10484259575605392,
0.04266248643398285,
0.03799808770418167,
0.013879324309527874,
-0.22111348807811737,
0.12061627209186554,
0.037809133529663086,
-0.06816858053207397,
-0.08840528130531311,
0.09481701999902725,
-0.043703317642211914,
0.10620229691267014,
0.11926107108592987,
0.0022845177445560694,
-0.04749102145433426,
-0.03390619903802872,
-0.016263987869024277,
0.006798943970352411,
-0.013472757302224636,
-0.0368613637983799,
-0.14369869232177734,
-0.0045479098334908485,
0.10917773097753525,
-0.032212577760219574,
-0.2821805477142334,
-0.0034693388734012842,
0.14221714437007904,
0.034988854080438614,
0.09510322660207748,
-0.034270092844963074,
0.04621788486838341,
0.04806038737297058,
-0.03536264970898628,
-0.09680972248315811,
0.006580155808478594,
0.12284714728593826,
-0.08653813600540161,
-0.020636986941099167
] |
null | null | transformers |
# Prompsit/paraphrase-bert-pt
This model allows you to evaluate whether one phrase is a paraphrase of another.
We have fine-tuned this model from the pretrained "neuralmind/bert-base-portuguese-cased" model.
The model was built under the TSI-100905-2019-4 project, co-financed by the Ministry of Economic Affairs and Digital Transformation of the Government of Spain.
# How to use it
The model answers the following question: is "phrase B" a paraphrase of "phrase A"?
Please note that it works on phrases rather than full sentences, so the model does not expect punctuation marks or long pieces of text.
The resulting probabilities correspond to the following classes:
* 0: Not a paraphrase
* 1: It's a paraphrase
So, considering the phrase "logo após o homicídio" and a candidate paraphrase like "pouco depois do assassinato", you can use the model like this:
```
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("Prompsit/paraphrase-bert-pt")
model = AutoModelForSequenceClassification.from_pretrained("Prompsit/paraphrase-bert-pt")
input = tokenizer('logo após o homicídio','pouco depois do assassinato',return_tensors='pt')
logits = model(**input).logits
soft = torch.nn.Softmax(dim=1)
print(soft(logits))
```
Code output is:
```
tensor([[0.2137, 0.7863]], grad_fn=<SoftmaxBackward>)
```
As the probability of class 1 (it is a paraphrase) is 0.7863 and the probability of class 0 (it is not a paraphrase) is 0.2137, we can conclude that, for our previous example, "pouco depois do assassinato" is a paraphrase of "logo após o homicídio".
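If you only need the predicted class rather than the full probability distribution, a minimal sketch continuing the snippet above takes the argmax of the probabilities:
```
# Sketch: reuse `soft` and `logits` from the snippet above to read off the predicted class.
probs = soft(logits)
pred = int(torch.argmax(probs, dim=1))  # 0 = not a paraphrase, 1 = paraphrase
print(pred, probs[0, pred].item())
```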
# Evaluation results
We used a test dataset of 16,500 human-tagged phrase pairs.
The metrics obtained are:
```
metrics={
'test_loss': 0.6074697375297546,
'test_accuracy': 0.7809,
'test_precision': 0.7157638466220329,
'test_recall': 0.40551724137931033,
'test_f1': 0.5177195685670262,
'test_matthews_correlation': 0.41603913834665324,
'test_runtime': 16.4585,
'test_samples_per_second': 607.587,
'test_steps_per_second': 19.017
}
``` | {"language": "pt", "tags": ["transformers"], "pipeline_tag": "text-classification", "inference": false} | text-classification | Prompsit/paraphrase-bert-pt | [
"transformers",
"pytorch",
"bert",
"text-classification",
"pt",
"autotrain_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"pt"
] | TAGS
#transformers #pytorch #bert #text-classification #pt #autotrain_compatible #region-us
|
# Prompsit/paraphrase-bert-pt
This model allows you to evaluate whether one phrase is a paraphrase of another.
We have fine-tuned this model from the pretrained "neuralmind/bert-base-portuguese-cased" model.
The model was built under the TSI-100905-2019-4 project, co-financed by the Ministry of Economic Affairs and Digital Transformation of the Government of Spain.
# How to use it
The model answers the following question: is "phrase B" a paraphrase of "phrase A"?
Please note that it works on phrases rather than full sentences, so the model does not expect punctuation marks or long pieces of text.
The resulting probabilities correspond to the following classes:
* 0: Not a paraphrase
* 1: It's a paraphrase
So, considering the phrase "logo após o homicídio" and a candidate paraphrase like "pouco depois do assassinato", you can use the model like this:
Code output is:
As the probability of class 1 (it is a paraphrase) is 0.7863 and the probability of class 0 (it is not a paraphrase) is 0.2137, we can conclude that, for our previous example, "pouco depois do assassinato" is a paraphrase of "logo após o homicídio".
# Evaluation results
We used a test dataset of 16,500 human-tagged phrase pairs.
The metrics obtained are:
| [
"# Prompsit/paraphrase-bert-pt\n\nThis model allows to evaluate paraphrases for a given phrase. \n\nWe have fine-tuned this model from pretrained \"neuralmind/bert-base-portuguese-cased\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.",
"# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\n\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n\n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\nSo, considering the phrase \"logo após o homicídio\" and a candidate paraphrase like \"pouco depois do assassinato\", you can use the model like this:\n\n\n\nCode output is:\n\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.7863 and the probability of 0 (=It is not a paraphrase) is 0.2137, we can conclude, for our previous example, that \"pouco depois do assassinato\" is a paraphrase of \"logo após o homicidio\".",
"# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \n\nMetrics obtained are:"
] | [
"TAGS\n#transformers #pytorch #bert #text-classification #pt #autotrain_compatible #region-us \n",
"# Prompsit/paraphrase-bert-pt\n\nThis model allows to evaluate paraphrases for a given phrase. \n\nWe have fine-tuned this model from pretrained \"neuralmind/bert-base-portuguese-cased\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.",
"# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\n\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n\n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\nSo, considering the phrase \"logo após o homicídio\" and a candidate paraphrase like \"pouco depois do assassinato\", you can use the model like this:\n\n\n\nCode output is:\n\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.7863 and the probability of 0 (=It is not a paraphrase) is 0.2137, we can conclude, for our previous example, that \"pouco depois do assassinato\" is a paraphrase of \"logo após o homicidio\".",
"# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \n\nMetrics obtained are:"
] | [
30,
86,
204,
28
] | [
"passage: TAGS\n#transformers #pytorch #bert #text-classification #pt #autotrain_compatible #region-us \n# Prompsit/paraphrase-bert-pt\n\nThis model allows to evaluate paraphrases for a given phrase. \n\nWe have fine-tuned this model from pretrained \"neuralmind/bert-base-portuguese-cased\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\n\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n\n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\nSo, considering the phrase \"logo após o homicídio\" and a candidate paraphrase like \"pouco depois do assassinato\", you can use the model like this:\n\n\n\nCode output is:\n\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.7863 and the probability of 0 (=It is not a paraphrase) is 0.2137, we can conclude, for our previous example, that \"pouco depois do assassinato\" is a paraphrase of \"logo após o homicidio\".# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \n\nMetrics obtained are:"
] | [
-0.0069784969091415405,
-0.03985442966222763,
-0.007141727022826672,
0.0034451596438884735,
0.05412052571773529,
-0.05911584943532944,
0.035325463861227036,
0.05474561080336571,
-0.04987754672765732,
0.1187768206000328,
0.09101954102516174,
0.18203382194042206,
0.06390492618083954,
-0.11087255924940109,
-0.03091215342283249,
-0.29579591751098633,
0.0557268001139164,
-0.056427087634801865,
0.09919058531522751,
0.11709070205688477,
0.08054092526435852,
-0.024207597598433495,
0.045376844704151154,
0.02502836473286152,
0.03316168487071991,
0.05669371783733368,
-0.02304171770811081,
-0.0389724001288414,
0.0627213716506958,
0.07686378806829453,
0.016971029341220856,
0.006566248834133148,
-0.013859033584594727,
-0.10213235020637512,
-0.019871680065989494,
0.0195158701390028,
0.003812164766713977,
-0.018864376470446587,
0.0539819598197937,
-0.045602019876241684,
0.17888237535953522,
0.03800921142101288,
0.017635095864534378,
0.02549043297767639,
-0.1876244693994522,
-0.03284376859664917,
-0.04256902635097504,
-0.04012994095683098,
0.09130778163671494,
0.09678036719560623,
-0.06757039576768875,
0.17993265390396118,
-0.06751196831464767,
-0.0322500504553318,
0.04709073528647423,
-0.21061550080776215,
-0.05050088092684746,
-0.05917726829648018,
0.08639891445636749,
0.13125279545783997,
0.0156765915453434,
0.021631816402077675,
0.07202068716287613,
0.003453818615525961,
-0.10082114487886429,
-0.02867293916642666,
0.24448318779468536,
-0.04781319946050644,
-0.1468644142150879,
-0.0924658253788948,
0.24121464788913727,
0.036925677210092545,
-0.0569390170276165,
-0.01986086182296276,
-0.01934819296002388,
0.0161169171333313,
0.027550874277949333,
-0.08969726413488388,
-0.021134139969944954,
0.021691469475626945,
0.19865456223487854,
0.019653504714369774,
-0.11632328480482101,
-0.013372049666941166,
-0.12108102440834045,
0.20686855912208557,
0.0100189708173275,
0.02795577608048916,
0.018671365454792976,
0.06291121989488602,
-0.113674096763134,
-0.046057138592004776,
0.00020188913913443685,
-0.025915907695889473,
0.03356734290719032,
0.024014152586460114,
-0.07807444036006927,
-0.07029347866773605,
0.02714894525706768,
0.056737020611763,
0.04843926057219505,
-0.05451140180230141,
0.007159239146858454,
0.09953557699918747,
0.1161642000079155,
0.0529734268784523,
-0.08381648361682892,
-0.04044835641980171,
-0.053218111395835876,
-0.020321371033787727,
0.044071074575185776,
0.01834108866751194,
-0.05211055278778076,
0.015485092997550964,
0.028341667726635933,
0.0062929061241447926,
0.036978572607040405,
0.04555416479706764,
-0.11180495470762253,
-0.008910784497857094,
0.029746510088443756,
-0.0698707178235054,
-0.018250562250614166,
-0.05224607139825821,
-0.07227763533592224,
0.09820563346147537,
-0.021327059715986252,
0.052421364933252335,
-0.09730499237775803,
0.017703738063573837,
-0.0645073726773262,
-0.029980145394802094,
-0.08681986480951309,
-0.0723114013671875,
-0.0012387674069032073,
-0.0293947234749794,
-0.03818026930093765,
-0.07581475377082825,
0.014751177281141281,
-0.12428858131170273,
0.0019242927664890885,
-0.11171965301036835,
0.011373297311365604,
-0.04781966283917427,
0.017026657238602638,
-0.0857824757695198,
-0.0023706057108938694,
-0.04517113417387009,
-0.03610411286354065,
0.09840727597475052,
0.0106774577870965,
0.03850923851132393,
-0.10002955794334412,
-0.005401733331382275,
-0.14291583001613617,
0.005289001390337944,
-0.2154163420200348,
0.09619607776403427,
-0.010121786035597324,
0.03676491603255272,
-0.1372627317905426,
-0.005468124523758888,
-0.12071370333433151,
0.07781947404146194,
-0.046146176755428314,
0.13158489763736725,
-0.13875888288021088,
-0.02138346992433071,
0.17574375867843628,
-0.13454918563365936,
-0.026927350088953972,
0.11478548496961594,
-0.001350544742308557,
0.14014896750450134,
0.06758031994104385,
0.12421796470880508,
-0.09169502556324005,
-0.1285524219274521,
0.06719250977039337,
-0.07804375141859055,
-0.05083902180194855,
0.08525009453296661,
0.1344132423400879,
-0.15640950202941895,
-0.11700624972581863,
0.022447409108281136,
-0.07541719079017639,
-0.14060236513614655,
-0.08106330037117004,
-0.01880195364356041,
0.04114875942468643,
0.029270194470882416,
-0.0879189744591713,
0.04271968454122543,
-0.00872071087360382,
-0.06475524604320526,
-0.13972577452659607,
0.01669340208172798,
-0.012768103741109371,
0.0038236097898334265,
-0.022178106009960175,
-0.06241166219115257,
0.12024268507957458,
-0.08622265607118607,
-0.05404103919863701,
-0.1172720268368721,
0.06608641892671585,
-0.011371933855116367,
0.031596023589372635,
0.108754463493824,
0.13435091078281403,
-0.013752803206443787,
0.04456225410103798,
0.06848427653312683,
0.05692259222269058,
0.049628231674432755,
0.021005207672715187,
-0.05650573596358299,
-0.11793225258588791,
0.031085897237062454,
0.009246769361197948,
0.17485882341861725,
-0.003420772962272167,
0.002583984052762389,
0.027343565598130226,
0.055006083101034164,
-0.04906005784869194,
-0.041728343814611435,
-0.05252445861697197,
0.0497787743806839,
-0.0007146065472625196,
0.05283336341381073,
0.08766358345746994,
0.01846623234450817,
-0.04185232147574425,
0.03962487727403641,
-0.17276500165462494,
-0.07233531773090363,
0.1216137483716011,
-0.11754050850868225,
-0.06302576512098312,
-0.10678534209728241,
-0.07047498226165771,
0.015494168736040592,
0.010218487121164799,
-0.09887430816888809,
0.16623684763908386,
0.021698379889130592,
0.07971108704805374,
-0.14043983817100525,
0.023880181834101677,
0.047785449773073196,
-0.1397457867860794,
-0.027631200850009918,
0.08746501803398132,
0.09539847075939178,
-0.20978952944278717,
0.09913008660078049,
0.02369239740073681,
0.011355719529092312,
0.12109380215406418,
0.03135976567864418,
-0.09854808449745178,
-0.08625351637601852,
0.08425024896860123,
0.01963389664888382,
0.01049698144197464,
-0.29128947854042053,
0.022505104541778564,
0.02202051319181919,
-0.020801164209842682,
0.041920602321624756,
-0.10508782416582108,
0.007722101639956236,
0.0092693492770195,
0.05421819910407066,
0.08922124654054642,
-0.022618073970079422,
-0.0005239066085778177,
0.07929276674985886,
0.037633441388607025,
-0.08951997756958008,
0.014452347531914711,
-0.015713665634393692,
-0.11368877440690994,
0.09811834990978241,
-0.05641080066561699,
-0.18144045770168304,
-0.09764706343412399,
0.12903644144535065,
-0.051613178104162216,
0.030607936903834343,
0.06817945092916489,
0.0016716420650482178,
-0.012658029794692993,
-0.04768477752804756,
0.14274273812770844,
-0.02220977656543255,
-0.06871975213289261,
-0.07321830093860626,
0.0316167026758194,
0.011084401980042458,
-0.03315652161836624,
-0.09939415007829666,
-0.08945432305335999,
-0.15722841024398804,
0.03969071805477142,
-0.1743296980857849,
0.08315256237983704,
0.11763634532690048,
0.030705289915204048,
0.033759113401174545,
0.003861735574901104,
0.2391166090965271,
-0.06780067086219788,
-0.00020223144383635372,
0.11413585394620895,
0.03148837387561798,
0.05381990596652031,
0.1284954696893692,
0.017584310844540596,
-0.07865723222494125,
0.014113022945821285,
0.09329008311033249,
-0.010415383614599705,
-0.1528995931148529,
-0.13027897477149963,
0.003548061242327094,
-0.1347431242465973,
0.05107321962714195,
-0.0017062830738723278,
0.08736768364906311,
0.11984220147132874,
-0.040058258920907974,
-0.050875913351774216,
-0.0055191414430737495,
0.0649334117770195,
0.11167094856500626,
-0.07042849808931351,
0.09128863364458084,
0.007925807498395443,
-0.1589367687702179,
0.10132569819688797,
-0.015476573258638382,
0.04698579013347626,
0.12658476829528809,
-0.06868842989206314,
0.13696882128715515,
0.08417580276727676,
-0.031252671033144,
-0.014648997224867344,
0.02136031910777092,
0.00531256478279829,
-0.1274479478597641,
-0.09101876616477966,
-0.09322424232959747,
0.08143860846757889,
0.04373395815491676,
0.005451285280287266,
-0.07767528295516968,
-0.0475042499601841,
0.08355226367712021,
0.17000219225883484,
0.06743750721216202,
-0.19010081887245178,
0.007246569264680147,
0.001256984774954617,
-0.03911806643009186,
-0.09026675671339035,
0.054061997681856155,
-0.062464773654937744,
-0.06599602848291397,
0.0433725081384182,
0.01844186708331108,
0.060452986508607864,
-0.023060334846377373,
0.08555302768945694,
-0.07104431837797165,
-0.04997200891375542,
-0.018965845927596092,
0.10556712746620178,
-0.14805351197719574,
0.266117125749588,
0.006880792789161205,
0.03412295877933502,
-0.1231045052409172,
-0.02886222116649151,
-0.07911768555641174,
0.025955330580472946,
0.24298304319381714,
-0.002500387839972973,
0.18083840608596802,
0.012040561065077782,
-0.10789313912391663,
0.011013604700565338,
0.09411640465259552,
-0.14961867034435272,
0.09950584918260574,
-0.020042987540364265,
0.06758332997560501,
-0.06644796580076218,
0.13602149486541748,
-0.06647263467311859,
-0.13291683793067932,
0.06701480597257614,
-0.09146613627672195,
0.012315607629716396,
0.010460459627211094,
0.006129756569862366,
-0.013522529974579811,
0.15053851902484894,
-0.0928754135966301,
-0.0676787868142128,
-0.06656637787818909,
0.032410670071840286,
0.05904896557331085,
0.0017854870529845357,
-0.07614126801490784,
-0.05617044121026993,
0.049174919724464417,
-0.04968931898474693,
0.03967634215950966,
0.1216520145535469,
-0.060598038136959076,
-0.05979715660214424,
-0.09494064003229141,
0.15569762885570526,
0.05233199894428253,
0.03525122255086899,
0.04904787242412567,
0.030964331701397896,
-0.009497196413576603,
-0.08865436911582947,
-0.03256399556994438,
-0.012733682990074158,
0.11909269541501999,
0.02239670418202877,
-0.06485741585493088,
0.01072718296200037,
-0.13630786538124084,
-0.07278741151094437,
0.06695647537708282,
0.2262086421251297,
-0.03403891623020172,
-0.005940199363976717,
0.15726275742053986,
-0.09411592036485672,
-0.197171151638031,
-0.12456174194812775,
0.08680391311645508,
-0.04094138368964195,
0.0004867447423748672,
-0.019974755123257637,
0.02999807707965374,
0.2550528943538666,
-0.024012558162212372,
-0.1463787704706192,
-0.3529118299484253,
-0.0444280281662941,
0.11188316345214844,
-0.01075512170791626,
0.24172109365463257,
-0.1592654585838318,
-0.09663858264684677,
-0.04297611489892006,
-0.02172042243182659,
0.11256475001573563,
0.013538590632379055,
0.05900519713759422,
0.02195643074810505,
0.01891712285578251,
0.05457739904522896,
0.009969708509743214,
0.19967646896839142,
0.012121626175940037,
0.0271624568849802,
-0.06992112845182419,
-0.12292671948671341,
-0.009310008026659489,
0.002610347233712673,
0.04449383169412613,
0.10291040688753128,
-0.050810396671295166,
-0.14927567541599274,
-0.0825464203953743,
-0.0719427615404129,
0.02729967050254345,
-0.052572235465049744,
-0.028855497017502785,
-0.03173566237092018,
0.027654878795146942,
0.02566627226769924,
0.010449378751218319,
0.00686570955440402,
-0.17760074138641357,
0.0879509374499321,
0.1645532101392746,
0.08874965459108353,
0.008503207005560398,
-0.09517058730125427,
0.03436295688152313,
0.02810053341090679,
0.11109685152769089,
0.045563388615846634,
0.006947181653231382,
0.1207011491060257,
-0.03826356306672096,
0.11080048233270645,
0.01467214897274971,
-0.12312741577625275,
-0.0144919129088521,
0.06540224701166153,
-0.08995944261550903,
-0.10238588601350784,
-0.030082643032073975,
0.11832299828529358,
0.06863126903772354,
-0.01620352268218994,
0.10291936248540878,
-0.04456261172890663,
0.0016989789437502623,
-0.018139611929655075,
0.02175372652709484,
-0.08585614711046219,
0.06694748997688293,
-0.018193760886788368,
0.009375966154038906,
-0.07230427861213684,
0.08635719120502472,
0.05825537443161011,
-0.0750332772731781,
0.03492622449994087,
-0.01528785191476345,
0.015804463997483253,
-0.05743211880326271,
-0.1202051192522049,
0.07802877575159073,
-0.1656709462404251,
-0.08959763497114182,
-0.03481612354516983,
-0.11545264720916748,
0.008222085423767567,
0.07286054641008377,
0.04587560519576073,
0.01256436388939619,
0.008618415333330631,
-0.05099950730800629,
-0.08270417898893356,
-0.060367368161678314,
0.06399088352918625,
-0.09462663531303406,
0.08548413962125778,
0.08378879725933075,
0.01876104809343815,
-0.06993658095598221,
-0.039648327976465225,
-0.0666947141289711,
-0.12439756840467453,
0.03583569824695587,
-0.05324319377541542,
0.022218851372599602,
-0.08542788773775101,
-0.06882388144731522,
0.10406654328107834,
-0.00604093074798584,
-0.00891837477684021,
-0.048715148121118546,
-0.05680207908153534,
0.04780780151486397,
0.024101072922348976,
0.08518540114164352,
-0.03116738796234131,
-0.01561395451426506,
0.04902373254299164,
-0.04088957607746124,
-0.026916710659861565,
0.037194784730672836,
-0.0022973075974732637,
-0.008486482314765453,
-0.1053551658987999,
0.10396222770214081,
0.07466118782758713,
0.007260879501700401,
0.008329415693879128,
-0.0019372928654775023,
-0.0014828592538833618,
0.08712863177061081,
-0.012710648588836193,
-0.016386311501264572,
0.09409484267234802,
-0.12138151377439499,
0.08940840512514114,
0.13425859808921814,
-0.13529199361801147,
-0.035638824105262756,
0.004499721806496382,
0.015129548497498035,
-0.04138721525669098,
0.04979800060391426,
-0.06423907727003098,
0.003949703648686409,
-0.09370219707489014,
0.01531191822141409,
0.001151171512901783,
0.05711620673537254,
0.015383917838335037,
-0.04533868655562401,
0.026748834177851677,
0.04112517833709717,
0.19601860642433167,
0.12396671622991562,
0.1436496376991272,
0.03051433153450489,
-0.013482102192938328,
0.05641106516122818,
-0.020409787073731422,
0.0005393693572841585,
0.09086980670690536,
0.02586396224796772,
-0.08774241805076599,
0.002094178693369031,
-0.04045465216040611,
-0.029123179614543915,
0.049496497958898544,
0.0749470517039299,
0.0829121395945549,
0.08120140433311462,
0.05991365760564804,
0.006844568066298962,
0.002601834014058113,
0.025033408775925636,
0.135587677359581,
0.06081909313797951,
0.006493604276329279,
0.023624954745173454,
0.16853350400924683,
-0.06875146180391312,
0.05326754227280617,
-0.01475052535533905,
-0.036118283867836,
-0.12436142563819885,
-0.24202609062194824,
-0.046845365315675735,
-0.05224374681711197,
-0.010516498237848282,
-0.07291069626808167,
0.05452345311641693,
0.04369613900780678,
0.040106430649757385,
-0.046044863760471344,
0.03250967711210251,
-0.0866844579577446,
-0.16222339868545532,
0.010658455081284046,
-0.0014809654094278812,
0.13319669663906097,
-0.12002329528331757,
0.026746569201350212,
0.0058965180069208145,
0.09632701426744461,
0.0023413291200995445,
0.0931265577673912,
0.014620393514633179,
-0.06595020741224289,
-0.04781737178564072,
-0.06369147449731827,
-0.021384185180068016,
-0.04924296960234642,
-0.0004688029002863914,
0.2191651314496994,
0.019626665860414505,
-0.054849207401275635,
0.0026384482625871897,
0.08213553577661514,
-0.023478897288441658,
-0.09192086011171341,
-0.10043222457170486,
0.37137097120285034,
0.004595272708684206,
0.11498818546533585,
0.009000907652080059,
-0.09020140767097473,
0.051249612122774124,
0.23208574950695038,
0.11996195465326309,
-0.020288582891225815,
-0.03289830684661865,
0.04060151427984238,
0.01639043167233467,
0.05569426715373993,
0.031051771715283394,
-0.052555784583091736,
0.3061429262161255,
-0.09712154418230057,
0.13605976104736328,
-0.06546277552843094,
0.06352241337299347,
-0.05659068003296852,
0.06500881165266037,
0.0529211163520813,
-0.006256550550460815,
-0.019655101001262665,
0.14528094232082367,
-0.05864386633038521,
-0.20534177124500275,
0.10957872122526169,
-0.05213668569922447,
-0.10401617735624313,
-0.011150532402098179,
-0.029938654974102974,
0.036382947117090225,
0.1793406754732132,
0.0007595326169393957,
-0.11715235561132431,
0.08967303484678268,
0.03501158207654953,
-0.11744176596403122,
-0.08942335098981857,
0.05376804992556572,
0.05137157440185547,
0.18153820931911469,
0.02470386028289795,
0.11170846968889236,
0.07033147662878036,
0.009502273052930832,
-0.014094200916588306,
0.15360020101070404,
0.0023776604793965816,
0.015603635460138321,
0.02408873289823532,
-0.0938357412815094,
0.040812041610479355,
0.0024230005219578743,
0.0695689395070076,
-0.16273050010204315,
0.060713306069374084,
0.02481057494878769,
-0.04792967811226845,
-0.07662845402956009,
0.09356958419084549,
-0.11261162161827087,
0.09386204928159714,
0.13433904945850372,
0.012719166465103626,
-0.0009206600952893496,
-0.07422412186861038,
0.057504069060087204,
0.025890709832310677,
0.08219980448484421,
-0.002169364830479026,
-0.16167894005775452,
0.0049464888870716095,
0.02134600467979908,
-0.003105223411694169,
-0.21574434638023376,
0.00931935478001833,
0.029265373945236206,
0.016464414075016975,
0.026642806828022003,
-0.05212729796767235,
-0.04667087644338608,
0.061177823692560196,
-0.041013311594724655,
-0.13977302610874176,
-0.0015833168290555477,
0.12910526990890503,
-0.03853694722056389,
0.006817859597504139
] |
null | null | transformers |
# Prompsit/paraphrase-roberta-es
This model allows you to evaluate whether one phrase is a paraphrase of another.
We have fine-tuned this model from the pretrained "PlanTL-GOB-ES/roberta-base-bne" model.
The model was built under the TSI-100905-2019-4 project, co-financed by the Ministry of Economic Affairs and Digital Transformation of the Government of Spain.
# How to use it
The model answers the following question: is "phrase B" a paraphrase of "phrase A"?
Please note that it works on phrases rather than full sentences, so the model does not expect punctuation marks or long pieces of text.
The resulting probabilities correspond to the following classes:
* 0: Not a paraphrase
* 1: It's a paraphrase
So, considering the phrase "se buscarán acuerdos" and a candidate paraphrase like "se deberá obtener el acuerdo", you can use the model like this:
```
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("Prompsit/paraphrase-roberta-es")
model = AutoModelForSequenceClassification.from_pretrained("Prompsit/paraphrase-roberta-es")
input = tokenizer('se buscarán acuerdos','se deberá obtener el acuerdo',return_tensors='pt')
logits = model(**input).logits
soft = torch.nn.Softmax(dim=1)
print(soft(logits))
```
Code output is:
```
tensor([[0.2266, 0.7734]], grad_fn=<SoftmaxBackward>)
```
As the probability of class 1 (it is a paraphrase) is 0.77 and the probability of class 0 (it is not a paraphrase) is 0.22, we can conclude that, for our previous example, "se deberá obtener el acuerdo" is a paraphrase of "se buscarán acuerdos".
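Several candidate pairs can also be scored in one batch; the snippet below is only a sketch that reuses the `tokenizer` and `model` objects from the example above (the second pair is an invented phrase, used purely for illustration):
```
import torch

# Sketch: score a small batch of (phrase A, candidate paraphrase) pairs at once.
pairs = [
    ("se buscarán acuerdos", "se deberá obtener el acuerdo"),
    ("se buscarán acuerdos", "no habrá ningún acuerdo"),  # invented pair, illustration only
]
batch = tokenizer([a for a, _ in pairs], [b for _, b in pairs],
                  padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    probs = torch.softmax(model(**batch).logits, dim=1)
print(probs[:, 1])  # probability that each candidate is a paraphrase
```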
# Evaluation results
We used a test dataset of 16,500 human-tagged phrase pairs.
The metrics obtained are:
```
metrics={
'test_loss': 0.4869941473007202,
'test_accuracy': 0.8003636363636364,
'test_precision': 0.6692456479690522,
'test_recall': 0.5896889646357052,
'test_f1': 0.6269535673839184,
'test_matthews_correlation': 0.49324489316659575,
'test_runtime': 27.1537,
'test_samples_per_second': 607.652,
'test_steps_per_second': 19.003
}
``` | {"language": "es", "tags": ["transformers"], "pipeline_tag": "text-classification", "inference": false} | text-classification | Prompsit/paraphrase-roberta-es | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"es",
"autotrain_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #roberta #text-classification #es #autotrain_compatible #region-us
|
# Prompsit/paraphrase-roberta-es
This model allows you to evaluate whether one phrase is a paraphrase of another.
We have fine-tuned this model from the pretrained "PlanTL-GOB-ES/roberta-base-bne" model.
The model was built under the TSI-100905-2019-4 project, co-financed by the Ministry of Economic Affairs and Digital Transformation of the Government of Spain.
# How to use it
The model answers the following question: is "phrase B" a paraphrase of "phrase A"?
Please note that it works on phrases rather than full sentences, so the model does not expect punctuation marks or long pieces of text.
The resulting probabilities correspond to the following classes:
* 0: Not a paraphrase
* 1: It's a paraphrase
So, considering the phrase "se buscarán acuerdos" and a candidate paraphrase like "se deberá obtener el acuerdo", you can use the model like this:
Code output is:
As the probability of class 1 (it is a paraphrase) is 0.77 and the probability of class 0 (it is not a paraphrase) is 0.22, we can conclude that, for our previous example, "se deberá obtener el acuerdo" is a paraphrase of "se buscarán acuerdos".
# Evaluation results
We used a test dataset of 16,500 human-tagged phrase pairs.
The metrics obtained are:
| [
"# Prompsit/paraphrase-roberta-es\n\nThis model allows to evaluate paraphrases for a given phrase. \n\nWe have fine-tuned this model from pretrained \"PlanTL-GOB-ES/roberta-base-bne\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.",
"# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\n\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n\n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\nSo, considering the phrase \"se buscarán acuerdos\" and a candidate paraphrase like \"se deberá obtener el acuerdo\", you can use the model like this:\n\n\n\nCode output is:\n\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.77 and the probability of 0 (=It is not a paraphrase) is 0.22, we can conclude, for our previous example, that \"se deberá obtener el acuerdo\" is a paraphrase of \"se buscarán acuerdos\".",
"# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \nMetrics obtained are:"
] | [
"TAGS\n#transformers #pytorch #roberta #text-classification #es #autotrain_compatible #region-us \n",
"# Prompsit/paraphrase-roberta-es\n\nThis model allows to evaluate paraphrases for a given phrase. \n\nWe have fine-tuned this model from pretrained \"PlanTL-GOB-ES/roberta-base-bne\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.",
"# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\n\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n\n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\nSo, considering the phrase \"se buscarán acuerdos\" and a candidate paraphrase like \"se deberá obtener el acuerdo\", you can use the model like this:\n\n\n\nCode output is:\n\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.77 and the probability of 0 (=It is not a paraphrase) is 0.22, we can conclude, for our previous example, that \"se deberá obtener el acuerdo\" is a paraphrase of \"se buscarán acuerdos\".",
"# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \nMetrics obtained are:"
] | [
31,
88,
201,
28
] | [
"passage: TAGS\n#transformers #pytorch #roberta #text-classification #es #autotrain_compatible #region-us \n# Prompsit/paraphrase-roberta-es\n\nThis model allows to evaluate paraphrases for a given phrase. \n\nWe have fine-tuned this model from pretrained \"PlanTL-GOB-ES/roberta-base-bne\".\n\nModel built under a TSI-100905-2019-4 project, co-financed by Ministry of Economic Affairs and Digital Transformation from the Government of Spain.# How to use it\n\nThe model answer the following question: Is \"phrase B\" a paraphrase of \"phrase A\".\n\nPlease note that we're considering phrases instead of sentences. Therefore, we must take into account that the model doesn't expect to find punctuation marks or long pieces of text.\n\nResulting probabilities correspond to classes: \n\n* 0: Not a paraphrase\n* 1: It's a paraphrase\n\nSo, considering the phrase \"se buscarán acuerdos\" and a candidate paraphrase like \"se deberá obtener el acuerdo\", you can use the model like this:\n\n\n\nCode output is:\n\n \n\nAs the probability of 1 (=It's a paraphrase) is 0.77 and the probability of 0 (=It is not a paraphrase) is 0.22, we can conclude, for our previous example, that \"se deberá obtener el acuerdo\" is a paraphrase of \"se buscarán acuerdos\".# Evaluation results\n\nWe have used as test dataset 16500 pairs of phrases human tagged. \nMetrics obtained are:"
] | [
-0.010637336410582066,
-0.057228293269872665,
-0.008321014232933521,
-0.009111937135457993,
0.06882040202617645,
-0.06443358212709427,
0.03889203444123268,
0.03163451701402664,
-0.03966661915183067,
0.0904393419623375,
0.0967886671423912,
0.13922695815563202,
0.027838928624987602,
-0.10529868304729462,
-0.004862761124968529,
-0.2707289457321167,
0.049561310559511185,
-0.05845019966363907,
0.09373161941766739,
0.13608179986476898,
0.0903039500117302,
-0.049363791942596436,
0.047598179429769516,
0.01864434964954853,
0.002534519648179412,
0.06042391434311867,
-0.017677342519164085,
-0.0420813225209713,
0.09140879660844803,
0.06153218448162079,
0.06956517696380615,
0.005621706135571003,
-0.020614510402083397,
-0.1358124166727066,
-0.021710209548473358,
0.00005004182457923889,
0.024322090670466423,
-0.009796199388802052,
0.006112551782280207,
-0.06049874797463417,
0.1321558654308319,
0.038142330944538116,
0.0324440561234951,
0.011770939454436302,
-0.1966223269701004,
-0.02712600864470005,
-0.030967766419053078,
-0.07343662530183792,
0.11683253943920135,
0.06786859780550003,
-0.07284942269325256,
0.11593098938465118,
-0.08902154117822647,
-0.03769956901669502,
0.06779055297374725,
-0.234435573220253,
-0.025480616837739944,
-0.08814363926649094,
0.08251850306987762,
0.1297110915184021,
0.027911869809031487,
0.02065448649227619,
0.07404529303312302,
-0.013660996221005917,
-0.06796100735664368,
-0.023387068882584572,
0.14017672836780548,
-0.03256702795624733,
-0.14762844145298004,
-0.10546185821294785,
0.2835982143878937,
0.013958174735307693,
-0.05722206458449364,
-0.033359114080667496,
-0.0449407622218132,
0.06461397558450699,
0.04056483879685402,
-0.0952715203166008,
-0.014715242199599743,
0.03409213200211525,
0.1346917301416397,
-0.01842852123081684,
-0.11949711292982101,
-0.03880617395043373,
-0.1512472778558731,
0.2089357227087021,
0.0012917225249111652,
0.01796913892030716,
-0.01749260723590851,
0.06052090972661972,
-0.13078126311302185,
-0.04651157557964325,
-0.003021189011633396,
-0.02975289151072502,
-0.01085573434829712,
0.03604857251048088,
-0.0963885560631752,
-0.12323508411645889,
-0.004070373252034187,
0.1339603215456009,
-0.015043225139379501,
-0.03618740662932396,
0.01074270810931921,
0.10524690896272659,
0.10059306770563126,
0.08515176177024841,
-0.052945613861083984,
-0.06886276602745056,
-0.08079773932695389,
-0.030030904337763786,
0.08110668510198593,
0.006547600030899048,
-0.07179026305675507,
-0.019587062299251556,
0.02815323881804943,
0.005021146032959223,
0.038899920880794525,
0.044453009963035583,
-0.10751405358314514,
-0.002331019612029195,
0.028840214014053345,
-0.05714866891503334,
-0.020491959527134895,
-0.03743563964962959,
-0.06564989686012268,
0.05777181684970856,
-0.00867027323693037,
0.052846334874629974,
-0.05458977445960045,
0.011766194365918636,
-0.07440857589244843,
-0.051584385335445404,
-0.07484498620033264,
-0.08061735332012177,
0.003926918376237154,
-0.052678681910037994,
-0.009409457445144653,
-0.09531908482313156,
-0.056548114866018295,
-0.08251011371612549,
-0.0022023089695721865,
-0.10007164627313614,
0.010752426460385323,
-0.06197802722454071,
0.021258069202303886,
-0.06314951926469803,
-0.028891978785395622,
-0.057906683534383774,
-0.02376849763095379,
0.08482032269239426,
0.034256406128406525,
0.03715215623378754,
-0.07879588007926941,
-0.0025192273315042257,
-0.1592477411031723,
0.018545614555478096,
-0.2060975581407547,
0.12401227653026581,
-0.001275848364457488,
0.025418035686016083,
-0.14283818006515503,
0.009600748308002949,
-0.13116146624088287,
0.08721168339252472,
-0.02086157351732254,
0.13837480545043945,
-0.14455066621303558,
-0.060147907584905624,
0.20254768431186676,
-0.156849205493927,
-0.03526011481881142,
0.09565775841474533,
-0.00021888213814236224,
0.13662022352218628,
0.08241467922925949,
0.15992192924022675,
-0.047007933259010315,
-0.06105893477797508,
0.034641142934560776,
-0.038982465863227844,
-0.026009894907474518,
0.060526229441165924,
0.13261140882968903,
-0.15159210562705994,
-0.08970761299133301,
0.011148273013532162,
-0.08489850163459778,
-0.08470872044563293,
-0.06544604897499084,
-0.02259676530957222,
0.03382138907909393,
0.029972927644848824,
-0.010059919208288193,
0.020734257996082306,
-0.0080534303560853,
-0.043814945966005325,
-0.12152164429426193,
0.11851660907268524,
-0.02889937348663807,
-0.006887270603328943,
0.01343007292598486,
-0.07810573279857635,
0.1197674348950386,
-0.08357587456703186,
-0.04840762913227081,
-0.14420439302921295,
0.05608999356627464,
-0.019922852516174316,
0.051079630851745605,
0.09221357852220535,
0.14177857339382172,
-0.0053129615262150764,
-0.003674023551866412,
0.03348173573613167,
0.04754073917865753,
0.01859617978334427,
0.017445949837565422,
-0.041906438767910004,
-0.09831029921770096,
0.04182032495737076,
-0.019429512321949005,
0.20345516502857208,
0.004931719973683357,
0.013569783419370651,
0.01705954410135746,
0.030884893611073494,
-0.03011239878833294,
-0.03851110860705376,
-0.07346542924642563,
0.06903637200593948,
0.003460158593952656,
0.07324780523777008,
0.09445098042488098,
0.021319236606359482,
-0.09667953848838806,
0.0663810670375824,
-0.16872499883174896,
-0.11368167400360107,
0.09234166145324707,
-0.12020862847566605,
-0.06545276194810867,
-0.12107200920581818,
-0.05875058472156525,
0.007416145410388708,
0.02176724374294281,
-0.12026126682758331,
0.17983156442642212,
0.04012751206755638,
0.08872461318969727,
-0.14364558458328247,
0.0228609386831522,
0.03038681112229824,
-0.15208342671394348,
-0.024832604452967644,
0.06629551202058792,
0.08233924955129623,
-0.13996069133281708,
0.07278399169445038,
0.06526195257902145,
-0.014825322665274143,
0.11924633383750916,
0.01793566346168518,
-0.0859464481472969,
-0.06810513138771057,
0.06108672916889191,
-0.002137337811291218,
0.005963141098618507,
-0.2535901367664337,
0.0028492629062384367,
0.024944595992565155,
-0.001282517216168344,
0.051044318825006485,
-0.10168623924255371,
0.017424944788217545,
0.025516467168927193,
0.02170671336352825,
0.06353801488876343,
-0.0207325778901577,
-0.024350035935640335,
0.05832233279943466,
0.04425114020705223,
-0.07248557358980179,
0.0035130472388118505,
-0.01685050129890442,
-0.1378757357597351,
0.12127523124217987,
-0.06742427498102188,
-0.1507633626461029,
-0.1257314234972,
0.13743950426578522,
-0.0262721236795187,
0.036995779722929,
0.07556440681219101,
0.008666987530887127,
-0.0020713978447020054,
-0.04377308115363121,
0.14328345656394958,
-0.032817043364048004,
-0.0554080605506897,
-0.037359818816185,
-0.0172415841370821,
0.011810233816504478,
-0.04456852376461029,
-0.0535866878926754,
-0.08678381890058517,
-0.13714000582695007,
0.02266949601471424,
-0.16551847755908966,
0.07758291810750961,
0.13421103358268738,
0.016279900446534157,
-0.0064324247650802135,
-0.01927858404815197,
0.26702794432640076,
-0.0602136105298996,
-0.014489036053419113,
0.11840123683214188,
0.013788765296339989,
0.044555697590112686,
0.1415531188249588,
0.007025591563433409,
-0.06078185886144638,
0.004900712985545397,
0.07140636444091797,
-0.04308953508734703,
-0.1749810129404068,
-0.11970914155244827,
0.013198688626289368,
-0.12542134523391724,
0.030337195843458176,
-0.01641698367893696,
0.04816097393631935,
0.12357581406831741,
-0.028378671035170555,
-0.07485378533601761,
0.00412960397079587,
0.06105365604162216,
0.18758010864257812,
-0.0668846070766449,
0.10532526671886444,
-0.004564313217997551,
-0.13666453957557678,
0.07745616137981415,
-0.002259967615827918,
0.0888361930847168,
0.1236434206366539,
-0.01629049889743328,
0.14830131828784943,
0.025464994832873344,
-0.023476414382457733,
-0.01757475920021534,
-0.02880740724503994,
0.021283017471432686,
-0.13869458436965942,
-0.0728844478726387,
-0.101908378303051,
0.06698029488325119,
0.04536707326769829,
0.02028643898665905,
-0.07417032867670059,
-0.05698424205183983,
0.09292612224817276,
0.12813371419906616,
0.07788112014532089,
-0.149499773979187,
-0.009294182993471622,
0.01853303797543049,
-0.04754350334405899,
-0.0518265962600708,
0.038881704211235046,
-0.03209660202264786,
-0.051851511001586914,
0.061238229274749756,
0.008682006038725376,
0.05753736197948456,
-0.004903810098767281,
0.09762798994779587,
-0.11242825537919998,
-0.05814504623413086,
0.009181156754493713,
0.09316191077232361,
-0.16752785444259644,
0.25383323431015015,
0.005718851462006569,
0.009277728386223316,
-0.0625544860959053,
-0.014155098237097263,
-0.1295447200536728,
0.06437516957521439,
0.2047160267829895,
-0.0037304952275007963,
0.1371728479862213,
0.04761470854282379,
-0.06673279404640198,
0.021882493048906326,
0.08186347037553787,
-0.12977412343025208,
0.10754051059484482,
-0.02141900174319744,
0.0825115367770195,
-0.03206745162606239,
0.1301051676273346,
-0.03148404136300087,
-0.15957151353359222,
0.04333024099469185,
-0.1038496270775795,
0.01139097847044468,
0.008099813014268875,
0.004830069839954376,
-0.002685145940631628,
0.11239851266145706,
-0.18891172111034393,
-0.07859206944704056,
-0.04007549211382866,
-0.006962544750422239,
0.08452341705560684,
-0.026495467871427536,
-0.029916912317276,
-0.04622715339064598,
0.019685186445713043,
-0.04398711398243904,
0.04996512085199356,
0.0914212241768837,
-0.057635821402072906,
-0.07038330286741257,
-0.07640872150659561,
0.10940859466791153,
0.04227619990706444,
0.028163885697722435,
0.04397235065698624,
0.04901943355798721,
-0.023152299225330353,
-0.10303731262683868,
-0.048551395535469055,
0.03944450989365578,
0.09007880091667175,
0.00470240693539381,
-0.02738308347761631,
0.029501138255000114,
-0.10522926598787308,
-0.06666377186775208,
0.07229722291231155,
0.2205224186182022,
-0.056295957416296005,
0.02070648781955242,
0.22634372115135193,
-0.09460458159446716,
-0.1786976307630539,
-0.1444513201713562,
0.0962664857506752,
-0.020513810217380524,
0.02897609956562519,
-0.08106278628110886,
0.004150274209678173,
0.2193204015493393,
-0.026778584346175194,
-0.22465895116329193,
-0.3850376009941101,
-0.08019857853651047,
0.09862104058265686,
-0.04183628037571907,
0.24758058786392212,
-0.12601737678050995,
-0.12921550869941711,
-0.02370966598391533,
0.03489263355731964,
0.12974663078784943,
-0.004719601944088936,
0.05696842819452286,
0.025046326220035553,
0.04450276121497154,
0.05180114880204201,
0.01871321350336075,
0.21489080786705017,
-0.003467068774625659,
0.005037233699113131,
-0.03639301285147667,
-0.09114447236061096,
-0.00870322622358799,
-0.0128521379083395,
0.052310243248939514,
0.05796843394637108,
-0.013407272286713123,
-0.12821435928344727,
-0.07199794799089432,
-0.06510544568300247,
0.03346380963921547,
-0.043358124792575836,
-0.023789454251527786,
-0.0496567040681839,
-0.0074826437048614025,
0.0035539937671273947,
-0.011533019132912159,
0.06647565960884094,
-0.15607044100761414,
0.08735044300556183,
0.1864290088415146,
0.13670678436756134,
0.011477246880531311,
-0.053542472422122955,
0.03882243111729622,
0.007334825582802296,
0.13504312932491302,
-0.010298791341483593,
0.005605415906757116,
0.09241067618131638,
-0.033812474459409714,
0.09978076815605164,
0.013395861722528934,
-0.15668536722660065,
0.01638517901301384,
0.08153483271598816,
-0.07156595587730408,
-0.12915290892124176,
-0.008262837305665016,
0.08004453778266907,
0.015004544518887997,
-0.013677260838449001,
0.12945674359798431,
-0.029708636924624443,
-0.00020053330808877945,
-0.005568124353885651,
0.0043658181093633175,
-0.06282997876405716,
0.07280764728784561,
0.007120553869754076,
0.0021007407922297716,
-0.05177612602710724,
0.08477804809808731,
0.07170023024082184,
-0.08670562505722046,
0.048170436173677444,
-0.06281045079231262,
0.005543546285480261,
-0.04978067800402641,
-0.17979317903518677,
0.06250602751970291,
-0.23694129288196564,
-0.0912255048751831,
-0.03641197085380554,
-0.0855506956577301,
-0.013463184237480164,
0.10245212912559509,
0.042774077504873276,
0.046536196023225784,
-0.004840597975999117,
-0.007784816902130842,
-0.04249582067131996,
-0.024238212034106255,
0.05477053299546242,
-0.06384439021348953,
0.04529759660363197,
0.06749748438596725,
0.03243282809853554,
-0.05411011353135109,
-0.035621657967567444,
-0.05644392967224121,
-0.12626756727695465,
0.05050347000360489,
-0.05397353321313858,
0.03303657844662666,
-0.09050621092319489,
-0.06805294752120972,
0.09210235625505447,
-0.008131394162774086,
-0.02395566552877426,
-0.06400445848703384,
-0.03397635743021965,
0.06656980514526367,
0.01481291651725769,
0.09263216704130173,
-0.07418859750032425,
0.012885266914963722,
0.07586567848920822,
-0.034544702619314194,
-0.004974171984940767,
0.013490712270140648,
0.010621131397783756,
0.04217368736863136,
-0.0824374333024025,
0.10545287281274796,
0.07718949019908905,
0.01709534414112568,
-0.014927830547094345,
0.020269794389605522,
-0.012486884370446205,
0.07129526883363724,
-0.006749284453690052,
-0.02371896617114544,
0.0574321411550045,
-0.11823101341724396,
0.10081779211759567,
0.15485143661499023,
-0.14670060575008392,
-0.07034242153167725,
0.022149087861180305,
0.006755468435585499,
-0.002832959406077862,
0.0823054239153862,
-0.06995450705289841,
-0.022752702236175537,
-0.05465083196759224,
0.03238794580101967,
0.007719780318439007,
0.07673396915197372,
0.01152451429516077,
-0.06633477658033371,
0.001957920379936695,
0.02265884168446064,
0.21608290076255798,
0.09679294377565384,
0.16546350717544556,
0.026297984644770622,
-0.005716457962989807,
0.05811229720711708,
-0.0004131777095608413,
0.03684411197900772,
0.10798471421003342,
0.01483676116913557,
-0.1127711832523346,
0.021475262939929962,
-0.055788811296224594,
-0.05436856672167778,
0.007575798314064741,
0.08129420876502991,
0.03384588286280632,
0.08281148225069046,
0.03248192369937897,
0.009041791781783104,
0.018239526078104973,
0.01303941011428833,
0.11363878846168518,
0.008112669922411442,
0.011133330874145031,
0.0741940513253212,
0.1971997320652008,
-0.06374964118003845,
0.07243996113538742,
0.020211966708302498,
-0.03364257141947746,
-0.15229110419750214,
-0.19119907915592194,
-0.04604724794626236,
-0.0667293593287468,
-0.008838354609906673,
-0.07226673513650894,
0.03682563081383705,
0.05808905139565468,
0.006036720704287291,
-0.05066176876425743,
0.0010455903830006719,
-0.0042831776663661,
-0.16715072095394135,
-0.036564864218235016,
0.003739212639629841,
0.14549580216407776,
-0.09528028964996338,
0.02547052688896656,
0.042644694447517395,
0.15751853585243225,
0.004724110476672649,
0.10792747139930725,
0.018098236992955208,
-0.06030801311135292,
-0.02830645628273487,
-0.05607660859823227,
-0.029497208073735237,
-0.014207852073013783,
-0.011967149563133717,
0.13911497592926025,
0.012694678269326687,
-0.061133697628974915,
-0.00468837795779109,
0.15187014639377594,
-0.04817579686641693,
-0.09438776969909668,
-0.1028921827673912,
0.33915311098098755,
0.030291007831692696,
0.15634559094905853,
-0.009060265496373177,
-0.12105906009674072,
0.01813049241900444,
0.195957750082016,
0.11276769638061523,
-0.01845547743141651,
-0.009831266477704048,
0.047354504466056824,
0.026920052245259285,
0.061072949320077896,
0.03247428312897682,
-0.06800594180822372,
0.32738739252090454,
-0.07282264530658722,
0.12809047102928162,
-0.06936519593000412,
0.05662995204329491,
-0.03857725113630295,
0.11546416580677032,
0.015589850954711437,
-0.011518290266394615,
-0.05517134815454483,
0.13057661056518555,
-0.05462789535522461,
-0.20675073564052582,
0.0718248188495636,
-0.044461026787757874,
-0.08518671244382858,
-0.0015760608948767185,
0.05094766244292259,
0.03609400615096092,
0.1396019458770752,
0.0058069792576134205,
-0.11496739089488983,
0.06927450001239777,
0.04793142154812813,
-0.1102958396077156,
-0.06387648731470108,
0.07144223898649216,
0.07089848071336746,
0.14748987555503845,
0.016152674332261086,
0.08662020415067673,
0.0577884167432785,
0.028838682919740677,
-0.02083020657300949,
0.14592434465885162,
0.01621328666806221,
0.035267457365989685,
0.03831678256392479,
-0.08216279000043869,
0.0494840107858181,
0.03042743355035782,
0.07819075137376785,
-0.15440146625041962,
0.0661095380783081,
0.072450190782547,
-0.06619486957788467,
-0.08799395710229874,
0.06957029551267624,
-0.1001875177025795,
0.08577389270067215,
0.09439494460821152,
0.009206395596265793,
0.0037944717332720757,
-0.0609271302819252,
0.01680765300989151,
0.01025361381471157,
0.11639764159917831,
-0.005660958122462034,
-0.14406734704971313,
-0.008348940871655941,
0.0969134047627449,
0.010472671128809452,
-0.20860522985458374,
0.026403553783893585,
0.06570529192686081,
0.048785749822854996,
0.05347221717238426,
-0.015689389780163765,
-0.026711799204349518,
0.05802266299724579,
-0.03886778652667999,
-0.14697211980819702,
0.012089064344763756,
0.12379004806280136,
-0.05636567994952202,
-0.009622541256248951
] |
null | null | transformers |
FinBERT is a pre-trained NLP model to analyze sentiment of financial text. It is built by further training the BERT language model in the finance domain, using a large financial corpus and thereby fine-tuning it for financial sentiment classification. [Financial PhraseBank](https://www.researchgate.net/publication/251231107_Good_Debt_or_Bad_Debt_Detecting_Semantic_Orientations_in_Economic_Texts) by Malo et al. (2014) is used for fine-tuning. For more details, please see the paper [FinBERT: Financial Sentiment Analysis with Pre-trained Language Models](https://arxiv.org/abs/1908.10063) and our related [blog post](https://medium.com/prosus-ai-tech-blog/finbert-financial-sentiment-analysis-with-bert-b277a3607101) on Medium.
The model will give softmax outputs for three labels: positive, negative or neutral.
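A minimal usage sketch, assuming the Hugging Face Transformers `pipeline` API (the example sentence is the one from the card's widget):
```
from transformers import pipeline

# Sketch: load FinBERT as a text-classification pipeline and score one sentence.
classifier = pipeline("text-classification", model="ProsusAI/finbert")
print(classifier("Stocks rallied and the British pound gained."))
# The output should be a list with one dict holding the predicted label
# ("positive", "negative" or "neutral") and its softmax score.
```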
---
About Prosus
Prosus is a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities. For more information, please visit www.prosus.com.
Contact information
Please contact Dogu Araci dogu.araci[at]prosus[dot]com and Zulkuf Genc zulkuf.genc[at]prosus[dot]com about any FinBERT-related issues and questions.
| {"language": "en", "tags": ["financial-sentiment-analysis", "sentiment-analysis"], "widget": [{"text": "Stocks rallied and the British pound gained."}]} | text-classification | ProsusAI/finbert | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"text-classification",
"financial-sentiment-analysis",
"sentiment-analysis",
"en",
"arxiv:1908.10063",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1908.10063"
] | [
"en"
] | TAGS
#transformers #pytorch #tf #jax #bert #text-classification #financial-sentiment-analysis #sentiment-analysis #en #arxiv-1908.10063 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
FinBERT is a pre-trained NLP model to analyze sentiment of financial text. It is built by further training the BERT language model in the finance domain, using a large financial corpus and thereby fine-tuning it for financial sentiment classification. Financial PhraseBank by Malo et al. (2014) is used for fine-tuning. For more details, please see the paper FinBERT: Financial Sentiment Analysis with Pre-trained Language Models and our related blog post on Medium.
The model will give softmax outputs for three labels: positive, negative or neutral.
---
About Prosus
Prosus is a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities. For more information, please visit URL.
Contact information
Please contact Dogu Araci URL[at]prosus[dot]com and Zulkuf Genc URL[at]prosus[dot]com about any FinBERT related issues and questions.
| [] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #text-classification #financial-sentiment-analysis #sentiment-analysis #en #arxiv-1908.10063 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] | [
71
] | [
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #text-classification #financial-sentiment-analysis #sentiment-analysis #en #arxiv-1908.10063 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] | [
-0.05335650593042374,
0.12336672097444534,
-0.005805745255202055,
0.0520545095205307,
0.11171729862689972,
0.024266839027404785,
0.004342292435467243,
0.13462914526462555,
0.09922128170728683,
0.03977293521165848,
0.07377824932336807,
0.15790002048015594,
0.0035466377157717943,
0.10885810852050781,
-0.09179454296827316,
-0.2615780830383301,
-0.0114156324416399,
0.07376503944396973,
0.09237530827522278,
0.11319588869810104,
0.10532955080270767,
-0.12068317085504532,
0.09888362884521484,
0.0022287347819656134,
-0.08130519092082977,
0.011894738301634789,
0.06711094081401825,
-0.08504164963960648,
0.16935940086841583,
0.06453580409288406,
0.14582690596580505,
0.07270507514476776,
-0.01525961048901081,
-0.11814731359481812,
0.06610456854104996,
-0.031026821583509445,
-0.08698631078004837,
0.04478798434138298,
0.014375272206962109,
-0.09670653939247131,
0.18724140524864197,
0.05034447833895683,
0.054437458515167236,
0.04026130586862564,
-0.1697334498167038,
-0.13575118780136108,
-0.0326361320912838,
0.07728897780179977,
-0.00272888271138072,
0.06372690945863724,
-0.02621840499341488,
0.15606598556041718,
-0.1135857030749321,
0.059959594160318375,
0.14153265953063965,
-0.21836033463478088,
-0.007894485257565975,
0.04948294162750244,
0.027280177921056747,
-0.0004909764975309372,
-0.126990407705307,
0.028926566243171692,
0.058597322553396225,
0.01502462662756443,
0.08462093770503998,
-0.07706527411937714,
-0.12433307617902756,
0.0623716339468956,
-0.11913856863975525,
-0.04378276318311691,
0.2638323903083801,
0.08459778875112534,
0.037301018834114075,
-0.04182479903101921,
-0.049553729593753815,
-0.09278488159179688,
0.016689011827111244,
-0.05779425799846649,
-0.0024484938476234674,
0.034613337367773056,
-0.0010673499200493097,
0.05898773670196533,
-0.12577024102210999,
0.14188946783542633,
-0.2294391691684723,
0.09515538811683655,
-0.07754848152399063,
0.027166036888957024,
-0.08341940492391586,
0.03240746632218361,
-0.0737200453877449,
-0.0927029550075531,
0.03183959797024727,
-0.0945415198802948,
0.08887278288602829,
-0.02527150698006153,
-0.06063374876976013,
0.05619806423783302,
-0.04499846696853638,
0.02455747313797474,
-0.02651049569249153,
-0.010049670934677124,
-0.012347160838544369,
0.074942946434021,
0.12025173008441925,
0.13980357348918915,
-0.06645943224430084,
-0.10327935963869095,
-0.06756769865751266,
-0.014989868737757206,
0.005922358017414808,
-0.008038383908569813,
-0.10796140134334564,
0.04415423050522804,
0.026131127029657364,
0.017228659242391586,
-0.07512335479259491,
0.07061471790075302,
-0.1383472979068756,
0.027664288878440857,
-0.06408695131540298,
-0.026454616338014603,
0.023021617904305458,
-0.006793071515858173,
-0.020202120766043663,
0.12908823788166046,
-0.026601634919643402,
-0.0045128255151212215,
0.02302214689552784,
0.12491632997989655,
-0.04561636969447136,
0.0010773257818073034,
-0.03174168989062309,
-0.10038881748914719,
0.10074379295110703,
-0.0640157088637352,
0.10174691677093506,
-0.17687749862670898,
-0.029345152899622917,
-0.02229505591094494,
0.03423531353473663,
-0.04764345660805702,
-0.05394766479730606,
0.012179982848465443,
-0.031122706830501556,
0.06370712071657181,
-0.05194077640771866,
0.01458765659481287,
-0.09958735853433609,
0.028091412037611008,
-0.07509247213602066,
0.13356605172157288,
-0.0007845528889447451,
-0.001942576956935227,
-0.1251966953277588,
-0.06632508337497711,
-0.013726234436035156,
-0.025595461949706078,
-0.03500229865312576,
0.15381772816181183,
0.05584295839071274,
-0.0674595832824707,
-0.12067142128944397,
0.05058496445417404,
-0.10000213980674744,
0.13446079194545746,
-0.16158917546272278,
-0.06997401267290115,
0.09322705864906311,
-0.07343835383653641,
-0.08983701467514038,
0.10143373906612396,
-0.03814736008644104,
0.1398387849330902,
0.05722177401185036,
0.1936643123626709,
0.02226586639881134,
-0.09343820810317993,
-0.05015269294381142,
0.1510305553674698,
-0.06679532676935196,
0.04524639621376991,
0.01913836970925331,
0.0961570143699646,
-0.08619073778390884,
0.006930382922291756,
0.0849991887807846,
0.046358127146959305,
-0.08671446144580841,
-0.04173032194375992,
-0.006310067605227232,
-0.05317099392414093,
0.07450935244560242,
0.10370418429374695,
0.047198064625263214,
-0.1394108086824417,
-0.05173838511109352,
-0.0958588644862175,
0.021305426955223083,
0.09533528983592987,
-0.008259461261332035,
-0.07267836481332779,
0.167720764875412,
0.02083241194486618,
-0.007451099343597889,
-0.15444059669971466,
0.04490610212087631,
-0.06009015440940857,
0.10154640674591064,
0.03430064022541046,
0.2727605700492859,
0.03835083544254303,
-0.15035004913806915,
-0.06607012450695038,
0.02173340506851673,
0.1434246003627777,
0.04958656430244446,
0.02845512330532074,
-0.1549452245235443,
0.06155282258987427,
-0.08371389657258987,
0.023485371842980385,
-0.016427675262093544,
-0.013317731209099293,
0.18202881515026093,
0.0799931213259697,
-0.0020007321145385504,
0.08364124596118927,
0.01726001687347889,
0.0012848194455727935,
-0.051442909985780716,
0.0057410369627177715,
0.05396982282400131,
0.0005440246895886958,
-0.07368945330381393,
0.19868889451026917,
-0.13239161670207977,
0.3835761547088623,
0.18157416582107544,
-0.2713252305984497,
-0.0703602284193039,
0.042053867131471634,
-0.033208709210157394,
0.07508859783411026,
-0.008512545377016068,
0.04364936426281929,
0.05654425173997879,
-0.09455402940511703,
0.06541602313518524,
-0.0315202921628952,
-0.020674407482147217,
0.03193463757634163,
-0.015284805558621883,
-0.10163930058479309,
0.08523605018854141,
-0.07096032798290253,
-0.1624295562505722,
0.15360893309116364,
0.28923073410987854,
0.09762328118085861,
0.15227489173412323,
0.026517855003476143,
0.06542693078517914,
-0.021728912368416786,
-0.0865107849240303,
-0.09309092164039612,
0.0605216845870018,
-0.17283028364181519,
-0.09496300667524338,
0.07448436319828033,
-0.03207159414887428,
-0.03093758225440979,
-0.1368613839149475,
-0.09267528355121613,
0.06165206804871559,
0.04421766847372055,
-0.029596855863928795,
0.12636588513851166,
0.027075961232185364,
0.15233907103538513,
-0.006281923037022352,
-0.07395704835653305,
0.03778940066695213,
-0.005312344990670681,
-0.06832323968410492,
0.09300164878368378,
-0.10121050477027893,
-0.29097554087638855,
-0.026960009709000587,
-0.1373247653245926,
0.05884311720728874,
-0.01822807267308235,
0.08096032589673996,
-0.09219380468130112,
-0.03249837085604668,
0.0012136023724451661,
0.055027104914188385,
-0.09733159840106964,
-0.013069371692836285,
0.02294573374092579,
-0.028210775926709175,
-0.06155552715063095,
-0.0753195658326149,
-0.07804153114557266,
-0.056404292583465576,
0.04691968113183975,
0.0972767099738121,
-0.036180801689624786,
0.08357208967208862,
0.20428767800331116,
-0.026650788262486458,
0.02660883031785488,
-0.09412331879138947,
0.18291598558425903,
-0.11332860589027405,
0.0067121172323822975,
0.10338670760393143,
-0.009194230660796165,
0.07747890800237656,
0.19079630076885223,
0.0407906100153923,
-0.06775961816310883,
0.0036437909584492445,
-0.00625065341591835,
-0.08812937140464783,
-0.12366887181997299,
-0.07726554572582245,
-0.0933992862701416,
0.0836600512266159,
-0.015693331137299538,
0.05496879667043686,
0.09891758114099503,
0.04771745577454567,
0.03598945960402489,
-0.0988084003329277,
-0.06556841731071472,
0.05226856842637062,
0.22514192759990692,
-0.06812597066164017,
0.12389139831066132,
-0.019481219351291656,
-0.0795130506157875,
0.13832800090312958,
-0.06705277413129807,
-0.010491126216948032,
0.06912905722856522,
0.004415575414896011,
0.01261970866471529,
0.12229502201080322,
0.08288927376270294,
0.07387679815292358,
-0.025492608547210693,
-0.03608286753296852,
-0.0918249562382698,
0.005701770074665546,
-0.03479144722223282,
0.07286874204874039,
0.0626227855682373,
0.010120920836925507,
-0.11083921790122986,
-0.18067887425422668,
0.06799080222845078,
0.038715191185474396,
0.09547333419322968,
-0.13735301792621613,
-0.023604130372405052,
0.05199555307626724,
-0.035839129239320755,
-0.04413876682519913,
0.08473707735538483,
-0.05818575620651245,
-0.10178888589143753,
0.08771289885044098,
0.014454221352934837,
0.0798388198018074,
-0.0004394823336042464,
0.07675398886203766,
-0.12373878806829453,
-0.14285162091255188,
-0.016594400629401207,
0.056972771883010864,
-0.24922439455986023,
0.25350421667099,
-0.0024909288622438908,
-0.09564553201198578,
-0.06295343488454819,
-0.07650433480739594,
0.08022875338792801,
0.14688661694526672,
0.13708654046058655,
0.013786385767161846,
0.011011120863258839,
-0.06649937480688095,
0.09638337790966034,
-0.000750192382838577,
0.04165615141391754,
-0.029637830331921577,
-0.022209564223885536,
-0.02061099372804165,
0.0061569721437990665,
0.03575406223535538,
0.12274570763111115,
0.04984302446246147,
-0.11761132627725601,
0.07220964133739471,
-0.018471548333764076,
0.007993131875991821,
0.05003013834357262,
-0.10050924122333527,
-0.17781339585781097,
0.0897611528635025,
0.10999645292758942,
0.020987819880247116,
-0.11246765404939651,
-0.011414562352001667,
-0.010755052790045738,
-0.03447949141263962,
-0.005795856472104788,
-0.03555542975664139,
0.003368287580087781,
-0.01732117123901844,
-0.12994909286499023,
0.12224843353033066,
-0.0763794481754303,
-0.06236033886671066,
-0.06234735995531082,
0.1039397120475769,
-0.08691857010126114,
0.0896492674946785,
-0.034379471093416214,
0.020993314683437347,
-0.09781622886657715,
-0.05061495304107666,
0.07226899266242981,
-0.057030100375413895,
0.05402527004480362,
0.008243793621659279,
0.0022352805826812983,
0.004150568041950464,
-0.0422743558883667,
-0.08256614953279495,
0.150611013174057,
0.21332204341888428,
-0.09922395646572113,
0.08368386328220367,
0.09853053838014603,
-0.015423574484884739,
-0.28118962049484253,
-0.034212976694107056,
-0.18287302553653717,
0.017059065401554108,
0.10571414232254028,
-0.013795703649520874,
0.013306833803653717,
-0.08790247142314911,
-0.0659419447183609,
-0.033792223781347275,
-0.11964445561170578,
-0.0643489807844162,
0.20820438861846924,
-0.05630092695355415,
0.4001160264015198,
-0.12217975407838821,
-0.011933366768062115,
0.06332867592573166,
-0.12747910618782043,
0.190190851688385,
-0.05870352312922478,
0.04537790268659592,
0.020688286051154137,
0.16729505360126495,
0.029126450419425964,
0.0010124807013198733,
0.07875619828701019,
-0.052804309874773026,
0.025049515068531036,
-0.12243619561195374,
-0.17146843671798706,
0.05803618207573891,
-0.030870864167809486,
-0.007039572577923536,
-0.0010525258257985115,
-0.008619825355708599,
-0.15721535682678223,
-0.014982790686190128,
-0.14471229910850525,
0.05567016080021858,
0.021427245810627937,
-0.06594734638929367,
-0.1379295140504837,
0.08472400158643723,
0.06340967863798141,
-0.06209808960556984,
0.1501697599887848,
-0.07762035727500916,
0.13113971054553986,
0.16378788650035858,
0.20566649734973907,
-0.1408880650997162,
0.03181016445159912,
-0.0060035758651793,
-0.10512489080429077,
0.046286117285490036,
-0.17133845388889313,
0.011075147427618504,
0.09021534770727158,
0.0045099868439137936,
0.09748312830924988,
0.11050458252429962,
-0.008532867766916752,
-0.0034672936890274286,
0.11948508024215698,
-0.15437191724777222,
-0.0691814124584198,
-0.06594311445951462,
-0.07435479015111923,
0.01926923170685768,
-0.10783621668815613,
0.06676431000232697,
-0.023173511028289795,
-0.007813847623765469,
0.030113670974969864,
-0.04163780063390732,
-0.01576850190758705,
-0.021487029269337654,
0.033041391521692276,
0.005172002129256725,
-0.09587772935628891,
-0.013200475834310055,
0.004559692461043596,
-0.26799196004867554,
-0.0010909746633842587,
0.03866706043481827,
-0.08961720764636993,
-0.13899847865104675,
-0.1058153435587883,
0.16587097942829132,
-0.19359523057937622,
0.01679653860628605,
-0.019554853439331055,
-0.19261051714420319,
0.06904326379299164,
0.23005463182926178,
0.09364743530750275,
0.04701239988207817,
-0.07832442969083786,
-0.01678384654223919,
0.04555608704686165,
0.035815365612506866,
0.11018632352352142,
-0.018239004537463188,
-0.09159320592880249,
-0.014540745876729488,
-0.03036651574075222,
0.13754592835903168,
-0.10970117896795273,
-0.0248262919485569,
-0.12170949578285217,
-0.0170550886541605,
-0.10582160204648972,
-0.059890829026699066,
-0.041648752987384796,
-0.0169754009693861,
0.017154235392808914,
-0.0736783817410469,
-0.0824672281742096,
-0.09114102274179459,
-0.1289423555135727,
0.1048164889216423,
0.04965199530124664,
0.08712668716907501,
-0.0002225152711616829,
-0.04988940805196762,
0.060630809515714645,
0.005115840584039688,
0.07637399435043335,
0.059169951826334,
-0.03454830124974251,
0.12310465425252914,
-0.09208284318447113,
-0.03273056820034981,
0.13328345119953156,
-0.0337991900742054,
0.08022340387105942,
0.0822201669216156,
-0.016346633434295654,
0.021443026140332222,
0.001480120001360774,
0.03720554709434509,
-0.021715251728892326,
-0.053606413304805756,
0.055064670741558075,
0.10054220259189606,
-0.15372705459594727,
-0.027225539088249207,
-0.030652252957224846,
0.07518687844276428,
-0.028825387358665466,
0.12997762858867645,
-0.03386440500617027,
0.005013236775994301,
-0.10705174505710602,
0.004679610952734947,
-0.014263954013586044,
-0.12608373165130615,
-0.10930389165878296,
-0.10040194541215897,
0.032781653106212616,
-0.030248606577515602,
0.2793888449668884,
0.09639803320169449,
0.029214181005954742,
0.048397988080978394,
0.08170811086893082,
0.008676772005856037,
-0.021466894075274467,
0.06730328500270844,
0.04340789467096329,
-0.0861264020204544,
-0.11952808499336243,
0.05896807834506035,
0.011996859684586525,
-0.01574285887181759,
0.11371290683746338,
0.053989868611097336,
0.13522681593894958,
0.08580920845270157,
-0.07320129126310349,
0.020626122131943703,
0.007711111102253199,
-0.12705859541893005,
-0.09644607454538345,
0.06695573031902313,
-0.023302460089325905,
0.14608503878116608,
0.1658836305141449,
0.007583468686789274,
0.1223444864153862,
-0.09371504187583923,
-0.008375464007258415,
-0.136860191822052,
-0.1544668823480606,
-0.07812751829624176,
-0.10736528784036636,
-0.013273200020194054,
-0.11306855827569962,
0.06837796419858932,
0.03448467701673508,
0.07786993682384491,
-0.0612921267747879,
0.012889169156551361,
-0.04450380429625511,
-0.04653548076748848,
0.1148131713271141,
0.03863988816738129,
0.0037341236602514982,
-0.12120725959539413,
0.015719639137387276,
-0.12182586640119553,
-0.010816537775099277,
-0.04361915960907936,
-0.0034353891387581825,
-0.08759573847055435,
-0.08659642189741135,
-0.08063022047281265,
-0.10118987411260605,
-0.028298314660787582,
0.016949504613876343,
-0.029199866577982903,
0.08480410277843475,
-0.02607826143503189,
0.08153922855854034,
0.03903346508741379,
0.16358473896980286,
-0.09609705954790115,
0.08827624469995499,
-0.06673325598239899,
0.16567090153694153,
-0.036374781280756,
0.13018907606601715,
-0.007207317277789116,
-0.013987435959279537,
-0.05479593202471733,
0.2589694559574127,
0.30416369438171387,
-0.09568055719137192,
0.04435858130455017,
0.007836302742362022,
0.05575001984834671,
0.004593652207404375,
0.05683688446879387,
0.10755017399787903,
0.13600370287895203,
-0.12653839588165283,
0.07147768139839172,
-0.0741986334323883,
0.0007104651303961873,
0.04633619636297226,
0.08053959161043167,
0.1155405044555664,
-0.035459939390420914,
-0.08332176506519318,
0.04719128832221031,
-0.11098406463861465,
0.02672710083425045,
0.06336211413145065,
-0.26228174567222595,
-0.03928988799452782,
0.017794858664274216,
0.0640544667840004,
0.05996575579047203,
0.09246913343667984,
-0.038343753665685654,
-0.09703918546438217,
0.08143837749958038,
0.030817372724413872,
-0.24303101003170013,
-0.04197799786925316,
0.10740136355161667,
-0.0907338336110115,
-0.025596125051379204,
-0.05301419273018837,
-0.028287257999181747,
0.12479151040315628,
0.03946008160710335,
0.031302399933338165,
0.021564019843935966,
0.06193574145436287,
-0.06717458367347717,
-0.07260186225175858,
0.06286540627479553,
0.054959945380687714,
-0.017852727323770523,
0.07738496363162994,
-0.2505027651786804,
0.05954274535179138,
-0.0942855104804039,
-0.07128791511058807,
-0.004715211223810911,
0.10966964811086655,
-0.03035944700241089,
0.05934135615825653,
0.11513001471757889,
0.03606341779232025,
-0.02145516686141491,
-0.04082201048731804,
-0.014499939046800137,
0.05310122296214104,
-0.028046082705259323,
-0.047992948442697525,
0.0450931154191494,
-0.07455327361822128,
0.15488885343074799,
-0.0041725593619048595,
-0.19259041547775269,
-0.055316630750894547,
-0.06227453052997589,
0.04071351885795593,
-0.07546377927064896,
0.040602196007966995,
-0.012929185293614864,
0.06526987999677658,
0.011966804973781109,
-0.1220293939113617,
0.11265826225280762,
0.13109102845191956,
-0.12378177791833878,
-0.01967386156320572
] |
null | null | transformers |
# Shrek DialoGPT Model | {"tags": ["conversational"]} | text-generation | Pupihed/DialoGPT-small-shrek | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Shrek DialoGPT Model | [
"# Shrek DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Shrek DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Shrek DialoGPT Model"
] | [
0.005086915567517281,
0.10801542550325394,
-0.006145445629954338,
-0.003986467607319355,
0.1659800261259079,
-0.00010187663428951055,
0.16169971227645874,
0.12252102047204971,
-0.07989532500505447,
-0.06302627176046371,
0.11771722137928009,
0.12222056090831757,
0.018448352813720703,
0.09514905512332916,
-0.0322587750852108,
-0.3631899356842041,
0.0803513452410698,
0.011254367418587208,
-0.07886461168527603,
0.12232467532157898,
0.08030135184526443,
-0.028054673224687576,
0.0695023387670517,
-0.002906989771872759,
-0.1585233360528946,
-0.010639590211212635,
-0.00603237422183156,
-0.12346073240041733,
0.12701201438903809,
0.04021499678492546,
0.05096174404025078,
0.01974629983305931,
-0.02849355898797512,
-0.1198478490114212,
0.0475037544965744,
0.0033613017294555902,
-0.010185317136347294,
0.046127449721097946,
0.010964109562337399,
-0.061253730207681656,
0.13417290151119232,
0.1367703080177307,
0.0020106742158532143,
0.054609689861536026,
-0.10145077109336853,
-0.08843866735696793,
0.05356145650148392,
0.11220628768205643,
0.06439483165740967,
0.10091188549995422,
-0.04157700389623642,
0.11480692028999329,
-0.0687495619058609,
0.11587032675743103,
0.12453477084636688,
-0.2996978461742401,
-0.017982210963964462,
0.15483982861042023,
0.019939212128520012,
0.0694829672574997,
-0.0704471617937088,
0.09316997230052948,
0.00048174921539612114,
0.006919448729604483,
-0.021411031484603882,
-0.0864195004105568,
-0.020770413801074028,
0.017888268455863,
-0.12244720011949539,
0.01846548169851303,
0.23717278242111206,
-0.054293420165777206,
0.06408735364675522,
-0.05767035111784935,
-0.07480558753013611,
-0.03131217509508133,
-0.04415521025657654,
-0.015431994572281837,
-0.07939103245735168,
0.07101787626743317,
-0.07486791163682938,
-0.09102323651313782,
-0.12053124606609344,
-0.03846672549843788,
-0.10098535567522049,
0.15251365303993225,
0.07123775780200958,
0.060653820633888245,
-0.22836339473724365,
0.07094082981348038,
-0.04056423157453537,
-0.08976063132286072,
0.022719573229551315,
-0.09097184985876083,
-0.009180809371173382,
0.03964020684361458,
-0.060873184353113174,
-0.023707224056124687,
0.09710249304771423,
0.17403703927993774,
0.007180026266723871,
0.053687117993831635,
0.012954389676451683,
0.07552903890609741,
0.05407864600419998,
0.10378091037273407,
-0.03892802074551582,
-0.07940270006656647,
0.05356304347515106,
-0.06778916716575623,
0.007848148234188557,
-0.06946300715208054,
-0.19141808152198792,
-0.029512640088796616,
0.05429890751838684,
0.02676277793943882,
0.023172099143266678,
0.1586286425590515,
0.06561963260173798,
-0.04657185822725296,
0.06092649698257446,
-0.016440683975815773,
-0.05844555050134659,
0.0319252535700798,
-0.026910169050097466,
0.14217320084571838,
-0.009931493550539017,
0.06458447873592377,
-0.0949135273694992,
0.08388982713222504,
-0.02456112951040268,
0.0048993779346346855,
0.005869019776582718,
-0.041517890989780426,
-0.008473815396428108,
0.002033620374277234,
0.0014319013571366668,
-0.15560415387153625,
-0.15100304782390594,
0.019474836066365242,
-0.04527120292186737,
-0.04222680255770683,
-0.09999219328165054,
-0.0917453020811081,
-0.08025489002466202,
0.018905235454440117,
-0.015258719213306904,
-0.055451054126024246,
-0.031056169420480728,
0.08952605724334717,
-0.053683169186115265,
0.09729007631540298,
-0.13546201586723328,
0.07142656296491623,
-0.10649929940700531,
-0.022285159677267075,
-0.154074564576149,
0.11281473934650421,
0.00052178546320647,
0.07379604876041412,
-0.02381008490920067,
-0.053297776728868484,
-0.10679013282060623,
0.04233662039041519,
-0.0585312694311142,
0.20109352469444275,
-0.10139904171228409,
-0.1097971498966217,
0.270341157913208,
-0.09597057104110718,
-0.12091857194900513,
0.16330768167972565,
-0.004019445274025202,
0.06973898410797119,
0.09630998224020004,
0.23455692827701569,
0.00728529691696167,
0.05020720139145851,
0.06837320327758789,
0.1019807830452919,
-0.11146018654108047,
0.03777381777763367,
0.0682663768529892,
0.019397754222154617,
-0.020387370139360428,
0.04216492921113968,
0.034469474107027054,
0.07524032890796661,
-0.021117841824889183,
-0.02171364054083824,
0.005656230263411999,
-0.02570817992091179,
0.10230319947004318,
-0.03711160644888878,
0.1613057255744934,
-0.018724165856838226,
-0.04450302943587303,
-0.03690537437796593,
0.0332079716026783,
-0.04203319922089577,
0.06424444168806076,
-0.043238315731287,
0.07485232502222061,
0.04618055373430252,
0.09197160601615906,
-0.1406460404396057,
-0.016812097281217575,
-0.0531318336725235,
0.18825139105319977,
0.0863228365778923,
0.09150149673223495,
0.04124610126018524,
-0.037541139870882034,
-0.03806278482079506,
0.02344605326652527,
0.16372448205947876,
-0.009361037984490395,
-0.04536590725183487,
-0.08874440938234329,
0.09199598431587219,
-0.04429079219698906,
0.10118543356657028,
-0.02509378083050251,
0.058787763118743896,
0.012079357169568539,
0.12018521875143051,
-0.06341318041086197,
0.025684339925646782,
0.023908251896500587,
-0.031092366203665733,
-0.06596764177083969,
-0.025426633656024933,
0.10854357481002808,
0.016619516536593437,
-0.10326656699180603,
0.230965718626976,
-0.20684391260147095,
0.1402006447315216,
0.19144165515899658,
-0.21930204331874847,
-0.05624471604824066,
-0.09261152893304825,
-0.02139153517782688,
0.00126470101531595,
0.037312787026166916,
-0.0608254037797451,
0.2117665410041809,
0.0061302329413592815,
0.1940322369337082,
-0.026255741715431213,
-0.027558844536542892,
-0.04942316189408302,
-0.07321520894765854,
0.0577947273850441,
0.08175231516361237,
0.10711324959993362,
-0.15400676429271698,
0.17428401112556458,
0.06992821395397186,
0.016695337370038033,
0.20865385234355927,
0.03473352640867233,
0.029404787346720695,
0.07090286165475845,
-0.008106508292257786,
-0.04047497361898422,
-0.0903073325753212,
-0.1997331976890564,
-0.042809512466192245,
0.0812411829829216,
0.01365786511451006,
0.08589290082454681,
-0.1152251586318016,
-0.04356401413679123,
0.00006609871343243867,
0.011648212559521198,
-0.021693186834454536,
0.14346130192279816,
0.012259796261787415,
0.12347112596035004,
-0.009782091714441776,
-0.0728946253657341,
0.05097967013716698,
0.012830805964767933,
-0.0809885635972023,
0.1620510220527649,
-0.13854612410068512,
-0.33231064677238464,
-0.14849980175495148,
-0.12063120305538177,
-0.06253933161497116,
0.039389096200466156,
0.10347340255975723,
-0.15211829543113708,
-0.0031370839569717646,
0.014437711797654629,
0.11307374387979507,
-0.18952172994613647,
-0.00037851810338906944,
-0.04772266373038292,
0.029260195791721344,
-0.15943874418735504,
-0.07499553263187408,
-0.03860791027545929,
-0.00978543795645237,
-0.06539163738489151,
0.12377075105905533,
-0.14965088665485382,
-0.0059736561961472034,
0.2597236633300781,
0.03252512961626053,
0.07246451079845428,
-0.06581909954547882,
0.22896501421928406,
-0.11206167936325073,
0.02894105389714241,
0.15246009826660156,
-0.09729163348674774,
0.0808795765042305,
0.14044059813022614,
-0.0012926374329254031,
-0.03991297259926796,
0.05222426727414131,
-0.047183357179164886,
-0.11532827466726303,
-0.1700390726327896,
-0.0711754709482193,
-0.08831410855054855,
0.1661023646593094,
-0.014105322770774364,
0.05163683742284775,
0.1953457146883011,
0.08632437139749527,
-0.0475868321955204,
-0.012023107148706913,
0.07634518295526505,
0.10503266751766205,
0.24620382487773895,
-0.04354143142700195,
0.15384292602539062,
-0.024547776207327843,
-0.13102632761001587,
0.02687404677271843,
0.042942896485328674,
0.0907265841960907,
0.03476865589618683,
0.06837434321641922,
-0.016281690448522568,
0.03692244738340378,
0.15254338085651398,
0.032388001680374146,
0.03932991623878479,
-0.019545892253518105,
-0.028659123927354813,
-0.02705506980419159,
-0.0198717899620533,
0.08134470134973526,
0.049153849482536316,
-0.1546313613653183,
0.0021596455480903387,
0.07364732027053833,
0.06002604216337204,
0.08290354162454605,
0.013617166317999363,
-0.1273355484008789,
-0.023746810853481293,
0.0447208508849144,
-0.017854077741503716,
-0.04457857087254524,
0.09341585636138916,
0.060503821820020676,
-0.17626018822193146,
0.02401840128004551,
-0.01945367455482483,
0.09627758711576462,
-0.009124896489083767,
0.09415288269519806,
-0.08438817411661148,
-0.05417318269610405,
-0.013634253293275833,
0.10158322006464005,
-0.21654927730560303,
0.16575105488300323,
-0.029641708359122276,
-0.06886542588472366,
-0.07276047021150589,
-0.02660764940083027,
0.06700099259614944,
0.12418951094150543,
0.09994986653327942,
0.0013555109035223722,
-0.049573853611946106,
-0.06617461144924164,
-0.029338032007217407,
0.005953523796051741,
0.1486646980047226,
-0.05922454223036766,
-0.0127913448959589,
-0.07020283490419388,
-0.007643367629498243,
-0.01255607046186924,
-0.0014074177015572786,
0.08168323338031769,
-0.15138204395771027,
0.0841347947716713,
0.09950513392686844,
0.07613307237625122,
0.03039219044148922,
-0.037160176783800125,
-0.11940182745456696,
0.17960494756698608,
0.01737144961953163,
-0.13217005133628845,
-0.08018728345632553,
-0.0020367512479424477,
0.059248991310596466,
-0.08161506801843643,
0.026007799431681633,
-0.0925489291548729,
0.054815664887428284,
-0.07138216495513916,
-0.1714605689048767,
0.06794214993715286,
-0.07670363038778305,
-0.10567431896924973,
0.01103570219129324,
0.1988215297460556,
-0.048226673156023026,
0.08058442920446396,
0.0182450283318758,
0.028857318684458733,
-0.09215226024389267,
-0.09368884563446045,
-0.0162630844861269,
-0.01164376363158226,
-0.013496850617229939,
0.009832700714468956,
0.021168915554881096,
0.0008357568294741213,
-0.10798481851816177,
-0.028837651014328003,
0.3529883027076721,
0.13336831331253052,
-0.06017538905143738,
0.15263544023036957,
0.08087158203125,
-0.022366585209965706,
-0.2919805645942688,
-0.1142742708325386,
-0.06329827010631561,
-0.06720637530088425,
-0.05972100421786308,
-0.1804094761610031,
0.08197560906410217,
-0.057962656021118164,
0.0012899086577817798,
0.04228881374001503,
-0.2669871747493744,
-0.12492097169160843,
0.16423751413822174,
-0.049254462122917175,
0.37224826216697693,
-0.13310852646827698,
-0.04771352931857109,
-0.03888363391160965,
-0.14382724463939667,
0.08584441989660263,
-0.0015393944922834635,
0.1263461709022522,
0.00044722206075675786,
0.17310133576393127,
0.034703537821769714,
-0.014489758759737015,
0.07016544044017792,
0.045376941561698914,
-0.07989105582237244,
-0.08169867098331451,
-0.11094275116920471,
-0.004193703178316355,
0.02516975626349449,
0.03444114327430725,
-0.08082056790590286,
0.023811327293515205,
-0.06844264268875122,
-0.035830240696668625,
-0.09915833175182343,
0.0025536944158375263,
0.0328688845038414,
-0.08230111747980118,
-0.0409710593521595,
-0.029600612819194794,
0.014544184319674969,
0.004252959508448839,
0.1744457483291626,
-0.08243248611688614,
0.12482819706201553,
0.07511447370052338,
0.08098136633634567,
-0.05008825659751892,
-0.050525564700365067,
-0.08478686213493347,
-0.06732102483510971,
0.05344727262854576,
-0.1121881827712059,
0.0017109834589064121,
0.1108822375535965,
-0.0072695291601121426,
0.08547867834568024,
0.10694137215614319,
0.0012512330431491137,
0.01771034486591816,
0.06440582871437073,
-0.1928732693195343,
-0.11233340203762054,
-0.07833066582679749,
0.0640321746468544,
0.07555369287729263,
0.09102427959442139,
0.1974591314792633,
-0.06393927335739136,
-0.051079582422971725,
0.025118645280599594,
0.003977516200393438,
-0.018025852739810944,
0.08562021702528,
0.013656320981681347,
0.04518971964716911,
-0.15721182525157928,
0.07551366090774536,
-0.022079478949308395,
-0.017690666019916534,
0.07884570211172104,
0.14022280275821686,
-0.13280102610588074,
-0.08933243155479431,
-0.06882304698228836,
0.04706289619207382,
-0.11780791729688644,
-0.014530777931213379,
-0.06565816700458527,
-0.11717958003282547,
0.030932975932955742,
0.11133697628974915,
0.0545802116394043,
0.0372476801276207,
-0.13568687438964844,
-0.01193762756884098,
-0.03444696217775345,
0.01988455466926098,
0.06350558251142502,
-0.010304709896445274,
-0.0742378830909729,
0.14186374843120575,
-0.01017991453409195,
0.09963461011648178,
-0.09310811758041382,
-0.12827803194522858,
-0.14471548795700073,
0.03941257670521736,
-0.12678831815719604,
-0.07843077927827835,
-0.11300325393676758,
-0.0764581635594368,
-0.009381020441651344,
-0.04692240431904793,
-0.027689248323440552,
-0.01256111916154623,
-0.1251908540725708,
0.01328915823251009,
-0.06966762244701385,
-0.0242573581635952,
-0.07158177345991135,
0.042332522571086884,
0.06905796378850937,
-0.02338818646967411,
0.1372278928756714,
0.16491015255451202,
-0.1509639471769333,
0.10449822247028351,
-0.07669726759195328,
-0.09187593311071396,
0.09415137022733688,
-0.0037326531019061804,
0.01103350892663002,
0.09817489236593246,
-0.013614069670438766,
-0.004313063807785511,
0.04885174706578255,
0.06797004491090775,
0.06388791650533676,
-0.0636039599776268,
0.036409590393304825,
-0.060845401138067245,
-0.07967542856931686,
-0.042245782911777496,
-0.0687858834862709,
0.01166402269154787,
0.08444931358098984,
0.09443662315607071,
-0.06266824901103973,
0.11757944524288177,
-0.04660765454173088,
0.030178502202033997,
0.016154032200574875,
-0.16954076290130615,
0.008762034587562084,
-0.09522008895874023,
0.029819421470165253,
-0.0066068763844668865,
0.15676498413085938,
-0.014838446862995625,
0.008813617751002312,
0.0320032574236393,
0.1298569291830063,
0.038334667682647705,
0.013472271151840687,
0.18954169750213623,
0.12545543909072876,
-0.077345110476017,
-0.07297717034816742,
0.08995134383440018,
0.07090200483798981,
-0.03172270581126213,
0.12767529487609863,
-0.02682497352361679,
-0.011515028774738312,
0.06351539492607117,
-0.017393918707966805,
0.07525771856307983,
-0.10756882280111313,
-0.11956232786178589,
-0.02069065533578396,
0.040830936282873154,
-0.06559139490127563,
0.15475811064243317,
0.178684264421463,
0.004831888247281313,
0.021252628415822983,
-0.05211875960230827,
-0.05504324287176132,
-0.18751439452171326,
-0.164894238114357,
-0.051964711397886276,
-0.16627641022205353,
0.012935553677380085,
-0.1023525670170784,
0.007367437705397606,
0.0472845584154129,
0.1107335314154625,
-0.042814843356609344,
0.10851702094078064,
-0.010909317061305046,
-0.09688800573348999,
0.0696541890501976,
-0.04761887341737747,
0.045855291187763214,
0.02069362811744213,
-0.01602850668132305,
-0.013171711936593056,
0.027580194175243378,
0.03886052593588829,
0.023179197683930397,
-0.06477142870426178,
-0.012543810531497002,
-0.15511023998260498,
-0.09631818532943726,
-0.052425846457481384,
0.060763221234083176,
-0.023114075884222984,
0.12737104296684265,
0.03680482134222984,
-0.02973334677517414,
0.0012213099980726838,
0.2506111264228821,
-0.06706899404525757,
-0.11680415272712708,
-0.11464723199605942,
0.14651477336883545,
-0.013763743452727795,
0.04437640309333801,
-0.03557487577199936,
-0.008675938472151756,
-0.11397508531808853,
0.332417756319046,
0.30197271704673767,
-0.10684211552143097,
0.0075968969613313675,
0.0031167760025709867,
0.045928578823804855,
0.08755028992891312,
0.10824639350175858,
0.11157452315092087,
0.225351944565773,
-0.08389808982610703,
-0.035538189113140106,
-0.03818747401237488,
-0.07720468938350677,
-0.05692305788397789,
-0.013138181529939175,
0.08174507319927216,
-0.06752977520227432,
-0.04507405310869217,
0.09909871965646744,
-0.30498188734054565,
0.03943890705704689,
-0.22721892595291138,
-0.15516744554042816,
-0.04892224073410034,
0.014855630695819855,
0.059254106134176254,
0.05183885991573334,
0.07672753185033798,
0.00846369843930006,
-0.03660959750413895,
0.0785292237997055,
0.014262611046433449,
-0.17442865669727325,
0.018644610419869423,
0.07910136878490448,
-0.15363405644893646,
-0.07724007219076157,
-0.04620570316910744,
0.03723268210887909,
0.0715596005320549,
0.0760006532073021,
0.017016563564538956,
0.026377851143479347,
0.005916821304708719,
-0.026963841170072556,
-0.026719115674495697,
0.13518597185611725,
0.0055424221791327,
-0.06318291276693344,
0.08288037776947021,
-0.12918993830680847,
0.03693895787000656,
0.013597346842288971,
0.0004608687886502594,
-0.018259389325976372,
0.04202429950237274,
-0.06477976590394974,
0.06645749509334564,
0.11013966798782349,
-0.036427322775125504,
-0.007327505387365818,
-0.008123215287923813,
-0.053041599690914154,
0.0005537522374652326,
-0.13605250418186188,
-0.0897136926651001,
-0.19089238345623016,
-0.12600110471248627,
0.005307669751346111,
-0.0069648451171815395,
-0.17309826612472534,
0.0016765764448791742,
-0.14315184950828552,
0.0693463534116745,
-0.1057482659816742,
0.09064384549856186,
0.05589143559336662,
0.024291085079312325,
0.01025785319507122,
-0.0034012056421488523,
0.034161683171987534,
0.08438711613416672,
-0.1608397662639618,
-0.0851927250623703
] |
null | null | transformers |
# Jarvis DialoGPT Model | {"tags": ["conversational"]} | text-generation | PurpleJacketGuy/My_Jarvis | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Jarvis DialoGPT Model | [
"# Jarvis DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Jarvis DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Jarvis DialoGPT Model"
] | [
-0.028724955394864082,
0.07407274097204208,
-0.005834823939949274,
0.023433813825249672,
0.1246638298034668,
-0.011197861284017563,
0.13986065983772278,
0.12920737266540527,
0.006953559350222349,
-0.06446392834186554,
0.1267763078212738,
0.20062048733234406,
-0.013820264488458633,
0.04989049211144447,
-0.07979270070791245,
-0.3160135746002197,
0.05021711811423302,
0.05453893914818764,
0.025644347071647644,
0.12028034031391144,
0.08469628542661667,
-0.04969409480690956,
0.08766992390155792,
0.014502068981528282,
-0.1231098547577858,
-0.009229401126503944,
0.020790129899978638,
-0.11993566900491714,
0.11178915202617645,
0.05145671218633652,
0.028923071920871735,
0.040120698511600494,
-0.0469035767018795,
-0.1391468197107315,
0.030170980840921402,
-0.01779216341674328,
-0.02611646242439747,
0.05819234997034073,
0.01532368641346693,
-0.10071725398302078,
0.13818569481372833,
0.10986344516277313,
0.009786374866962433,
0.04206942766904831,
-0.1763571947813034,
0.011633799411356449,
0.007480927277356386,
0.037772487848997116,
0.08922077715396881,
0.13162215054035187,
-0.03783842921257019,
0.11453060805797577,
-0.06034768000245094,
0.11005792021751404,
0.0901191458106041,
-0.3333207070827484,
-0.01252543181180954,
0.11713425815105438,
0.06325526535511017,
0.04281870275735855,
-0.04157290235161781,
0.07460950314998627,
0.017148630693554878,
0.002558201551437378,
-0.029336335137486458,
-0.08326403796672821,
-0.11960744112730026,
0.025943249464035034,
-0.0958387479186058,
-0.01642487198114395,
0.2635208070278168,
-0.03299141675233841,
0.045670948922634125,
-0.06727854162454605,
-0.09215495735406876,
-0.014930838719010353,
-0.0276251882314682,
-0.03944746032357216,
-0.08661610633134842,
0.08678511530160904,
-0.023032160475850105,
-0.10082648694515228,
-0.11236582696437836,
-0.024453405290842056,
-0.1788819432258606,
0.16535693407058716,
0.018432052806019783,
0.04092192277312279,
-0.2225951850414276,
0.10763050615787506,
0.04071442037820816,
-0.09701156616210938,
0.010043409653007984,
-0.10200215131044388,
0.026231074705719948,
0.000974346708972007,
-0.032374296337366104,
-0.04516354203224182,
0.03585856780409813,
0.07345622032880783,
-0.005490866955369711,
-0.008279246278107166,
-0.0023671817034482956,
0.034084253013134,
0.06236754357814789,
0.07986926287412643,
-0.008260391652584076,
-0.04819576442241669,
0.01917174831032753,
-0.10077600181102753,
-0.01404132042080164,
-0.06159525737166405,
-0.17954261600971222,
-0.024861468002200127,
0.06059776619076729,
0.05018831044435501,
0.030141718685626984,
0.120445117354393,
-0.027303654700517654,
-0.07678719609975815,
0.03894506394863129,
-0.004095045849680901,
-0.01891145668923855,
0.002460737945511937,
-0.012222259305417538,
0.08562415838241577,
0.021218465641140938,
0.06035950779914856,
-0.11216261237859726,
-0.0039659831672906876,
-0.06221126392483711,
-0.0019792644307017326,
-0.017066283151507378,
-0.05726960301399231,
-0.022101402282714844,
-0.025740308687090874,
0.0260939858853817,
-0.13017575442790985,
-0.19093196094036102,
0.002637390047311783,
-0.010585050098598003,
-0.04783153533935547,
-0.12444384396076202,
-0.09703589975833893,
-0.027112217620015144,
0.037101540714502335,
-0.06442387402057648,
-0.02624974213540554,
-0.06278516352176666,
0.07361729443073273,
-0.027189156040549278,
0.08692190051078796,
-0.07533649355173111,
0.0788595974445343,
-0.10190857946872711,
-0.043259281665086746,
-0.06242547929286957,
0.152882382273674,
0.019782183691859245,
0.057174280285835266,
-0.03296364098787308,
-0.014946426264941692,
-0.0772169977426529,
0.06222575902938843,
-0.0513303168118,
0.25273123383522034,
-0.12171001732349396,
-0.11129625141620636,
0.23760539293289185,
-0.03575725108385086,
-0.12098793685436249,
0.10641667246818542,
-0.02180953323841095,
0.08944790065288544,
0.12785297632217407,
0.17467059195041656,
-0.012370780110359192,
0.014495102688670158,
0.08202231675386429,
0.08953217417001724,
-0.07762133330106735,
0.010499859228730202,
0.022684618830680847,
-0.02511371485888958,
-0.10109779238700867,
0.021839041262865067,
0.08066894114017487,
0.061613090336322784,
-0.0615631602704525,
-0.023919325321912766,
-0.000054330106650013477,
-0.00001123027232097229,
0.09208830446004868,
-0.026808064430952072,
0.13684028387069702,
-0.024504397064447403,
-0.057841259986162186,
0.010001561604440212,
-0.0004933199961669743,
-0.04709082841873169,
0.027116525918245316,
-0.08150961995124817,
0.04716242849826813,
-0.02983279526233673,
0.048291172832250595,
-0.14555862545967102,
-0.05460355058312416,
-0.024690264835953712,
0.1643878072500229,
0.06425291299819946,
0.0964064672589302,
0.057472676038742065,
-0.030470551922917366,
-0.007327340543270111,
0.023110942915081978,
0.15473513305187225,
-0.006806367542594671,
-0.08998996764421463,
-0.09599178284406662,
0.0767839252948761,
-0.06290669739246368,
0.09940192103385925,
-0.06836964935064316,
0.013265252113342285,
-0.0003538935852702707,
0.10273724794387817,
0.013013976626098156,
0.021943405270576477,
0.04780351743102074,
-0.016768358647823334,
-0.05335192382335663,
0.00039953039959073067,
0.10065773874521255,
-0.0026151584461331367,
-0.08283500373363495,
0.21575133502483368,
-0.16274574398994446,
0.11507182568311691,
0.17688411474227905,
-0.22587908804416656,
0.0035741892643272877,
-0.09320090711116791,
-0.02911660447716713,
0.012986136600375175,
0.04193082079291344,
-0.013512968085706234,
0.2868538796901703,
-0.015745481476187706,
0.15488483011722565,
-0.03525629639625549,
-0.04365459829568863,
-0.041145291179418564,
-0.04184336215257645,
0.009840158745646477,
0.08669433742761612,
0.07837274670600891,
-0.1606738120317459,
0.15056751668453217,
0.09772747755050659,
0.060334645211696625,
0.18346990644931793,
0.032311514019966125,
0.0017100629629567266,
0.05952059105038643,
-0.02360295131802559,
-0.04877248778939247,
-0.0822998583316803,
-0.31281545758247375,
-0.03653373196721077,
0.07137937098741531,
0.03230374678969383,
0.09856313467025757,
-0.09207519888877869,
-0.03250809386372566,
0.013535907492041588,
-0.018245304003357887,
0.015381118282675743,
0.10139873623847961,
0.034380923956632614,
0.09754294157028198,
-0.026423681527376175,
-0.07583978027105331,
0.07115431874990463,
0.021844753995537758,
-0.08759263902902603,
0.18592306971549988,
-0.11179307848215103,
-0.31430837512016296,
-0.09333854168653488,
-0.1731521338224411,
-0.06803490221500397,
0.05851156264543533,
0.09780998528003693,
-0.09703154116868973,
-0.013259954750537872,
-0.014382905326783657,
0.13825514912605286,
-0.07531143724918365,
0.01739112287759781,
-0.042957853525877,
-0.006613840814679861,
-0.13227160274982452,
-0.10042990744113922,
-0.0443732775747776,
-0.04921574518084526,
-0.07468399405479431,
0.11027255654335022,
-0.12926431000232697,
0.007907571271061897,
0.22697941958904266,
0.054130252450704575,
0.053005415946245193,
-0.02889549545943737,
0.24429047107696533,
-0.09617699682712555,
-0.007782569155097008,
0.17568200826644897,
-0.03214964270591736,
0.028945090249180794,
0.1430768221616745,
-0.005454486235976219,
-0.08216629922389984,
0.03437422588467598,
-0.023625463247299194,
-0.05254097655415535,
-0.21236687898635864,
-0.13641506433486938,
-0.11246171593666077,
0.08837425708770752,
0.028856061398983,
0.029008490964770317,
0.17980797588825226,
0.06998403370380402,
-0.024225519970059395,
0.008345061913132668,
0.052182964980602264,
0.08681490272283554,
0.23569262027740479,
-0.06456708163022995,
0.14927791059017181,
0.00410221703350544,
-0.18166150152683258,
0.0628277137875557,
0.04515660181641579,
0.08813724666833878,
0.0582970455288887,
0.06131142005324364,
0.00355919380672276,
0.03240963816642761,
0.1446857452392578,
0.0793708935379982,
0.01701505482196808,
-0.049107473343610764,
-0.04447033256292343,
-0.03657027706503868,
-0.03904195502400398,
0.04817810654640198,
0.08848144859075546,
-0.1286434531211853,
-0.046292904764413834,
-0.036265984177589417,
0.03642420843243599,
0.07392685860395432,
0.13249701261520386,
-0.19443918764591217,
-0.040899429470300674,
0.07547131180763245,
-0.04706265404820442,
-0.11716531962156296,
0.08500263094902039,
0.007305088918656111,
-0.13068830966949463,
0.02335016056895256,
-0.01591906137764454,
0.11393614858388901,
-0.11107517778873444,
0.08079040050506592,
-0.11890203505754471,
-0.08969497680664062,
-0.0024596217554062605,
0.08675013482570648,
-0.25016066431999207,
0.2085849940776825,
-0.010405858047306538,
-0.05011548101902008,
-0.09866302460432053,
-0.005153730046004057,
0.012952320277690887,
0.10962537676095963,
0.11347955465316772,
-0.017160089686512947,
0.04657955467700958,
0.023680271580815315,
-0.0716506689786911,
0.03091360069811344,
0.07532316446304321,
-0.07377494126558304,
-0.00210787751711905,
-0.028751103207468987,
-0.0015667197294533253,
-0.0037975553423166275,
-0.02157602645456791,
-0.026713673025369644,
-0.17328990995883942,
0.07057955116033554,
0.04507847875356674,
0.11449800431728363,
0.03573475033044815,
-0.040371526032686234,
-0.03457089886069298,
0.2549757659435272,
0.009089363738894463,
-0.0932002142071724,
-0.08367221802473068,
-0.049158982932567596,
0.06671328842639923,
-0.05973284691572189,
0.009215520694851875,
-0.04506795480847359,
0.017958952113986015,
-0.03619559854269028,
-0.18073822557926178,
0.12886932492256165,
-0.08524782955646515,
-0.025213321670889854,
-0.04239622876048088,
0.19000285863876343,
-0.023934034630656242,
0.028936181217432022,
0.04323231056332588,
0.014129805378615856,
-0.1184040755033493,
-0.08949636667966843,
-0.007749824319034815,
0.0282608512789011,
0.008089686743915081,
0.04343842715024948,
-0.028428029268980026,
-0.056490883231163025,
-0.060577694326639175,
0.0005143927992321551,
0.3266129791736603,
0.12531548738479614,
-0.04014194756746292,
0.1638588160276413,
0.102069191634655,
-0.04681462422013283,
-0.24390307068824768,
-0.09674343466758728,
-0.0685282051563263,
-0.035416919738054276,
-0.09263017773628235,
-0.18166717886924744,
0.10428355634212494,
-0.03567168861627579,
-0.009429809637367725,
0.061497658491134644,
-0.27015578746795654,
-0.11015525460243225,
0.2084948718547821,
-0.019251808524131775,
0.4108446538448334,
-0.09035273641347885,
-0.08560672402381897,
-0.034404605627059937,
-0.18611980974674225,
0.20530439913272858,
-0.0176062174141407,
0.12445810437202454,
-0.012248687446117401,
0.15911716222763062,
0.05640494450926781,
0.006764915306121111,
0.07689318805932999,
0.0375145822763443,
-0.05476841330528259,
-0.08420978486537933,
-0.08418436348438263,
-0.03343486785888672,
0.030808717012405396,
0.037713974714279175,
-0.018507065251469612,
0.03895849734544754,
-0.13813582062721252,
-0.060191020369529724,
-0.08382532000541687,
0.04864807426929474,
0.03709883987903595,
-0.09080565720796585,
0.022937729954719543,
-0.051426008343696594,
-0.013946615159511566,
0.009243153035640717,
0.18950949609279633,
-0.09211190789937973,
0.15301349759101868,
0.10249796509742737,
0.1602579802274704,
-0.15450111031532288,
0.0073991394601762295,
-0.045979246497154236,
-0.053626302629709244,
0.0723934918642044,
-0.07310688495635986,
0.016686048358678818,
0.10177405178546906,
-0.020466594025492668,
0.08702471852302551,
0.10455366969108582,
0.014056392014026642,
0.014223670586943626,
0.07381457090377808,
-0.25049594044685364,
-0.07693623006343842,
-0.09078479558229446,
-0.003200283506885171,
0.06433140486478806,
0.10453234612941742,
0.21748274564743042,
-0.013566804118454456,
-0.04283350706100464,
0.0010352939134463668,
0.00625515915453434,
-0.03756700083613396,
0.06964551657438278,
-0.014319177716970444,
0.017711857333779335,
-0.15306653082370758,
0.05433988198637962,
0.012669985182583332,
-0.09580761939287186,
0.011869417503476143,
0.144877091050148,
-0.10455725342035294,
-0.1272098422050476,
-0.09316560626029968,
0.14136214554309845,
-0.14257429540157318,
0.02615646831691265,
-0.024316754192113876,
-0.14349636435508728,
0.06975545734167099,
0.09523206949234009,
0.051848169416189194,
0.0765124261379242,
-0.12125283479690552,
-0.03275693207979202,
-0.02243814989924431,
-0.009171457961201668,
0.04990307241678238,
-0.016414111480116844,
-0.0595385767519474,
0.0709061548113823,
-0.04528675973415375,
0.1117570698261261,
-0.09652837365865707,
-0.10905763506889343,
-0.14447017014026642,
0.035887088626623154,
-0.10863792151212692,
-0.09975949674844742,
-0.11079566180706024,
-0.04249405115842819,
-0.007833102717995644,
-0.039758577942848206,
-0.03923891484737396,
-0.042374469339847565,
-0.11342903226613998,
0.031014585867524147,
-0.04215342923998833,
0.022732378914952278,
-0.06524349004030228,
0.024729691445827484,
0.05656211078166962,
-0.03480573743581772,
0.1618380844593048,
0.1542748361825943,
-0.10700564831495285,
0.07503072917461395,
-0.1329660266637802,
-0.07290257513523102,
0.10728160291910172,
0.005636665504425764,
0.07303185760974884,
0.03020731918513775,
0.004278136882930994,
0.059855442494153976,
0.05505819991230965,
0.041162483394145966,
0.058300044387578964,
-0.07535979151725769,
0.013305122964084148,
-0.015411239117383957,
-0.12322933226823807,
-0.04696159437298775,
-0.024948565289378166,
0.03553123027086258,
0.03812912479043007,
0.11296061426401138,
-0.04572174325585365,
0.08800002932548523,
-0.07006838917732239,
0.0349404402077198,
0.006336439400911331,
-0.16684605181217194,
-0.01303866133093834,
-0.08324834704399109,
0.054168712347745895,
0.003874107962474227,
0.19242747128009796,
0.03083484247326851,
0.005074565764516592,
0.01857074163854122,
0.07604409009218216,
0.08988475799560547,
0.006509547121822834,
0.20836439728736877,
0.12420791387557983,
-0.05382644385099411,
-0.07717001438140869,
0.10774771869182587,
0.03978855535387993,
0.08906969428062439,
0.12361247837543488,
-0.024730421602725983,
-0.030498331412672997,
0.09305144101381302,
-0.002418916206806898,
0.03155044466257095,
-0.10859871655702591,
-0.1559450775384903,
0.002088764449581504,
0.05985340103507042,
-0.04688568040728569,
0.10144050419330597,
0.1355113685131073,
-0.0257203858345747,
0.030315903946757317,
0.010771372355520725,
-0.0693720132112503,
-0.18811063468456268,
-0.16303007304668427,
-0.07321853935718536,
-0.1456780731678009,
-0.0010391600662842393,
-0.14742250740528107,
0.026323698461055756,
-0.005462203174829483,
0.08112800121307373,
-0.05216994509100914,
0.07045230269432068,
0.04388967528939247,
-0.10776453465223312,
0.07453998178243637,
-0.026552090421319008,
0.08100797981023788,
-0.05688752233982086,
0.020254475995898247,
-0.092989981174469,
0.048274729400873184,
-0.004330832045525312,
0.027424659579992294,
-0.07039772719144821,
0.014993264339864254,
-0.10928340256214142,
-0.09682200849056244,
-0.059809017926454544,
0.052349694073200226,
-0.014722905121743679,
0.16540159285068512,
0.029818788170814514,
-0.036255598068237305,
0.03984658047556877,
0.27960652112960815,
-0.09042903035879135,
-0.09938719123601913,
-0.07346504181623459,
0.22644560039043427,
0.015965810045599937,
0.11177685856819153,
-0.006199236493557692,
0.0029834015294909477,
-0.08800560981035233,
0.3005940020084381,
0.3068792521953583,
-0.1137877106666565,
0.01951170153915882,
0.02053114026784897,
0.033440060913562775,
0.10175138711929321,
0.07465256750583649,
0.10524051636457443,
0.3031412959098816,
-0.07784409075975418,
-0.015214548446238041,
0.0016938613262027502,
-0.037261322140693665,
-0.06472985446453094,
0.055700432509183884,
0.06474675238132477,
-0.0791921615600586,
-0.011798587627708912,
0.11328379064798355,
-0.2963547706604004,
0.12791521847248077,
-0.1831689029932022,
-0.19182266294956207,
-0.08072139322757721,
-0.011356024071574211,
0.07030326128005981,
0.03915482014417648,
0.10506898164749146,
0.0070085772313177586,
-0.06465279310941696,
0.055598184466362,
0.032765720039606094,
-0.2108049988746643,
0.0043840170837938786,
0.09294254332780838,
-0.07355425506830215,
-0.033761121332645416,
-0.021369922906160355,
0.05585765093564987,
0.0726432353258133,
0.058077625930309296,
-0.03144989535212517,
0.04147079586982727,
-0.018611937761306763,
-0.04767528176307678,
0.044379837810993195,
0.05118147283792496,
0.021311096847057343,
-0.06206592544913292,
0.07549527287483215,
-0.1717899590730667,
0.0535724051296711,
-0.016312379390001297,
-0.027859320864081383,
-0.02312055230140686,
0.01578165777027607,
-0.05608862265944481,
0.07583457231521606,
0.09817071259021759,
-0.014204797334969044,
-0.0108909010887146,
-0.016017073765397072,
-0.023940768092870712,
-0.013564666733145714,
-0.06358545273542404,
-0.10549838840961456,
-0.14770179986953735,
-0.11236501485109329,
0.08373946696519852,
0.0009978330926969647,
-0.1803843379020691,
-0.00041061025694943964,
-0.11307556182146072,
0.036461442708969116,
-0.12302865833044052,
0.08757439255714417,
0.10424633324146271,
0.019965240731835365,
0.004368630703538656,
0.006793509237468243,
0.049096278846263885,
0.08629801124334335,
-0.1422823816537857,
-0.10053034871816635
] |
null | null | transformers |
# Jarvis DialoGPT Model | {"tags": ["conversational"]} | text-generation | PurpleJacketGuy/My_Jarvis_2 | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Jarvis DialoGPT Model | [
"# Jarvis DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Jarvis DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Jarvis DialoGPT Model"
] | [
-0.028724955394864082,
0.07407274097204208,
-0.005834823939949274,
0.023433813825249672,
0.1246638298034668,
-0.011197861284017563,
0.13986065983772278,
0.12920737266540527,
0.006953559350222349,
-0.06446392834186554,
0.1267763078212738,
0.20062048733234406,
-0.013820264488458633,
0.04989049211144447,
-0.07979270070791245,
-0.3160135746002197,
0.05021711811423302,
0.05453893914818764,
0.025644347071647644,
0.12028034031391144,
0.08469628542661667,
-0.04969409480690956,
0.08766992390155792,
0.014502068981528282,
-0.1231098547577858,
-0.009229401126503944,
0.020790129899978638,
-0.11993566900491714,
0.11178915202617645,
0.05145671218633652,
0.028923071920871735,
0.040120698511600494,
-0.0469035767018795,
-0.1391468197107315,
0.030170980840921402,
-0.01779216341674328,
-0.02611646242439747,
0.05819234997034073,
0.01532368641346693,
-0.10071725398302078,
0.13818569481372833,
0.10986344516277313,
0.009786374866962433,
0.04206942766904831,
-0.1763571947813034,
0.011633799411356449,
0.007480927277356386,
0.037772487848997116,
0.08922077715396881,
0.13162215054035187,
-0.03783842921257019,
0.11453060805797577,
-0.06034768000245094,
0.11005792021751404,
0.0901191458106041,
-0.3333207070827484,
-0.01252543181180954,
0.11713425815105438,
0.06325526535511017,
0.04281870275735855,
-0.04157290235161781,
0.07460950314998627,
0.017148630693554878,
0.002558201551437378,
-0.029336335137486458,
-0.08326403796672821,
-0.11960744112730026,
0.025943249464035034,
-0.0958387479186058,
-0.01642487198114395,
0.2635208070278168,
-0.03299141675233841,
0.045670948922634125,
-0.06727854162454605,
-0.09215495735406876,
-0.014930838719010353,
-0.0276251882314682,
-0.03944746032357216,
-0.08661610633134842,
0.08678511530160904,
-0.023032160475850105,
-0.10082648694515228,
-0.11236582696437836,
-0.024453405290842056,
-0.1788819432258606,
0.16535693407058716,
0.018432052806019783,
0.04092192277312279,
-0.2225951850414276,
0.10763050615787506,
0.04071442037820816,
-0.09701156616210938,
0.010043409653007984,
-0.10200215131044388,
0.026231074705719948,
0.000974346708972007,
-0.032374296337366104,
-0.04516354203224182,
0.03585856780409813,
0.07345622032880783,
-0.005490866955369711,
-0.008279246278107166,
-0.0023671817034482956,
0.034084253013134,
0.06236754357814789,
0.07986926287412643,
-0.008260391652584076,
-0.04819576442241669,
0.01917174831032753,
-0.10077600181102753,
-0.01404132042080164,
-0.06159525737166405,
-0.17954261600971222,
-0.024861468002200127,
0.06059776619076729,
0.05018831044435501,
0.030141718685626984,
0.120445117354393,
-0.027303654700517654,
-0.07678719609975815,
0.03894506394863129,
-0.004095045849680901,
-0.01891145668923855,
0.002460737945511937,
-0.012222259305417538,
0.08562415838241577,
0.021218465641140938,
0.06035950779914856,
-0.11216261237859726,
-0.0039659831672906876,
-0.06221126392483711,
-0.0019792644307017326,
-0.017066283151507378,
-0.05726960301399231,
-0.022101402282714844,
-0.025740308687090874,
0.0260939858853817,
-0.13017575442790985,
-0.19093196094036102,
0.002637390047311783,
-0.010585050098598003,
-0.04783153533935547,
-0.12444384396076202,
-0.09703589975833893,
-0.027112217620015144,
0.037101540714502335,
-0.06442387402057648,
-0.02624974213540554,
-0.06278516352176666,
0.07361729443073273,
-0.027189156040549278,
0.08692190051078796,
-0.07533649355173111,
0.0788595974445343,
-0.10190857946872711,
-0.043259281665086746,
-0.06242547929286957,
0.152882382273674,
0.019782183691859245,
0.057174280285835266,
-0.03296364098787308,
-0.014946426264941692,
-0.0772169977426529,
0.06222575902938843,
-0.0513303168118,
0.25273123383522034,
-0.12171001732349396,
-0.11129625141620636,
0.23760539293289185,
-0.03575725108385086,
-0.12098793685436249,
0.10641667246818542,
-0.02180953323841095,
0.08944790065288544,
0.12785297632217407,
0.17467059195041656,
-0.012370780110359192,
0.014495102688670158,
0.08202231675386429,
0.08953217417001724,
-0.07762133330106735,
0.010499859228730202,
0.022684618830680847,
-0.02511371485888958,
-0.10109779238700867,
0.021839041262865067,
0.08066894114017487,
0.061613090336322784,
-0.0615631602704525,
-0.023919325321912766,
-0.000054330106650013477,
-0.00001123027232097229,
0.09208830446004868,
-0.026808064430952072,
0.13684028387069702,
-0.024504397064447403,
-0.057841259986162186,
0.010001561604440212,
-0.0004933199961669743,
-0.04709082841873169,
0.027116525918245316,
-0.08150961995124817,
0.04716242849826813,
-0.02983279526233673,
0.048291172832250595,
-0.14555862545967102,
-0.05460355058312416,
-0.024690264835953712,
0.1643878072500229,
0.06425291299819946,
0.0964064672589302,
0.057472676038742065,
-0.030470551922917366,
-0.007327340543270111,
0.023110942915081978,
0.15473513305187225,
-0.006806367542594671,
-0.08998996764421463,
-0.09599178284406662,
0.0767839252948761,
-0.06290669739246368,
0.09940192103385925,
-0.06836964935064316,
0.013265252113342285,
-0.0003538935852702707,
0.10273724794387817,
0.013013976626098156,
0.021943405270576477,
0.04780351743102074,
-0.016768358647823334,
-0.05335192382335663,
0.00039953039959073067,
0.10065773874521255,
-0.0026151584461331367,
-0.08283500373363495,
0.21575133502483368,
-0.16274574398994446,
0.11507182568311691,
0.17688411474227905,
-0.22587908804416656,
0.0035741892643272877,
-0.09320090711116791,
-0.02911660447716713,
0.012986136600375175,
0.04193082079291344,
-0.013512968085706234,
0.2868538796901703,
-0.015745481476187706,
0.15488483011722565,
-0.03525629639625549,
-0.04365459829568863,
-0.041145291179418564,
-0.04184336215257645,
0.009840158745646477,
0.08669433742761612,
0.07837274670600891,
-0.1606738120317459,
0.15056751668453217,
0.09772747755050659,
0.060334645211696625,
0.18346990644931793,
0.032311514019966125,
0.0017100629629567266,
0.05952059105038643,
-0.02360295131802559,
-0.04877248778939247,
-0.0822998583316803,
-0.31281545758247375,
-0.03653373196721077,
0.07137937098741531,
0.03230374678969383,
0.09856313467025757,
-0.09207519888877869,
-0.03250809386372566,
0.013535907492041588,
-0.018245304003357887,
0.015381118282675743,
0.10139873623847961,
0.034380923956632614,
0.09754294157028198,
-0.026423681527376175,
-0.07583978027105331,
0.07115431874990463,
0.021844753995537758,
-0.08759263902902603,
0.18592306971549988,
-0.11179307848215103,
-0.31430837512016296,
-0.09333854168653488,
-0.1731521338224411,
-0.06803490221500397,
0.05851156264543533,
0.09780998528003693,
-0.09703154116868973,
-0.013259954750537872,
-0.014382905326783657,
0.13825514912605286,
-0.07531143724918365,
0.01739112287759781,
-0.042957853525877,
-0.006613840814679861,
-0.13227160274982452,
-0.10042990744113922,
-0.0443732775747776,
-0.04921574518084526,
-0.07468399405479431,
0.11027255654335022,
-0.12926431000232697,
0.007907571271061897,
0.22697941958904266,
0.054130252450704575,
0.053005415946245193,
-0.02889549545943737,
0.24429047107696533,
-0.09617699682712555,
-0.007782569155097008,
0.17568200826644897,
-0.03214964270591736,
0.028945090249180794,
0.1430768221616745,
-0.005454486235976219,
-0.08216629922389984,
0.03437422588467598,
-0.023625463247299194,
-0.05254097655415535,
-0.21236687898635864,
-0.13641506433486938,
-0.11246171593666077,
0.08837425708770752,
0.028856061398983,
0.029008490964770317,
0.17980797588825226,
0.06998403370380402,
-0.024225519970059395,
0.008345061913132668,
0.052182964980602264,
0.08681490272283554,
0.23569262027740479,
-0.06456708163022995,
0.14927791059017181,
0.00410221703350544,
-0.18166150152683258,
0.0628277137875557,
0.04515660181641579,
0.08813724666833878,
0.0582970455288887,
0.06131142005324364,
0.00355919380672276,
0.03240963816642761,
0.1446857452392578,
0.0793708935379982,
0.01701505482196808,
-0.049107473343610764,
-0.04447033256292343,
-0.03657027706503868,
-0.03904195502400398,
0.04817810654640198,
0.08848144859075546,
-0.1286434531211853,
-0.046292904764413834,
-0.036265984177589417,
0.03642420843243599,
0.07392685860395432,
0.13249701261520386,
-0.19443918764591217,
-0.040899429470300674,
0.07547131180763245,
-0.04706265404820442,
-0.11716531962156296,
0.08500263094902039,
0.007305088918656111,
-0.13068830966949463,
0.02335016056895256,
-0.01591906137764454,
0.11393614858388901,
-0.11107517778873444,
0.08079040050506592,
-0.11890203505754471,
-0.08969497680664062,
-0.0024596217554062605,
0.08675013482570648,
-0.25016066431999207,
0.2085849940776825,
-0.010405858047306538,
-0.05011548101902008,
-0.09866302460432053,
-0.005153730046004057,
0.012952320277690887,
0.10962537676095963,
0.11347955465316772,
-0.017160089686512947,
0.04657955467700958,
0.023680271580815315,
-0.0716506689786911,
0.03091360069811344,
0.07532316446304321,
-0.07377494126558304,
-0.00210787751711905,
-0.028751103207468987,
-0.0015667197294533253,
-0.0037975553423166275,
-0.02157602645456791,
-0.026713673025369644,
-0.17328990995883942,
0.07057955116033554,
0.04507847875356674,
0.11449800431728363,
0.03573475033044815,
-0.040371526032686234,
-0.03457089886069298,
0.2549757659435272,
0.009089363738894463,
-0.0932002142071724,
-0.08367221802473068,
-0.049158982932567596,
0.06671328842639923,
-0.05973284691572189,
0.009215520694851875,
-0.04506795480847359,
0.017958952113986015,
-0.03619559854269028,
-0.18073822557926178,
0.12886932492256165,
-0.08524782955646515,
-0.025213321670889854,
-0.04239622876048088,
0.19000285863876343,
-0.023934034630656242,
0.028936181217432022,
0.04323231056332588,
0.014129805378615856,
-0.1184040755033493,
-0.08949636667966843,
-0.007749824319034815,
0.0282608512789011,
0.008089686743915081,
0.04343842715024948,
-0.028428029268980026,
-0.056490883231163025,
-0.060577694326639175,
0.0005143927992321551,
0.3266129791736603,
0.12531548738479614,
-0.04014194756746292,
0.1638588160276413,
0.102069191634655,
-0.04681462422013283,
-0.24390307068824768,
-0.09674343466758728,
-0.0685282051563263,
-0.035416919738054276,
-0.09263017773628235,
-0.18166717886924744,
0.10428355634212494,
-0.03567168861627579,
-0.009429809637367725,
0.061497658491134644,
-0.27015578746795654,
-0.11015525460243225,
0.2084948718547821,
-0.019251808524131775,
0.4108446538448334,
-0.09035273641347885,
-0.08560672402381897,
-0.034404605627059937,
-0.18611980974674225,
0.20530439913272858,
-0.0176062174141407,
0.12445810437202454,
-0.012248687446117401,
0.15911716222763062,
0.05640494450926781,
0.006764915306121111,
0.07689318805932999,
0.0375145822763443,
-0.05476841330528259,
-0.08420978486537933,
-0.08418436348438263,
-0.03343486785888672,
0.030808717012405396,
0.037713974714279175,
-0.018507065251469612,
0.03895849734544754,
-0.13813582062721252,
-0.060191020369529724,
-0.08382532000541687,
0.04864807426929474,
0.03709883987903595,
-0.09080565720796585,
0.022937729954719543,
-0.051426008343696594,
-0.013946615159511566,
0.009243153035640717,
0.18950949609279633,
-0.09211190789937973,
0.15301349759101868,
0.10249796509742737,
0.1602579802274704,
-0.15450111031532288,
0.0073991394601762295,
-0.045979246497154236,
-0.053626302629709244,
0.0723934918642044,
-0.07310688495635986,
0.016686048358678818,
0.10177405178546906,
-0.020466594025492668,
0.08702471852302551,
0.10455366969108582,
0.014056392014026642,
0.014223670586943626,
0.07381457090377808,
-0.25049594044685364,
-0.07693623006343842,
-0.09078479558229446,
-0.003200283506885171,
0.06433140486478806,
0.10453234612941742,
0.21748274564743042,
-0.013566804118454456,
-0.04283350706100464,
0.0010352939134463668,
0.00625515915453434,
-0.03756700083613396,
0.06964551657438278,
-0.014319177716970444,
0.017711857333779335,
-0.15306653082370758,
0.05433988198637962,
0.012669985182583332,
-0.09580761939287186,
0.011869417503476143,
0.144877091050148,
-0.10455725342035294,
-0.1272098422050476,
-0.09316560626029968,
0.14136214554309845,
-0.14257429540157318,
0.02615646831691265,
-0.024316754192113876,
-0.14349636435508728,
0.06975545734167099,
0.09523206949234009,
0.051848169416189194,
0.0765124261379242,
-0.12125283479690552,
-0.03275693207979202,
-0.02243814989924431,
-0.009171457961201668,
0.04990307241678238,
-0.016414111480116844,
-0.0595385767519474,
0.0709061548113823,
-0.04528675973415375,
0.1117570698261261,
-0.09652837365865707,
-0.10905763506889343,
-0.14447017014026642,
0.035887088626623154,
-0.10863792151212692,
-0.09975949674844742,
-0.11079566180706024,
-0.04249405115842819,
-0.007833102717995644,
-0.039758577942848206,
-0.03923891484737396,
-0.042374469339847565,
-0.11342903226613998,
0.031014585867524147,
-0.04215342923998833,
0.022732378914952278,
-0.06524349004030228,
0.024729691445827484,
0.05656211078166962,
-0.03480573743581772,
0.1618380844593048,
0.1542748361825943,
-0.10700564831495285,
0.07503072917461395,
-0.1329660266637802,
-0.07290257513523102,
0.10728160291910172,
0.005636665504425764,
0.07303185760974884,
0.03020731918513775,
0.004278136882930994,
0.059855442494153976,
0.05505819991230965,
0.041162483394145966,
0.058300044387578964,
-0.07535979151725769,
0.013305122964084148,
-0.015411239117383957,
-0.12322933226823807,
-0.04696159437298775,
-0.024948565289378166,
0.03553123027086258,
0.03812912479043007,
0.11296061426401138,
-0.04572174325585365,
0.08800002932548523,
-0.07006838917732239,
0.0349404402077198,
0.006336439400911331,
-0.16684605181217194,
-0.01303866133093834,
-0.08324834704399109,
0.054168712347745895,
0.003874107962474227,
0.19242747128009796,
0.03083484247326851,
0.005074565764516592,
0.01857074163854122,
0.07604409009218216,
0.08988475799560547,
0.006509547121822834,
0.20836439728736877,
0.12420791387557983,
-0.05382644385099411,
-0.07717001438140869,
0.10774771869182587,
0.03978855535387993,
0.08906969428062439,
0.12361247837543488,
-0.024730421602725983,
-0.030498331412672997,
0.09305144101381302,
-0.002418916206806898,
0.03155044466257095,
-0.10859871655702591,
-0.1559450775384903,
0.002088764449581504,
0.05985340103507042,
-0.04688568040728569,
0.10144050419330597,
0.1355113685131073,
-0.0257203858345747,
0.030315903946757317,
0.010771372355520725,
-0.0693720132112503,
-0.18811063468456268,
-0.16303007304668427,
-0.07321853935718536,
-0.1456780731678009,
-0.0010391600662842393,
-0.14742250740528107,
0.026323698461055756,
-0.005462203174829483,
0.08112800121307373,
-0.05216994509100914,
0.07045230269432068,
0.04388967528939247,
-0.10776453465223312,
0.07453998178243637,
-0.026552090421319008,
0.08100797981023788,
-0.05688752233982086,
0.020254475995898247,
-0.092989981174469,
0.048274729400873184,
-0.004330832045525312,
0.027424659579992294,
-0.07039772719144821,
0.014993264339864254,
-0.10928340256214142,
-0.09682200849056244,
-0.059809017926454544,
0.052349694073200226,
-0.014722905121743679,
0.16540159285068512,
0.029818788170814514,
-0.036255598068237305,
0.03984658047556877,
0.27960652112960815,
-0.09042903035879135,
-0.09938719123601913,
-0.07346504181623459,
0.22644560039043427,
0.015965810045599937,
0.11177685856819153,
-0.006199236493557692,
0.0029834015294909477,
-0.08800560981035233,
0.3005940020084381,
0.3068792521953583,
-0.1137877106666565,
0.01951170153915882,
0.02053114026784897,
0.033440060913562775,
0.10175138711929321,
0.07465256750583649,
0.10524051636457443,
0.3031412959098816,
-0.07784409075975418,
-0.015214548446238041,
0.0016938613262027502,
-0.037261322140693665,
-0.06472985446453094,
0.055700432509183884,
0.06474675238132477,
-0.0791921615600586,
-0.011798587627708912,
0.11328379064798355,
-0.2963547706604004,
0.12791521847248077,
-0.1831689029932022,
-0.19182266294956207,
-0.08072139322757721,
-0.011356024071574211,
0.07030326128005981,
0.03915482014417648,
0.10506898164749146,
0.0070085772313177586,
-0.06465279310941696,
0.055598184466362,
0.032765720039606094,
-0.2108049988746643,
0.0043840170837938786,
0.09294254332780838,
-0.07355425506830215,
-0.033761121332645416,
-0.021369922906160355,
0.05585765093564987,
0.0726432353258133,
0.058077625930309296,
-0.03144989535212517,
0.04147079586982727,
-0.018611937761306763,
-0.04767528176307678,
0.044379837810993195,
0.05118147283792496,
0.021311096847057343,
-0.06206592544913292,
0.07549527287483215,
-0.1717899590730667,
0.0535724051296711,
-0.016312379390001297,
-0.027859320864081383,
-0.02312055230140686,
0.01578165777027607,
-0.05608862265944481,
0.07583457231521606,
0.09817071259021759,
-0.014204797334969044,
-0.0108909010887146,
-0.016017073765397072,
-0.023940768092870712,
-0.013564666733145714,
-0.06358545273542404,
-0.10549838840961456,
-0.14770179986953735,
-0.11236501485109329,
0.08373946696519852,
0.0009978330926969647,
-0.1803843379020691,
-0.00041061025694943964,
-0.11307556182146072,
0.036461442708969116,
-0.12302865833044052,
0.08757439255714417,
0.10424633324146271,
0.019965240731835365,
0.004368630703538656,
0.006793509237468243,
0.049096278846263885,
0.08629801124334335,
-0.1422823816537857,
-0.10053034871816635
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-dutch-cased-finetuned-gv
This model is a fine-tuned version of [GroNLP/bert-base-dutch-cased](https://huggingface.co/GroNLP/bert-base-dutch-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7837
## Model description
More information needed
## Intended uses & limitations
More information needed
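While this section has not been filled in yet, a minimal usage sketch is given below. It assumes the checkpoint is used through the `fill-mask` pipeline under this repository id; the Dutch example sentence is purely illustrative.

```python
from transformers import pipeline

# Minimal sketch (assumption): load this checkpoint for masked-token prediction.
fill_mask = pipeline("fill-mask", model="Pyjay/bert-base-dutch-cased-finetuned-gv")

# Illustrative Dutch sentence with a single [MASK] token.
predictions = fill_mask("Het weer is vandaag erg [MASK].")
for pred in predictions:
    print(pred["token_str"], round(pred["score"], 3))
```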
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
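
A sketch of how these values might be wired into the Hugging Face `Trainer` for masked-language-model fine-tuning is shown below. The toy corpus and tokenization step are assumptions, since the card does not name the actual training data; the Adam betas/epsilon and the linear schedule listed above are the `TrainingArguments` defaults and therefore need no extra flags.

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("GroNLP/bert-base-dutch-cased")
model = AutoModelForMaskedLM.from_pretrained("GroNLP/bert-base-dutch-cased")

# Placeholder corpus (assumption) standing in for the undisclosed training data.
texts = ["Dit is een voorbeeldzin.", "Nog een korte Nederlandse zin."]
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)
dataset = Dataset.from_dict({"text": texts}).map(tokenize, batched=True, remove_columns=["text"])

# Mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="bert-base-dutch-cased-finetuned-gv",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=3.0,
    seed=42,
)

# Standard masked-LM objective: 15% of tokens are selected for prediction.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=data_collator,
    train_dataset=dataset,
    eval_dataset=dataset,  # the real setup presumably used a held-out split
)
trainer.train()
```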
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.4741 | 1.0 | 2603 | 1.8404 |
| 1.2384 | 2.0 | 5206 | 1.8457 |
| 1.2121 | 3.0 | 7809 | 1.7837 |
### Framework versions
- Transformers 4.9.0
- Pytorch 1.9.0+cu102
- Datasets 1.10.2
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "model_index": [{"name": "bert-base-dutch-cased-finetuned-gv", "results": [{"task": {"name": "Masked Language Modeling", "type": "fill-mask"}}]}]} | fill-mask | Pyjay/bert-base-dutch-cased-finetuned-gv | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #bert #fill-mask #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us
| bert-base-dutch-cased-finetuned-gv
==================================
This model is a fine-tuned version of GroNLP/bert-base-dutch-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 1.7837
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.9.0
* Pytorch 1.9.0+cu102
* Datasets 1.10.2
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.10.2\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #bert #fill-mask #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.10.2\n* Tokenizers 0.10.3"
] | [
47,
98,
4,
35
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #fill-mask #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.9.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.10.2\n* Tokenizers 0.10.3"
] | [
-0.10416870564222336,
0.02868482656776905,
-0.0017935760552063584,
0.11453021317720413,
0.1960303783416748,
0.03731715679168701,
0.11631033569574356,
0.09527730196714401,
-0.11093778163194656,
0.038880541920661926,
0.13148410618305206,
0.14918896555900574,
0.0035096679348498583,
0.1080155298113823,
-0.04148118570446968,
-0.2648707628250122,
-0.026910237967967987,
0.0353105366230011,
-0.09945442527532578,
0.122718945145607,
0.07000605016946793,
-0.15979871153831482,
0.07356705516576767,
-0.004719045013189316,
-0.24258586764335632,
0.013920930214226246,
0.03471226990222931,
-0.059188954532146454,
0.14511723816394806,
0.00481057446449995,
0.17413429915905,
-0.004370466340333223,
0.10548147559165955,
-0.1463535875082016,
0.016424117609858513,
0.06373313814401627,
0.010865571908652782,
0.0792321115732193,
0.04810978099703789,
0.0039216214790940285,
0.10374992340803146,
-0.10522224009037018,
0.06985163688659668,
0.0032310844399034977,
-0.13300751149654388,
-0.20275476574897766,
-0.07260729372501373,
-0.008144739083945751,
0.05067618191242218,
0.10069606453180313,
-0.005702553782612085,
0.16379395127296448,
-0.10876481235027313,
0.09584622830152512,
0.237474724650383,
-0.26922595500946045,
-0.09250853955745697,
0.031113136559724808,
0.011925258673727512,
0.068922258913517,
-0.11561012268066406,
-0.00767065305262804,
0.05165446549654007,
0.06003456190228462,
0.12994489073753357,
-0.028938550502061844,
-0.09540928155183792,
0.01627345196902752,
-0.14494499564170837,
0.002938835183158517,
0.03307881951332092,
0.02024364471435547,
-0.02069713920354843,
-0.00628255307674408,
-0.07105974853038788,
-0.15732955932617188,
-0.05044572427868843,
-0.029139816761016846,
0.042802758514881134,
-0.06945626437664032,
-0.12009967118501663,
0.007590710651129484,
-0.09926695376634598,
-0.06929533183574677,
-0.0773807018995285,
0.18617603182792664,
0.03602402284741402,
0.021420419216156006,
-0.04168672114610672,
0.09798207879066467,
-0.021048622205853462,
-0.14976102113723755,
0.04828618839383125,
0.03825540095567703,
-0.034639872610569,
-0.05716363340616226,
-0.07932250201702118,
-0.11103212833404541,
0.002852043369784951,
0.08424384891986847,
-0.03970058262348175,
0.046966616064310074,
0.04167024791240692,
0.03991372138261795,
-0.103887178003788,
0.1993483304977417,
-0.052997153252363205,
-0.03087778016924858,
0.0024615535512566566,
0.05244796350598335,
0.0021718163043260574,
-0.017872100695967674,
-0.10187827795743942,
0.008420445956289768,
0.10061539709568024,
-0.0028078549075871706,
-0.0672939121723175,
0.05898120999336243,
-0.03466563671827316,
-0.007061879616230726,
-0.027988377958536148,
-0.09596121311187744,
0.05186628922820091,
-0.008725696243345737,
-0.07925685495138168,
0.008920153602957726,
0.02323116734623909,
0.01129673607647419,
-0.00323474477045238,
0.17928552627563477,
-0.09989379346370697,
0.04789409413933754,
-0.12199428677558899,
-0.12495589256286621,
0.004226445686072111,
-0.07771758735179901,
0.015013870783150196,
-0.09312371164560318,
-0.12335502356290817,
-0.011796001344919205,
0.06661448627710342,
-0.03243676945567131,
-0.019360652193427086,
-0.03795095533132553,
-0.08217793703079224,
0.010726716369390488,
-0.009115934371948242,
0.1682180017232895,
-0.04952092841267586,
0.11753058433532715,
0.05494574457406998,
0.09111308306455612,
-0.06243479251861572,
0.04519815742969513,
-0.08083993941545486,
0.004057641141116619,
-0.2332674115896225,
0.02512192912399769,
-0.04591185599565506,
0.0635114461183548,
-0.05971100181341171,
-0.12016353011131287,
-0.000010331471457902808,
-0.005799963138997555,
0.10791383683681488,
0.08941619843244553,
-0.1892310529947281,
-0.08874625712633133,
0.159417524933815,
-0.05521278455853462,
-0.08267873525619507,
0.1257084608078003,
-0.06327029317617416,
0.015852145850658417,
0.07346449792385101,
0.12698757648468018,
0.03973172977566719,
-0.11692299693822861,
0.02019921876490116,
-0.03191295266151428,
0.060209088027477264,
-0.05258336663246155,
0.03850633651018143,
0.014753367751836777,
0.016762625426054,
0.02650560438632965,
-0.02254411019384861,
0.06424954533576965,
-0.12323445081710815,
-0.09043493866920471,
-0.03372259438037872,
-0.10749072581529617,
0.0755016952753067,
0.08271583169698715,
0.09356234222650528,
-0.10179772973060608,
-0.08118852227926254,
0.0781160444021225,
0.04769628867506981,
-0.045230548828840256,
0.026504168286919594,
-0.06076684594154358,
0.06542646884918213,
-0.06657836586236954,
-0.02951158583164215,
-0.19264079630374908,
-0.05334989354014397,
-0.003109449055045843,
0.03337964788079262,
0.0309227854013443,
0.012134416028857231,
0.09826308488845825,
0.08110041171312332,
-0.06330759078264236,
0.0028514997102320194,
-0.045725442469120026,
-0.008011016994714737,
-0.1576050966978073,
-0.20705801248550415,
-0.0370449461042881,
-0.016211893409490585,
0.09252927452325821,
-0.19435349106788635,
0.016796033829450607,
-0.0604209266602993,
0.0890633687376976,
0.011206467635929585,
-0.019601281732320786,
-0.0717247799038887,
0.10983636230230331,
-0.011411318555474281,
-0.0487133264541626,
0.05943366140127182,
-0.020000291988253593,
-0.08201860636472702,
-0.08100414276123047,
-0.1126246303319931,
0.18951883912086487,
0.13827316462993622,
-0.15502974390983582,
-0.10187577456235886,
0.044042572379112244,
-0.06232598051428795,
-0.020616156980395317,
-0.06057468429207802,
0.05056897550821304,
0.1880485713481903,
0.003412766382098198,
0.13672234117984772,
-0.05068511515855789,
-0.03278718888759613,
0.029443303123116493,
-0.0310433991253376,
0.04307908937335014,
0.0897153839468956,
0.15059712529182434,
-0.03940613195300102,
0.11778322607278824,
0.1509907841682434,
-0.13500353693962097,
0.1329699009656906,
-0.010110517963767052,
-0.08420467376708984,
-0.019560277462005615,
-0.04399570822715759,
0.013255658559501171,
0.1273476928472519,
-0.11582465469837189,
-0.01773892343044281,
0.0002416743227513507,
0.007166935596615076,
0.022181224077939987,
-0.23684905469417572,
-0.051916737109422684,
0.030692225322127342,
-0.010813366621732712,
-0.02192240208387375,
-0.018059924244880676,
0.01964404806494713,
0.11857054382562637,
0.004721351433545351,
-0.07392850518226624,
0.017444796860218048,
0.0016925475792959332,
-0.06208620220422745,
0.21213039755821228,
-0.06676346063613892,
-0.10924061387777328,
-0.09383472055196762,
-0.08615770936012268,
-0.038485895842313766,
0.01526541355997324,
0.033705465495586395,
-0.1196124404668808,
-0.023631976917386055,
-0.029333680868148804,
0.022084373980760574,
0.019491953775286674,
0.07107469439506531,
0.008586333133280277,
-0.01912618987262249,
0.07496349513530731,
-0.09702760726213455,
-0.0049941278994083405,
-0.0754212886095047,
-0.08032887428998947,
0.05771036446094513,
0.07464209198951721,
0.1312241554260254,
0.16566209495067596,
-0.04243149608373642,
0.00475239148363471,
-0.016949454322457314,
0.24275775253772736,
-0.08115217834711075,
-0.03862946480512619,
0.09871209412813187,
-0.029973233118653297,
0.05294471234083176,
0.10050763189792633,
0.08532054722309113,
-0.09570303559303284,
0.008636540733277798,
0.0423126220703125,
-0.049345292150974274,
-0.18880756199359894,
-0.0320916622877121,
-0.057156212627887726,
-0.061752140522003174,
0.08368566632270813,
0.014280391857028008,
0.02554192766547203,
0.06510069966316223,
0.06277455389499664,
0.09555092453956604,
-0.07628457248210907,
0.037361424416303635,
0.06086978688836098,
0.050236862152814865,
0.13052695989608765,
-0.02676234021782875,
-0.09754843264818192,
0.010872794315218925,
-0.045429982244968414,
0.22133207321166992,
-0.0025950060226023197,
0.06370116770267487,
0.049043331295251846,
0.1911393254995346,
-0.0016745758475735784,
0.09473666548728943,
0.006538813002407551,
-0.07510871440172195,
0.0040976135060191154,
-0.04555337131023407,
-0.034210413694381714,
0.009465453214943409,
-0.016096729785203934,
0.06593941152095795,
-0.12345859408378601,
-0.0038542705588042736,
0.04752412810921669,
0.23847170174121857,
0.03327656537294388,
-0.3270232379436493,
-0.07426319271326065,
-0.012203285470604897,
-0.022330092266201973,
-0.00419027591124177,
-0.004094818141311407,
0.1222047507762909,
-0.08553053438663483,
0.03117891028523445,
-0.07347670197486877,
0.08326739072799683,
0.0025715141091495752,
0.04959415644407272,
0.06683506816625595,
0.12858504056930542,
-0.00578814372420311,
0.04989451915025711,
-0.30476075410842896,
0.29165324568748474,
0.006735708564519882,
0.09782925248146057,
-0.08185607939958572,
-0.010690880008041859,
0.039896395057439804,
0.02933371253311634,
0.041417431086301804,
-0.020869426429271698,
-0.009260797873139381,
-0.21770630776882172,
-0.03974118456244469,
0.041102271527051926,
0.12905465066432953,
-0.0031363784801214933,
0.10954775661230087,
-0.005851256661117077,
-0.0079354802146554,
0.08167478442192078,
0.007727944757789373,
-0.06524090468883514,
-0.06889476627111435,
-0.020920023322105408,
0.008913977071642876,
-0.09514705091714859,
-0.055419620126485825,
-0.13345029950141907,
-0.1390269547700882,
0.14656580984592438,
0.024426240473985672,
-0.0024459389969706535,
-0.12363003194332123,
0.12026125192642212,
0.08943556994199753,
-0.08186034113168716,
0.041643884032964706,
0.018901217728853226,
0.05956339091062546,
0.026190416887402534,
-0.06520816683769226,
0.11533936858177185,
-0.0662451982498169,
-0.14882878959178925,
-0.07729712128639221,
0.0813736841082573,
0.05193546414375305,
0.07845402508974075,
-0.02123054675757885,
0.02934906631708145,
-0.013146600686013699,
-0.08319361507892609,
0.04582541808485985,
-0.03534872457385063,
0.05324462056159973,
0.02825833484530449,
-0.04303828999400139,
0.0013924702070653439,
-0.05157936364412308,
0.0019079741323366761,
0.17473548650741577,
0.23967978358268738,
-0.09315288811922073,
-0.009791984222829342,
0.030079662799835205,
-0.051762811839580536,
-0.20696324110031128,
0.09228701144456863,
0.08556065708398819,
0.021174201741814613,
0.05760708823800087,
-0.15551121532917023,
0.15484270453453064,
0.0924990251660347,
0.0008827873971313238,
0.1284046769142151,
-0.30996495485305786,
-0.13546781241893768,
0.10553540289402008,
0.17389467358589172,
0.14830611646175385,
-0.14276757836341858,
-0.005759868770837784,
-0.02624211460351944,
-0.10369052737951279,
0.07826905697584152,
-0.08276035636663437,
0.12086597830057144,
-0.01811765879392624,
0.10271121561527252,
0.010336626321077347,
-0.07573084533214569,
0.10009902715682983,
-0.0015047406777739525,
0.09813780337572098,
-0.06380554288625717,
-0.05122731253504753,
0.04240002855658531,
-0.025525903329253197,
-0.028841154649853706,
-0.032552264630794525,
0.005489796865731478,
-0.03596137464046478,
-0.01350586861371994,
-0.09751008450984955,
0.04139583557844162,
-0.029522711411118507,
-0.05961140617728233,
-0.017950143665075302,
0.024953685700893402,
0.043855275958776474,
-0.021473029628396034,
0.095500148832798,
0.010934596881270409,
0.1886860579252243,
0.0585058331489563,
0.05809527635574341,
-0.07421340048313141,
-0.0535779632627964,
0.012154173105955124,
-0.01050878781825304,
0.06674724817276001,
-0.10805217176675797,
0.014301265589892864,
0.1490686535835266,
0.03406580537557602,
0.11110962182283401,
0.09696155786514282,
-0.0253138467669487,
0.02195722796022892,
0.08124212175607681,
-0.16325442492961884,
-0.06123119965195656,
0.013621469959616661,
-0.08792372792959213,
-0.11804108321666718,
0.044672295451164246,
0.07631179690361023,
-0.07562766224145889,
-0.0005944695440120995,
-0.015946567058563232,
-0.01757596991956234,
-0.08200222253799438,
0.22991672158241272,
0.06860245764255524,
0.04322993382811546,
-0.09740076959133148,
0.03566915914416313,
0.04475540667772293,
-0.0865761935710907,
0.000031868457881500944,
0.08376199007034302,
-0.06250083446502686,
-0.011060926131904125,
0.11923561990261078,
0.2031702846288681,
-0.04405314102768898,
-0.010277178138494492,
-0.15111562609672546,
-0.10675718635320663,
0.06294477730989456,
0.20126332342624664,
0.09928874671459198,
-0.012566950172185898,
-0.060252029448747635,
0.04468844085931778,
-0.14273612201213837,
0.06540613621473312,
0.04863647744059563,
0.08038973808288574,
-0.11928488314151764,
0.21378685534000397,
0.0031481776386499405,
0.05156838148832321,
-0.03549792617559433,
0.041516512632369995,
-0.11408154666423798,
0.024504046887159348,
-0.11905800551176071,
-0.05981198698282242,
-0.006096096243709326,
-0.022077644243836403,
-0.0028171814046800137,
-0.07460960000753403,
-0.06864092499017715,
-0.00012837073882110417,
-0.1267959028482437,
-0.028732461854815483,
0.045072488486766815,
0.006945312023162842,
-0.1193162128329277,
-0.03743574768304825,
0.02733756974339485,
-0.056103404611349106,
0.04279834404587746,
0.05887634679675102,
0.020431535318493843,
0.08068010956048965,
-0.176357701420784,
-0.036549147218465805,
0.06111890450119972,
-0.0008975841919891536,
0.1011032834649086,
-0.03718985617160797,
-0.011698327027261257,
-0.013627012260258198,
0.10983957350254059,
0.023104719817638397,
0.0766916498541832,
-0.13529376685619354,
0.008137134835124016,
-0.03142014145851135,
-0.10225238651037216,
-0.05506172403693199,
0.0154444994404912,
0.07115865498781204,
0.009943617507815361,
0.17614665627479553,
-0.10007571429014206,
0.07106710225343704,
-0.21671901643276215,
-0.009785722941160202,
-0.01820845529437065,
-0.09070399403572083,
-0.09804287552833557,
-0.05205312371253967,
0.08979342877864838,
-0.05473289266228676,
0.10927208513021469,
0.027863578870892525,
0.08167310059070587,
0.030908741056919098,
-0.030851703137159348,
-0.00020115847291890532,
0.027712378650903702,
0.19562655687332153,
0.05033518746495247,
-0.04905720800161362,
0.06377633661031723,
0.08235683292150497,
0.10689575225114822,
0.11612473428249359,
0.2494431734085083,
0.13769792020320892,
0.00479986984282732,
0.09976983815431595,
0.03459687530994415,
-0.0528223030269146,
-0.14862576127052307,
0.00530048506334424,
-0.07940389215946198,
0.08931348472833633,
-0.02559768222272396,
0.18254080414772034,
0.053672902286052704,
-0.16025522351264954,
0.04490147903561592,
-0.06661848723888397,
-0.10109811276197433,
-0.10663332790136337,
0.0008108383626677096,
-0.08088849484920502,
-0.125822052359581,
0.01999427191913128,
-0.09603998064994812,
0.021396925672888756,
0.11681027710437775,
0.013774064369499683,
-0.014195085503160954,
0.22389623522758484,
0.03210490569472313,
0.05747838318347931,
0.05721547454595566,
0.01665268838405609,
-0.009893476963043213,
-0.07265272736549377,
-0.062408410012722015,
-0.04147135838866234,
-0.008752593770623207,
0.03843417763710022,
-0.07554889470338821,
-0.10245194286108017,
0.04920671135187149,
0.0006436549592763186,
-0.11545142531394958,
0.0250560212880373,
0.0134214973077178,
0.08038881421089172,
0.02886802703142166,
0.0013017789460718632,
0.029758771881461143,
-0.03939750790596008,
0.19135889410972595,
-0.0946003869175911,
-0.08865116536617279,
-0.09919420629739761,
0.27595674991607666,
0.03796280175447464,
-0.0034534167498350143,
0.009415767155587673,
-0.06585882604122162,
-0.0053132385946810246,
0.24821844696998596,
0.20903006196022034,
-0.12960878014564514,
0.00018973072292283177,
0.007651051040738821,
-0.01216838601976633,
-0.048947978764772415,
0.13739798963069916,
0.12581150233745575,
0.07195891439914703,
-0.10859408229589462,
-0.0520414374768734,
-0.061438027769327164,
-0.016164101660251617,
-0.04255646839737892,
0.028553064912557602,
0.06514755636453629,
0.024607816711068153,
-0.05775150656700134,
0.06311190873384476,
-0.06647990643978119,
-0.12410445511341095,
0.09157954156398773,
-0.2274026721715927,
-0.17334946990013123,
-0.008075359277427197,
0.11729436367750168,
-0.0019106435356661677,
0.08328032493591309,
-0.028730472549796104,
-0.003289751475676894,
0.05314198508858681,
-0.02409638836979866,
-0.05741436406970024,
-0.09441334754228592,
0.10127303749322891,
-0.1032259613275528,
0.19946379959583282,
-0.04488328844308853,
0.07231160253286362,
0.12222351878881454,
0.07862439006567001,
-0.04165499284863472,
0.05079738050699234,
0.04581068828701973,
-0.12779204547405243,
0.009650387801229954,
0.13441316783428192,
-0.040038879960775375,
0.027940871194005013,
0.04075118154287338,
-0.13831143081188202,
0.03959493339061737,
-0.09944911301136017,
-0.03755798935890198,
-0.03413943573832512,
-0.04271988570690155,
-0.05706609785556793,
0.11642289906740189,
0.24008744955062866,
-0.006424346938729286,
0.028045551851391792,
-0.07942261546850204,
0.0061412230134010315,
0.05484328791499138,
0.06354586035013199,
-0.11634671688079834,
-0.2563081681728363,
0.01598319783806801,
0.045006148517131805,
-0.042532265186309814,
-0.23607465624809265,
-0.09508538991212845,
0.014705062843859196,
-0.0785178691148758,
-0.08634442836046219,
0.07473259419202805,
0.07163660228252411,
0.061101969331502914,
-0.04880274832248688,
-0.11316004395484924,
-0.07035776227712631,
0.15810780227184296,
-0.16609163582324982,
-0.09227148443460464
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# gpt2-medium-dutch-finetuned-text-generation
This model is a fine-tuned version of [GroNLP/gpt2-medium-dutch-embeddings](https://huggingface.co/GroNLP/gpt2-medium-dutch-embeddings) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.9268
## Model description
More information needed
## Intended uses & limitations
More information needed
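This section has not been filled in yet; a minimal sketch of loading the checkpoint for Dutch text generation might look like the following. The prompt and the generation settings are illustrative assumptions, not values taken from the card.

```python
from transformers import pipeline

# Minimal sketch (assumption): load this checkpoint through the text-generation pipeline.
generator = pipeline("text-generation", model="Pyjay/gpt2-medium-dutch-finetuned-text-generation")

# Illustrative Dutch prompt; max_length counts prompt plus generated tokens.
outputs = generator(
    "Het was een koude ochtend in Amsterdam en",
    max_length=60,
    do_sample=True,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```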
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
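
A sketch of how these values could map onto `TrainingArguments` for causal-language-model fine-tuning is given below. The toy corpus is an assumption (the card does not name the training data); the key point for a GPT-2 checkpoint is the collator with `mlm=False`, which copies the inputs as labels instead of masking them.

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("GroNLP/gpt2-medium-dutch-embeddings")
model = AutoModelForCausalLM.from_pretrained("GroNLP/gpt2-medium-dutch-embeddings")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

# Placeholder corpus (assumption) standing in for the undisclosed training data.
texts = ["Het was een koude ochtend in Amsterdam.", "De trein vertrok precies op tijd."]
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)
dataset = Dataset.from_dict({"text": texts}).map(tokenize, batched=True, remove_columns=["text"])

# Mirrors the hyperparameters listed above; optimizer and scheduler are the defaults.
args = TrainingArguments(
    output_dir="gpt2-medium-dutch-finetuned-text-generation",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=3.0,
    seed=42,
)

# mlm=False makes the collator emit the inputs as labels (the model shifts them internally).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=dataset,
    eval_dataset=dataset,  # the real setup presumably used a held-out split
)
trainer.train()
```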
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 394 | 4.0144 |
| 3.3633 | 2.0 | 788 | 3.9379 |
| 2.7108 | 3.0 | 1182 | 3.9268 |
### Framework versions
- Transformers 4.9.0
- Pytorch 1.9.0+cu102
- Datasets 1.10.2
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "model_index": [{"name": "gpt2-medium-dutch-finetuned-text-generation", "results": [{"task": {"name": "Causal Language Modeling", "type": "text-generation"}}]}]} | text-generation | Pyjay/gpt2-medium-dutch-finetuned-text-generation | [
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #gpt2 #text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| gpt2-medium-dutch-finetuned-text-generation
===========================================
This model is a fine-tuned version of GroNLP/gpt2-medium-dutch-embeddings on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 3.9268
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.9.0
* Pytorch 1.9.0+cu102
* Datasets 1.10.2
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.10.2\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #gpt2 #text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.9.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.10.2\n* Tokenizers 0.10.3"
] | [
58,
98,
4,
35
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #gpt2 #text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.9.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.10.2\n* Tokenizers 0.10.3"
] | [
-0.0968809574842453,
0.04838081821799278,
-0.0022084214724600315,
0.1074838787317276,
0.18029356002807617,
0.020224273204803467,
0.12615835666656494,
0.1255510151386261,
-0.12327337265014648,
0.031047385185956955,
0.1387261301279068,
0.168846994638443,
0.011243524961173534,
0.12003077566623688,
-0.05034506320953369,
-0.27236127853393555,
-0.017588555812835693,
0.06017780676484108,
-0.05197053402662277,
0.14815180003643036,
0.08348850160837173,
-0.13930056989192963,
0.0745851993560791,
-0.007452408783137798,
-0.2438432276248932,
0.000158037175424397,
0.011652300134301186,
-0.06335315853357315,
0.1537589132785797,
0.017171381041407585,
0.1283966600894928,
0.011281285434961319,
0.07079468667507172,
-0.15958987176418304,
0.012328870594501495,
0.04669899120926857,
0.018455607816576958,
0.0910767912864685,
0.06242996081709862,
-0.005089402664452791,
0.144361212849617,
-0.07449464499950409,
0.043465182185173035,
0.008084261789917946,
-0.1332104355096817,
-0.18231071531772614,
-0.0683756172657013,
0.01394125446677208,
0.05248939245939255,
0.11792676895856857,
-0.01967742294073105,
0.13544119894504547,
-0.07985319197177887,
0.10739997774362564,
0.24161851406097412,
-0.2647187113761902,
-0.07129999995231628,
0.02298559434711933,
0.029397407546639442,
0.09525179862976074,
-0.10476351529359818,
-0.011085706762969494,
0.05329430103302002,
0.054036419838666916,
0.12826991081237793,
-0.03030559793114662,
-0.0954371988773346,
0.015758628025650978,
-0.1501684933900833,
-0.0385686531662941,
0.12410428375005722,
0.007753877900540829,
-0.019874105229973793,
-0.044953349977731705,
-0.06745290756225586,
-0.17487631738185883,
-0.0372033566236496,
-0.01927385851740837,
0.020413829013705254,
-0.0426168330013752,
-0.09334926307201385,
-0.031027210876345634,
-0.11102559417486191,
-0.07381535321474075,
-0.07019360363483429,
0.15673354268074036,
0.03967665880918503,
-0.001110552460886538,
-0.04342321306467056,
0.12320669740438461,
0.0010631104232743382,
-0.12891937792301178,
0.03647216036915779,
0.02920694835484028,
0.016614452004432678,
-0.044719256460666656,
-0.07313033938407898,
-0.09569055587053299,
-0.0026567738968878984,
0.07576175034046173,
-0.07384514808654785,
0.04334881529211998,
0.027632877230644226,
0.045924898236989975,
-0.09181801974773407,
0.20079739391803741,
-0.03575318306684494,
0.0044693597592413425,
0.0015545680653303862,
0.05552930757403374,
-0.0074670701287686825,
-0.02463798224925995,
-0.1165943443775177,
-0.008930987678468227,
0.13835695385932922,
0.014458195306360722,
-0.07115695625543594,
0.07896006852388382,
-0.03760577365756035,
-0.026990044862031937,
-0.019023053348064423,
-0.09547294676303864,
0.024844501167535782,
-0.010169122368097305,
-0.08364945650100708,
0.008752490393817425,
0.020869221538305283,
0.007409443147480488,
-0.04719710350036621,
0.1178654134273529,
-0.08561715483665466,
0.045899130403995514,
-0.1012391522526741,
-0.11787808686494827,
-0.0011783071095123887,
-0.09191858768463135,
0.009544625878334045,
-0.09702087938785553,
-0.16207344830036163,
-0.02572186477482319,
0.03041957877576351,
-0.027464544400572777,
-0.04818375036120415,
-0.07425052672624588,
-0.07252616435289383,
0.006874692626297474,
-0.019067242741584778,
0.1431574672460556,
-0.052320174872875214,
0.13155071437358856,
0.04781462624669075,
0.07374417036771774,
-0.04681232199072838,
0.06044568493962288,
-0.08684036135673523,
-0.004376135766506195,
-0.18883053958415985,
0.07438929378986359,
-0.024824263527989388,
0.062399476766586304,
-0.07497794926166534,
-0.11849290877580643,
-0.0014988826587796211,
0.006066313479095697,
0.09063665568828583,
0.10577886551618576,
-0.15830086171627045,
-0.09409012645483017,
0.1847926825284958,
-0.05775841698050499,
-0.09565398097038269,
0.12638366222381592,
-0.061221182346343994,
0.06738311052322388,
0.08819577842950821,
0.18772511184215546,
0.054784469306468964,
-0.06205975264310837,
0.029053671285510063,
-0.017054982483386993,
0.048719193786382675,
-0.050013717263936996,
0.04409599304199219,
0.004940487910062075,
0.005359542556107044,
0.03495967015624046,
-0.01206792052835226,
0.06661819666624069,
-0.11579190939664841,
-0.0795639157295227,
-0.03604405000805855,
-0.0949895828962326,
0.06571987271308899,
0.07014185935258865,
0.10663152486085892,
-0.09841524064540863,
-0.07134969532489777,
0.06367926299571991,
0.06328251957893372,
-0.08359314501285553,
0.03536152094602585,
-0.048797011375427246,
0.07428537309169769,
-0.04511095583438873,
-0.01405406091362238,
-0.2013300061225891,
-0.01583746448159218,
0.009481078013777733,
0.048148635774850845,
0.0321480818092823,
0.01362187135964632,
0.07751689106225967,
0.06386620551347733,
-0.0493306964635849,
-0.013733058236539364,
-0.01774325780570507,
-0.02373911440372467,
-0.1476113349199295,
-0.16727885603904724,
-0.005387351382523775,
-0.01621866412460804,
0.11763466894626617,
-0.20422281324863434,
0.04020465165376663,
-0.0063419826328754425,
0.0631319060921669,
0.0008188101346604526,
-0.02044893614947796,
-0.037804264575242996,
0.0942385196685791,
-0.035266440361738205,
-0.04513046517968178,
0.08499575406312943,
0.0014006375567987561,
-0.0879705473780632,
-0.04221821948885918,
-0.13103580474853516,
0.1622016578912735,
0.13998644053936005,
-0.14641724526882172,
-0.09498952329158783,
-0.015847712755203247,
-0.05950930714607239,
-0.026326347142457962,
-0.04400845244526863,
0.023628342896699905,
0.2180345207452774,
-0.008432058617472649,
0.1613730788230896,
-0.07279659062623978,
-0.050760045647621155,
0.020433757454156876,
-0.03213118016719818,
0.04476065933704376,
0.1286415010690689,
0.1009281650185585,
-0.060121096670627594,
0.13122011721134186,
0.11618421226739883,
-0.0955747589468956,
0.16032420098781586,
-0.023428289219737053,
-0.07745732367038727,
-0.001569645944982767,
-0.009419220499694347,
-0.002545860130339861,
0.06734132766723633,
-0.15253473818302155,
-0.021465133875608444,
0.013127405196428299,
0.02313496172428131,
0.04168801009654999,
-0.233833447098732,
-0.042684782296419144,
0.03126223385334015,
-0.04296661540865898,
0.00033647214877419174,
-0.008617711253464222,
0.014937263913452625,
0.12051118910312653,
0.003506735432893038,
-0.05844040587544441,
0.030716324225068092,
0.008291760459542274,
-0.08191302418708801,
0.22016777098178864,
-0.06820365786552429,
-0.17968116700649261,
-0.11468848586082458,
-0.06987505406141281,
-0.0466962605714798,
0.01974872313439846,
0.06002657115459442,
-0.10859234631061554,
-0.007340163923799992,
-0.0577315017580986,
0.0520746149122715,
-0.011764478869736195,
0.040248624980449677,
-0.0004004125075880438,
-0.00797952339053154,
0.04577566683292389,
-0.10286561399698257,
-0.0076414369978010654,
-0.06687987595796585,
-0.07712623476982117,
0.06899993121623993,
0.02356080897152424,
0.11676233261823654,
0.17373061180114746,
-0.028130613267421722,
0.020324308425188065,
-0.03736289590597153,
0.2312426120042801,
-0.08503028005361557,
-0.03990962728857994,
0.1253521591424942,
-0.0080482866615057,
0.05517685040831566,
0.08180403709411621,
0.05780849978327751,
-0.09248632192611694,
0.016698045656085014,
0.02250823564827442,
-0.04603291675448418,
-0.2143072932958603,
-0.04166744276881218,
-0.053266603499650955,
-0.011198502965271473,
0.0943010002374649,
0.022524503991007805,
0.050913698971271515,
0.07574009150266647,
0.04893442988395691,
0.08603425323963165,
-0.037435032427310944,
0.05017288774251938,
0.11873052269220352,
0.035379309207201004,
0.13318124413490295,
-0.03651077672839165,
-0.08886048942804337,
0.033789679408073425,
-0.02770708128809929,
0.22295869886875153,
-0.024077480658888817,
0.1230563372373581,
0.04179868847131729,
0.1636088788509369,
0.013443456962704659,
0.09087906032800674,
-0.00867860484868288,
-0.04503092169761658,
-0.008929875679314137,
-0.0345541387796402,
-0.05382818728685379,
0.0013825972564518452,
-0.050512634217739105,
0.0374782420694828,
-0.1325376033782959,
-0.017583515495061874,
0.053817667067050934,
0.22524699568748474,
0.02658224105834961,
-0.33721959590911865,
-0.08883088827133179,
-0.008506341837346554,
-0.0312698595225811,
-0.02789691463112831,
0.0200171060860157,
0.11532800644636154,
-0.10898176580667496,
0.01366997230798006,
-0.07557858526706696,
0.09365701675415039,
-0.06798815727233887,
0.059713780879974365,
0.040166761726140976,
0.095800019800663,
-0.0049545238725841045,
0.08083726465702057,
-0.3257361054420471,
0.2680692970752716,
-0.000985634163953364,
0.06778408586978912,
-0.08036200702190399,
-0.014296798035502434,
0.02315160073339939,
0.041926175355911255,
0.050621677190065384,
-0.0174864511936903,
-0.007437488064169884,
-0.20068855583667755,
-0.05934029817581177,
0.03619870916008949,
0.12027201056480408,
-0.030295398086309433,
0.10227824747562408,
-0.01902461238205433,
0.01840829849243164,
0.064728744328022,
-0.06033441796898842,
-0.03878441080451012,
-0.09352041780948639,
0.0009488348150625825,
0.008433312177658081,
-0.02140820026397705,
-0.04702826216816902,
-0.11914259195327759,
-0.12367787212133408,
0.15991967916488647,
0.009847916662693024,
-0.04350047558546066,
-0.11283669620752335,
0.09875167906284332,
0.07742062211036682,
-0.08803857862949371,
0.032765086740255356,
0.017789069563150406,
0.06288176774978638,
0.026391729712486267,
-0.06177883967757225,
0.10639480501413345,
-0.0466911680996418,
-0.16045013070106506,
-0.042672500014305115,
0.11621720343828201,
0.04292681813240051,
0.06127304211258888,
-0.013503629714250565,
0.00783439539372921,
-0.04619317501783371,
-0.09831025451421738,
0.025965616106987,
-0.018784038722515106,
0.06772962212562561,
0.030380772426724434,
-0.05033741891384125,
0.029237043112516403,
-0.07748313993215561,
-0.034618254750967026,
0.2259853184223175,
0.22396761178970337,
-0.07918699085712433,
0.020758546888828278,
0.03710385784506798,
-0.07424796372652054,
-0.20305362343788147,
0.04748784378170967,
0.07359794527292252,
0.015913760289549828,
0.020279712975025177,
-0.18659909069538116,
0.08854389935731888,
0.09950388222932816,
0.005599577911198139,
0.12187132239341736,
-0.35573136806488037,
-0.13161276280879974,
0.10786670446395874,
0.15492936968803406,
0.13005992770195007,
-0.1498049795627594,
-0.02177514135837555,
-0.018633443862199783,
-0.10171748697757721,
0.10514171421527863,
-0.07084499299526215,
0.13754969835281372,
-0.029955845326185226,
0.12123697251081467,
0.0169660747051239,
-0.06272111088037491,
0.10328167676925659,
0.013394195586442947,
0.07855682075023651,
-0.07093433290719986,
-0.03217150270938873,
0.035009317100048065,
-0.030888082459568977,
0.006402698345482349,
-0.06009431183338165,
0.024277260527014732,
-0.09480813145637512,
-0.032433267682790756,
-0.09128531813621521,
0.029286310076713562,
-0.025525012984871864,
-0.060816846787929535,
-0.030131282284855843,
0.00105668930336833,
0.037471216171979904,
-0.008000986650586128,
0.09228642284870148,
-0.00738101452589035,
0.1660686880350113,
0.09609773010015488,
0.08068642765283585,
-0.06808670610189438,
-0.045417286455631256,
-0.005672798492014408,
-0.005428055766969919,
0.04929066821932793,
-0.13042734563350677,
0.019428106024861336,
0.15704408288002014,
0.02159583941102028,
0.1361190229654312,
0.09312844276428223,
-0.023033278062939644,
0.028444137424230576,
0.05963130295276642,
-0.1875690072774887,
-0.07550060003995895,
-0.02389156073331833,
-0.08489729464054108,
-0.09022857993841171,
0.05568762496113777,
0.1005341038107872,
-0.06432745605707169,
-0.012470596469938755,
-0.01842563971877098,
-0.007988614961504936,
-0.05382434278726578,
0.21537497639656067,
0.04991314560174942,
0.04720465838909149,
-0.10175786912441254,
0.051408205181360245,
0.044085972011089325,
-0.07969411462545395,
0.018484577536582947,
0.10979478061199188,
-0.08308145403862,
-0.03929618000984192,
0.08796214312314987,
0.18591448664665222,
-0.07358355820178986,
-0.021304622292518616,
-0.14547854661941528,
-0.11869736760854721,
0.08069116622209549,
0.1691070944070816,
0.10460210591554642,
0.015052302740514278,
-0.07138510048389435,
0.019115025177598,
-0.15097658336162567,
0.06990659236907959,
0.06043555960059166,
0.06294181942939758,
-0.11611523479223251,
0.20873618125915527,
0.005991923622786999,
0.05240122228860855,
-0.03374885022640228,
0.013535435311496258,
-0.10757169127464294,
0.02876712754368782,
-0.10722754895687103,
-0.038889143615961075,
-0.012339884415268898,
-0.009929650463163853,
-0.014738156460225582,
-0.050553373992443085,
-0.042204421013593674,
-0.002750361105427146,
-0.12307736277580261,
-0.018002212047576904,
0.020260320976376534,
0.02575046941637993,
-0.10629212111234665,
-0.02750333398580551,
0.01871485821902752,
-0.05788052827119827,
0.0822191834449768,
0.063925601541996,
0.003299528965726495,
0.06769858300685883,
-0.14321067929267883,
0.0035108153242617846,
0.0714004635810852,
-0.0015065608313307166,
0.054721586406230927,
-0.051858432590961456,
0.00041456936742179096,
0.0012884113239124417,
0.08710771799087524,
0.03739636391401291,
0.06809794902801514,
-0.1342652291059494,
0.01824718713760376,
-0.02873002178966999,
-0.07517924159765244,
-0.07762344926595688,
0.04398871213197708,
0.05280613154172897,
0.029226386919617653,
0.16354164481163025,
-0.0890859067440033,
0.055933576077222824,
-0.20571134984493256,
-0.0006654510507360101,
-0.005029039923101664,
-0.1257864236831665,
-0.09037864953279495,
-0.07345759868621826,
0.08319946378469467,
-0.04772999510169029,
0.12964124977588654,
0.03089127503335476,
0.04910526052117348,
0.01636013388633728,
-0.023045796900987625,
0.009418935514986515,
0.01957455463707447,
0.2120518535375595,
0.04384109750390053,
-0.04837879538536072,
0.05756014212965965,
0.07282619178295135,
0.1077648401260376,
0.12881147861480713,
0.21437640488147736,
0.11852040141820908,
-0.019087227061390877,
0.0907861739397049,
0.016934648156166077,
-0.034162405878305435,
-0.1523517668247223,
0.032178230583667755,
-0.05637222155928612,
0.09146679937839508,
-0.030803989619016647,
0.21145440638065338,
0.06596212089061737,
-0.14925655722618103,
0.04314712435007095,
-0.04434124380350113,
-0.10597866028547287,
-0.099362313747406,
-0.034841008484363556,
-0.08080180734395981,
-0.14064866304397583,
0.0033398016821593046,
-0.12112459540367126,
0.02732081525027752,
0.10755056887865067,
0.02057483047246933,
-0.03290870785713196,
0.18272846937179565,
0.05046200752258301,
0.0039812615141272545,
0.0821969285607338,
-0.0034709987230598927,
-0.009026732295751572,
-0.10862244665622711,
-0.06443892419338226,
-0.01927110180258751,
0.0066059245727956295,
0.053417809307575226,
-0.045156240463256836,
-0.07431365549564362,
0.026777468621730804,
-0.040105365216732025,
-0.10029394179582596,
0.007783655077219009,
0.034557271748781204,
0.07614147663116455,
0.0397493913769722,
0.0009746197611093521,
-0.010530339553952217,
-0.024331487715244293,
0.2295479029417038,
-0.07485515624284744,
-0.08746443688869476,
-0.0834103524684906,
0.26668182015419006,
0.040437761694192886,
-0.006091040559113026,
0.019809430465102196,
-0.059730470180511475,
0.0022582083474844694,
0.2810267210006714,
0.2064015120267868,
-0.10326974093914032,
-0.0081781642511487,
0.005786443594843149,
-0.0028667824808508158,
0.002752486616373062,
0.12669338285923004,
0.1386803388595581,
0.06948685646057129,
-0.10639958083629608,
-0.04077838733792305,
-0.05314218997955322,
-0.008865153416991234,
-0.03843991830945015,
0.06195554509758949,
0.05778469145298004,
0.016489338129758835,
-0.05355563014745712,
0.06329178810119629,
-0.10203630477190018,
-0.08001995831727982,
0.015300456434488297,
-0.21837249398231506,
-0.15855245292186737,
-0.0013544366229325533,
0.098411425948143,
-0.009847268462181091,
0.07313767820596695,
-0.026330608874559402,
0.002529021818190813,
0.0512058399617672,
-0.02254209853708744,
-0.08412361890077591,
-0.05053820461034775,
0.08374538272619247,
-0.12101057171821594,
0.15873092412948608,
-0.04786095768213272,
0.06509470194578171,
0.11886525899171829,
0.056582480669021606,
-0.05427362397313118,
0.06910575926303864,
0.028006749227643013,
-0.06546690315008163,
0.03308163210749626,
0.11980932950973511,
-0.026369506493210793,
0.017723845317959785,
0.04359740391373634,
-0.1343492567539215,
0.027524521574378014,
-0.08916577696800232,
-0.03198493272066116,
-0.03041919879615307,
-0.06280052661895752,
-0.055525146424770355,
0.1235368475317955,
0.22577065229415894,
-0.018427208065986633,
0.025300407782197,
-0.07841871678829193,
0.0009793315548449755,
0.037063006311655045,
0.048061471432447433,
-0.07799481600522995,
-0.25593101978302,
-0.018181730061769485,
0.08443313091993332,
-0.03702622279524803,
-0.2692044675350189,
-0.08047409355640411,
-0.0002270637487526983,
-0.06656427681446075,
-0.10088490694761276,
0.09264538437128067,
0.06895444542169571,
0.05112272500991821,
-0.048355817794799805,
-0.06219981610774994,
-0.07814238220453262,
0.16975730657577515,
-0.1554383635520935,
-0.08864512294530869
] |
null | null | sentence-transformers |
# Pyjay/sentence-transformers-multilingual-snli-v2-500k
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('Pyjay/sentence-transformers-multilingual-snli-v2-500k')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('Pyjay/sentence-transformers-multilingual-snli-v2-500k')
model = AutoModel.from_pretrained('Pyjay/sentence-transformers-multilingual-snli-v2-500k')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=Pyjay/sentence-transformers-multilingual-snli-v2-500k)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 15604 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.SoftmaxLoss.SoftmaxLoss`
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 180 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"callback": null,
"epochs": 4,
"evaluation_steps": 1000,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'transformers.optimization.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 72,
"weight_decay": 0.01
}
```
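Put together, these parameters describe a sentence-transformers setup with a SoftmaxLoss objective over NLI-style pairs and a CosineSimilarityLoss objective over similarity-scored pairs. The sketch below is one plausible reconstruction; the base checkpoint, the toy examples, and the choice to pass both objectives to a single `fit()` call are assumptions rather than details recorded in this card:
```python
from sentence_transformers import SentenceTransformer, InputExample, losses
from torch.utils.data import DataLoader

# Assumed base checkpoint (the architecture section reports an XLMRobertaModel backbone)
model = SentenceTransformer("xlm-roberta-base")

# Hypothetical NLI-style examples: label in {0, 1, 2} (e.g. entailment / neutral / contradiction)
nli_examples = [InputExample(texts=["A man eats food.", "A man eats something."], label=0)]
nli_loader = DataLoader(nli_examples, shuffle=True, batch_size=32)
nli_loss = losses.SoftmaxLoss(
    model=model,
    sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
    num_labels=3,
)

# Hypothetical STS-style pairs: label is a similarity score in [0, 1]
sts_examples = [InputExample(texts=["A plane takes off.", "An airplane departs."], label=0.9)]
sts_loader = DataLoader(sts_examples, shuffle=True, batch_size=32)
sts_loss = losses.CosineSimilarityLoss(model=model)

model.fit(
    train_objectives=[(nli_loader, nli_loss), (sts_loader, sts_loss)],
    epochs=4,
    warmup_steps=72,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)
```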
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | {"tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers"], "pipeline_tag": "sentence-similarity"} | sentence-similarity | Pyjay/sentence-transformers-multilingual-snli-v2-500k | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#sentence-transformers #pytorch #xlm-roberta #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us
|
# Pyjay/sentence-transformers-multilingual-snli-v2-500k
This is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have sentence-transformers installed:
Then you can use the model like this:
## Usage (HuggingFace Transformers)
Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL
## Training
The model was trained with the parameters:
DataLoader:
'URL.dataloader.DataLoader' of length 15604 with parameters:
Loss:
'sentence_transformers.losses.SoftmaxLoss.SoftmaxLoss'
DataLoader:
'URL.dataloader.DataLoader' of length 180 with parameters:
Loss:
'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss'
Parameters of the fit()-Method:
## Full Model Architecture
## Citing & Authors
| [
"# Pyjay/sentence-transformers-multilingual-snli-v2-500k\r\n\r\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\r\n\r\nUsing this model becomes easy when you have sentence-transformers installed:\r\n\r\n\r\n\r\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\r\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\r\n\r\n\r\n\r\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\r\nThe model was trained with the parameters:\r\n\r\nDataLoader:\r\n\r\n'URL.dataloader.DataLoader' of length 15604 with parameters:\r\n\r\n\r\nLoss:\r\n\r\n'sentence_transformers.losses.SoftmaxLoss.SoftmaxLoss' \r\n\r\nDataLoader:\r\n\r\n'URL.dataloader.DataLoader' of length 180 with parameters:\r\n\r\n\r\nLoss:\r\n\r\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \r\n\r\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] | [
"TAGS\n#sentence-transformers #pytorch #xlm-roberta #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n",
"# Pyjay/sentence-transformers-multilingual-snli-v2-500k\r\n\r\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\r\n\r\nUsing this model becomes easy when you have sentence-transformers installed:\r\n\r\n\r\n\r\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\r\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\r\n\r\n\r\n\r\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\r\nThe model was trained with the parameters:\r\n\r\nDataLoader:\r\n\r\n'URL.dataloader.DataLoader' of length 15604 with parameters:\r\n\r\n\r\nLoss:\r\n\r\n'sentence_transformers.losses.SoftmaxLoss.SoftmaxLoss' \r\n\r\nDataLoader:\r\n\r\n'URL.dataloader.DataLoader' of length 180 with parameters:\r\n\r\n\r\nLoss:\r\n\r\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \r\n\r\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] | [
46,
66,
38,
64,
29,
123,
5,
6
] | [
"passage: TAGS\n#sentence-transformers #pytorch #xlm-roberta #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n# Pyjay/sentence-transformers-multilingual-snli-v2-500k\r\n\r\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\r\n\r\nUsing this model becomes easy when you have sentence-transformers installed:\r\n\r\n\r\n\r\nThen you can use the model like this:## Usage (HuggingFace Transformers)\r\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.## Evaluation Results\r\n\r\n\r\n\r\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\r\nThe model was trained with the parameters:\r\n\r\nDataLoader:\r\n\r\n'URL.dataloader.DataLoader' of length 15604 with parameters:\r\n\r\n\r\nLoss:\r\n\r\n'sentence_transformers.losses.SoftmaxLoss.SoftmaxLoss' \r\n\r\nDataLoader:\r\n\r\n'URL.dataloader.DataLoader' of length 180 with parameters:\r\n\r\n\r\nLoss:\r\n\r\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \r\n\r\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors"
] | [
-0.045807141810655594,
0.1795015186071396,
-0.00797197688370943,
0.036948252469301224,
0.13978636264801025,
0.01548740454018116,
0.12077317386865616,
0.0963602066040039,
-0.03194922208786011,
0.09185347706079483,
0.003910420928150415,
0.11470472067594528,
0.01892167702317238,
0.057666074484586716,
0.02253360114991665,
-0.2431221902370453,
0.008571270853281021,
-0.04461986944079399,
0.01995900459587574,
0.07601102441549301,
0.10542041063308716,
-0.07648266106843948,
0.058544982224702835,
0.01907646283507347,
-0.051393233239650726,
0.011998108588159084,
-0.0022304286248981953,
-0.039935823529958725,
0.07239415496587753,
0.060446035116910934,
0.03477863594889641,
0.013757206499576569,
0.015725592151284218,
-0.18531782925128937,
0.012964857742190361,
0.07704642415046692,
-0.0074243140406906605,
0.06042930856347084,
0.07244224101305008,
-0.04823186248540878,
0.18241631984710693,
-0.10577882826328278,
0.04691322520375252,
0.05903349444270134,
-0.10195276886224747,
-0.10346619784832001,
-0.07161582261323929,
-0.013562140986323357,
0.13962137699127197,
0.0895596370100975,
-0.053315505385398865,
0.1014244481921196,
-0.056782547384500504,
0.09422728419303894,
0.11974696815013885,
-0.26396945118904114,
-0.037050534039735794,
0.03192618116736412,
0.02424044907093048,
0.017054788768291473,
-0.08880933374166489,
0.011317920871078968,
-0.03714415803551674,
0.03453078866004944,
0.09083791822195053,
-0.06153997778892517,
0.010748695582151413,
-0.0048359292559325695,
-0.10770529508590698,
-0.015218584798276424,
0.1606273353099823,
0.019127534702420235,
-0.0327589213848114,
-0.18478208780288696,
-0.054400116205215454,
0.04787840321660042,
-0.035111140459775925,
-0.034577567130327225,
0.02685006707906723,
0.02003820613026619,
-0.016870269551873207,
-0.07066476345062256,
-0.10654138028621674,
-0.004575985483825207,
-0.07230719923973083,
0.01853868179023266,
-0.019425401464104652,
-0.02835048921406269,
0.004881246015429497,
0.07119843363761902,
-0.061486586928367615,
-0.11214998364448547,
-0.03319799527525902,
-0.008503745310008526,
-0.13792960345745087,
-0.02988051064312458,
-0.053970858454704285,
-0.1530318707227707,
0.047904834151268005,
0.11143355816602707,
0.06316951662302017,
0.01599586009979248,
0.004003904294222593,
0.0322607159614563,
0.019554441794753075,
0.1844371259212494,
-0.03207864239811897,
-0.10332705825567245,
-0.028947921469807625,
0.013828063383698463,
0.0062956660985946655,
-0.01190867368131876,
-0.015346026979386806,
-0.014652339741587639,
0.05784111097455025,
0.06915581971406937,
0.06508538872003555,
0.03698330745100975,
-0.06286993622779846,
-0.031088387593626976,
0.06985408812761307,
-0.13244172930717468,
0.029387565329670906,
0.010543892160058022,
-0.0595804862678051,
0.06355764716863632,
0.08152944594621658,
-0.028628725558519363,
-0.10020297765731812,
0.06030017137527466,
-0.07431209832429886,
-0.0054940287955105305,
-0.06946562975645065,
-0.121434785425663,
0.006575934123247862,
-0.021633116528391838,
-0.052263323217630386,
-0.08768980950117111,
-0.15758906304836273,
-0.09783368557691574,
0.030960610136389732,
-0.025824163109064102,
-0.020725609734654427,
-0.09051370620727539,
0.0003409182245377451,
0.018922457471489906,
-0.01845295913517475,
-0.03323163092136383,
0.0020838105119764805,
0.02396460995078087,
-0.03772919997572899,
0.05988326668739319,
0.06853163987398148,
0.04888499155640602,
-0.11557803303003311,
0.02339591458439827,
-0.12604740262031555,
0.15489402413368225,
-0.05399324372410774,
0.05125735327601433,
-0.14854571223258972,
-0.007317966781556606,
0.041714951395988464,
0.042988523840904236,
0.014007213525474072,
0.13613669574260712,
-0.1994290053844452,
-0.0645817294716835,
0.12063103914260864,
-0.06207435205578804,
-0.07309537380933762,
0.09900974482297897,
-0.007536659017205238,
0.09139835089445114,
0.12787993252277374,
0.0875687524676323,
0.14521098136901855,
-0.03905578330159187,
-0.040226954966783524,
0.01996505632996559,
-0.04752464219927788,
0.090494804084301,
0.05184917896986008,
-0.06815598160028458,
0.13678334653377533,
0.012300646863877773,
-0.04687391594052315,
0.008335638791322708,
-0.0024779660161584616,
-0.05863431841135025,
0.01248246431350708,
-0.03699031099677086,
0.06214532256126404,
-0.013484970666468143,
0.005288133397698402,
0.015358425676822662,
-0.09368207305669785,
0.10421661287546158,
0.07216812670230865,
-0.058428678661584854,
0.008324282243847847,
-0.0797886922955513,
0.0038132029585540295,
-0.005435451399534941,
0.02527809329330921,
-0.2116437405347824,
-0.12632179260253906,
0.007259981706738472,
-0.07073095440864563,
0.0774429589509964,
0.042318880558013916,
0.05492222309112549,
0.03157602250576019,
-0.016433486714959145,
-0.015993179753422737,
0.02342466078698635,
-0.02165684476494789,
-0.07016237825155258,
-0.08096463978290558,
-0.00933773536235094,
-0.00884983316063881,
0.07813495397567749,
-0.11290793865919113,
0.025417273864150047,
0.04576694220304489,
0.05747297406196594,
0.048224929720163345,
-0.04591488838195801,
0.020421985536813736,
-0.009773923084139824,
0.007166564930230379,
-0.03391357138752937,
0.04239282011985779,
0.01578127220273018,
-0.10490798950195312,
0.08575519919395447,
-0.17838111519813538,
-0.09571428596973419,
0.0732465535402298,
0.05157525837421417,
-0.05070037394762039,
-0.0435590036213398,
-0.006562595255672932,
-0.008438581600785255,
-0.05524249002337456,
-0.04603033885359764,
0.13529108464717865,
0.09511581808328629,
0.1099705696105957,
-0.026027023792266846,
-0.012425475753843784,
-0.04020676016807556,
-0.03670733794569969,
-0.02484091743826866,
0.10969170928001404,
-0.04549787566065788,
-0.10368158668279648,
0.05549415946006775,
0.06596637517213821,
-0.07635796815156937,
0.09465450793504715,
-0.02050742879509926,
-0.07671141624450684,
-0.0634491890668869,
0.025630971416831017,
0.050671402364969254,
-0.0196845680475235,
-0.054880157113075256,
0.011484496295452118,
0.07477220892906189,
0.0053898547776043415,
0.006358148530125618,
-0.07269666343927383,
0.04633507877588272,
0.04927965998649597,
-0.007608589716255665,
0.0732247605919838,
0.04419482499361038,
0.012983654625713825,
0.05438731610774994,
0.0025474063586443663,
0.05585244670510292,
-0.03876366466283798,
-0.05558100342750549,
-0.10708904266357422,
0.16612902283668518,
-0.12490958720445633,
-0.2407865822315216,
-0.17029882967472076,
0.006030617281794548,
-0.07843243330717087,
0.004439042415469885,
0.0645872950553894,
-0.04705003276467323,
-0.07797985523939133,
-0.056061796844005585,
0.07495638728141785,
0.08841543644666672,
-0.03844131901860237,
-0.021163741126656532,
0.03406757861375809,
0.019193172454833984,
-0.10963106155395508,
-0.007227632682770491,
0.006507229059934616,
-0.08113843947649002,
0.01760435290634632,
0.024222824722528458,
0.07914680987596512,
0.09117047488689423,
0.03380206599831581,
-0.011259512975811958,
0.0006466629565693438,
0.21108189225196838,
-0.08786235749721527,
0.045046884566545486,
0.21824824810028076,
0.024389760568737984,
0.07026346772909164,
0.05931427329778671,
0.03402084857225418,
-0.035656414926052094,
0.04469219967722893,
0.05768461897969246,
-0.027477910742163658,
-0.16136474907398224,
-0.10556013882160187,
-0.061113953590393066,
0.014242667704820633,
0.12125054001808167,
0.029318705201148987,
-0.01773432269692421,
0.05365881696343422,
-0.015843188390135765,
-0.001417763764038682,
0.059774719178676605,
0.09296388924121857,
0.1388329267501831,
0.0018321577226743102,
0.10285826772451401,
-0.059029970318078995,
-0.03713052719831467,
0.07075858116149902,
-0.02309955470263958,
0.14181257784366608,
-0.018562957644462585,
0.15261150896549225,
0.0693662092089653,
-0.005640802904963493,
-0.00905690062791109,
0.08935665339231491,
-0.02631722018122673,
0.010001792572438717,
0.0021758803632110357,
-0.10257815569639206,
-0.05087321624159813,
0.06844653189182281,
0.0631113052368164,
-0.023183319717645645,
-0.028310179710388184,
0.05997628718614578,
0.10343824326992035,
0.1769140511751175,
0.05560411140322685,
-0.22817787528038025,
-0.05261906608939171,
0.023888852447271347,
-0.06519784033298492,
-0.058103978633880615,
-0.022700119763612747,
0.06364942342042923,
-0.14882364869117737,
0.06390876322984695,
-0.03292671591043472,
0.0903690829873085,
-0.09260864555835724,
0.013796824030578136,
-0.03163507953286171,
0.057320430874824524,
0.008349612355232239,
0.07624716311693192,
-0.23536157608032227,
0.0626872330904007,
0.029630083590745926,
0.049993738532066345,
-0.05129816755652428,
0.037010859698057175,
0.05980158597230911,
-0.0021216098684817553,
0.14984260499477386,
-0.01965738832950592,
0.007584547623991966,
-0.011529299430549145,
-0.06897559016942978,
-0.011395125649869442,
0.042535338550806046,
-0.09691943228244781,
0.09864870458841324,
-0.027678169310092926,
-0.03386952728033066,
-0.038344044238328934,
0.00364338094368577,
-0.03853849321603775,
-0.17344263195991516,
0.020820314064621925,
0.020509840920567513,
0.04549445956945419,
-0.029485924169421196,
-0.016486763954162598,
-0.013850771822035313,
0.20354154706001282,
-0.10563478618860245,
-0.08574996143579483,
-0.12113489210605621,
-0.007943897508084774,
0.11180225014686584,
-0.10341931134462357,
0.016068730503320694,
-0.004590904340147972,
0.148012176156044,
-0.02957993559539318,
-0.06252020597457886,
0.03540883958339691,
-0.06462903320789337,
-0.07995029538869858,
-0.029200803488492966,
0.09755130857229233,
0.04215272516012192,
0.04841609671711922,
0.02379986084997654,
0.05686212703585625,
-0.03936254605650902,
-0.09625329077243805,
-0.06049632653594017,
0.128820538520813,
-0.011535088531672955,
0.09863238036632538,
-0.0775776132941246,
-0.08527121692895889,
-0.09026044607162476,
0.0164567232131958,
0.1850949227809906,
0.16181227564811707,
-0.07042882591485977,
0.07682184875011444,
0.08856222778558731,
-0.07742007821798325,
-0.21727317571640015,
-0.05603669583797455,
0.06479351222515106,
0.051868610084056854,
0.08921881765127182,
-0.14106760919094086,
0.08658956736326218,
0.07237531244754791,
-0.008222207427024841,
-0.03103754296898842,
-0.27513986825942993,
-0.13456016778945923,
0.09303600341081619,
0.029978850856423378,
-0.07538745552301407,
-0.11821242421865463,
-0.058731935918331146,
-0.07032109797000885,
-0.044856734573841095,
0.10152255743741989,
-0.060340940952301025,
0.08109984546899796,
0.020709365606307983,
0.051398273557424545,
0.06770969182252884,
-0.007444295566529036,
0.12893244624137878,
0.052862558513879776,
0.03353577479720116,
-0.02958887256681919,
0.020025527104735374,
0.09077790379524231,
-0.07914568483829498,
0.14616023004055023,
-0.05702284723520279,
0.03862041234970093,
-0.1284557431936264,
-0.022628575563430786,
-0.05548200011253357,
0.025709595531225204,
-0.04573248699307442,
-0.033040713518857956,
-0.006936643738299608,
0.04200967028737068,
0.0992036759853363,
-0.00363155547529459,
0.004540158901363611,
-0.07156533002853394,
0.028203772380948067,
0.2015722692012787,
0.10285806655883789,
0.06152820959687233,
-0.20251399278640747,
-0.0009793138597160578,
0.011886775493621826,
0.0342218317091465,
-0.11503719538450241,
0.07085001468658447,
0.08434470742940903,
0.013269509188830853,
0.13511572778224945,
0.002516763051971793,
-0.07607857137918472,
0.004441153258085251,
0.07577195018529892,
-0.07589578628540039,
-0.16125468909740448,
-0.035562314093112946,
-0.004995758645236492,
-0.13315516710281372,
-0.0483115054666996,
0.13960863649845123,
0.0018763747066259384,
0.004257849883288145,
0.03299960866570473,
0.049778081476688385,
-0.017373602837324142,
0.14783425629138947,
-0.02450527437031269,
0.05963068827986717,
-0.07714813202619553,
0.083279088139534,
0.0976436510682106,
-0.09792029857635498,
-0.007078791037201881,
0.13949309289455414,
-0.07223789393901825,
-0.079999178647995,
-0.0610634945333004,
0.05065971240401268,
-0.08660320937633514,
0.009569558314979076,
-0.05655648559331894,
-0.06942151486873627,
0.01917664334177971,
0.005497660953551531,
0.058497052639722824,
0.05336466804146767,
-0.07105857133865356,
-0.042423125356435776,
-0.07264868915081024,
0.08412549644708633,
0.09219179302453995,
0.017167897894978523,
-0.050354965031147,
0.11534532904624939,
-0.018050484359264374,
0.021652353927493095,
-0.013604346662759781,
-0.04737177491188049,
-0.0464310348033905,
0.0033043466974049807,
-0.044340137392282486,
-0.009820409119129181,
-0.11123080551624298,
-0.00970494095236063,
0.021789994090795517,
0.04642336815595627,
-0.006513717118650675,
-0.005887248553335667,
-0.05376201868057251,
-0.0790521502494812,
-0.04883655160665512,
0.09913808107376099,
-0.13264551758766174,
-0.005409166216850281,
0.03343711420893669,
-0.08499427139759064,
0.08212415128946304,
-0.015599028207361698,
-0.020833374932408333,
0.03359484672546387,
-0.016122756525874138,
-0.04714597761631012,
0.021006472408771515,
0.03760474920272827,
0.04980630427598953,
-0.10479718446731567,
0.016307296231389046,
-0.02981029637157917,
0.014057830907404423,
0.008719409815967083,
0.033099062740802765,
-0.11691613495349884,
0.03861032426357269,
-0.012236862443387508,
0.016403507441282272,
-0.11176551133394241,
0.043868642300367355,
0.030189158394932747,
0.04893643409013748,
0.17646093666553497,
-0.060176484286785126,
0.08173082768917084,
-0.13321730494499207,
0.003658161498606205,
0.0042949458584189415,
-0.037014033645391464,
0.03382611274719238,
-0.08202625066041946,
0.05830807983875275,
-0.03422068804502487,
0.03568067401647568,
0.015982531011104584,
0.06825701147317886,
0.054426159709692,
0.030165359377861023,
-0.02758101560175419,
0.0036282665096223354,
0.033557113260030746,
0.05210595950484276,
-0.0034883595071733,
-0.010912302881479263,
0.03560280054807663,
-0.016834856942296028,
-0.01712481491267681,
0.07890848815441132,
0.08133004605770111,
0.02101965807378292,
0.08231578767299652,
0.042400166392326355,
-0.0008530059130862355,
-0.12164803594350815,
0.04541568458080292,
-0.06223945692181587,
0.0778387039899826,
-0.05064519867300987,
0.060302022844552994,
0.1341102421283722,
-0.15846441686153412,
0.11897554993629456,
0.035881899297237396,
-0.05042953044176102,
-0.08387557417154312,
-0.1397257000207901,
-0.06534267216920853,
-0.04524907097220421,
-0.018657371401786804,
-0.12017073482275009,
0.03241350129246712,
0.014473087154328823,
0.0011316112941130996,
-0.021072473376989365,
0.10378986597061157,
-0.12132126092910767,
-0.1087992861866951,
0.07992006838321686,
-0.031330615282058716,
0.05221893638372421,
0.04396310821175575,
0.05251530185341835,
0.008453560061752796,
0.05827678367495537,
0.07369983196258545,
0.05669115111231804,
0.034057240933179855,
0.03323429822921753,
-0.09293471276760101,
-0.080786794424057,
0.0017277889419347048,
0.009559172205626965,
-0.02530674636363983,
0.08818299323320389,
0.05346876010298729,
-0.06038724258542061,
-0.02555779553949833,
0.22354906797409058,
-0.08506869524717331,
-0.09902027249336243,
-0.1655590832233429,
0.1792737990617752,
0.05670293793082237,
0.019083984196186066,
-0.01954682730138302,
-0.09141231328248978,
-0.020040644332766533,
0.13040946424007416,
0.1297014355659485,
-0.05074173957109451,
0.016027476638555527,
0.01247051265090704,
0.01508291158825159,
-0.0174003466963768,
0.04001152142882347,
0.0460534505546093,
0.2378026247024536,
-0.04690604656934738,
0.11734256893396378,
-0.020323235541582108,
-0.05269887298345566,
-0.07786205410957336,
0.07105802744626999,
0.004446270409971476,
0.02702871896326542,
-0.00933486595749855,
0.11496338248252869,
-0.0353810153901577,
-0.08918317407369614,
-0.03357546031475067,
-0.09173248708248138,
-0.1277461051940918,
-0.024345703423023224,
0.0005878958618268371,
0.04145236313343048,
0.09582120180130005,
0.03576485067605972,
-0.021057454869151115,
0.12608014047145844,
-0.01944011077284813,
-0.06230758875608444,
-0.026354920119047165,
0.04269295185804367,
-0.04827747121453285,
0.13081349432468414,
-0.001595403766259551,
-0.00039341437513940036,
0.11337543278932571,
0.0051732310093939304,
-0.07525047659873962,
0.06925676763057709,
0.0304634477943182,
-0.09958904981613159,
0.13386055827140808,
0.09234180301427841,
-0.002179160015657544,
0.08920595794916153,
0.06176178157329559,
-0.12308269739151001,
0.039503488689661026,
-0.007380765397101641,
-0.006259525660425425,
-0.07091221213340759,
0.044815707951784134,
-0.06628501415252686,
0.11062680929899216,
0.1978217363357544,
-0.020046938210725784,
0.007761628832668066,
-0.017942093312740326,
0.015337340533733368,
0.012453011237084866,
0.06528524309396744,
-0.04351356625556946,
-0.08036080002784729,
0.021277040243148804,
-0.02209370583295822,
0.019746439531445503,
-0.2502197325229645,
-0.08301110565662384,
0.010971218347549438,
-0.03210006281733513,
-0.023151054978370667,
0.12130893021821976,
0.05928187444806099,
0.0006780339172109962,
-0.025705847889184952,
-0.20293456315994263,
0.023692190647125244,
0.11551711708307266,
-0.12133617699146271,
-0.08805753290653229
] |
null | null | transformers | This model was fine-tuned by Qichang Zheng (Pyke) from BART on a patent-abstract dataset (7 million records), with 'facebook/bart-base' providing both the tokenizer and the base model. The input is the same as the output: the patent abstract.
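The sketch below is illustrative only; the example abstract and the generation settings are assumptions, not details from the original description:
```python
from transformers import pipeline

# Text2text generation with the fine-tuned BART checkpoint
generator = pipeline("text2text-generation", model="Pyke/bart-finetuned-with-patent")

abstract = "A device for measuring fluid flow, comprising a sensor assembly and a control unit coupled to the sensor assembly."
print(generator(abstract, max_length=128)[0]["generated_text"])
```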
The model is fine-tuned to serve as a reference for the research Qichang is involved in. | {} | text2text-generation | Pyke/bart-finetuned-with-patent | [
"transformers",
"pytorch",
"bart",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us
| This model is finetuned by Qichang Zheng(Pyke) based on bart with patent abstract dataset(7 million records), with 'facebook/bart-base' being the tokenizer and original model. The input is the same as the output, which is the patent abstract.
This model is finetuned to serve as a reference to the research that Qichang is in. | [] | [
"TAGS\n#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
38
] | [
"passage: TAGS\n#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.0318886935710907,
0.0096189696341753,
-0.007606509141623974,
0.001847309060394764,
0.14772942662239075,
0.02160252071917057,
0.1334671527147293,
0.12192132323980331,
0.01626305654644966,
-0.028771573677659035,
0.14323928952217102,
0.1992919147014618,
-0.024837283417582512,
0.16441908478736877,
-0.08063437789678574,
-0.2598302364349365,
0.05283436179161072,
0.07620055228471756,
0.042803842574357986,
0.12526075541973114,
0.08328705281019211,
-0.0723603218793869,
0.0735759362578392,
-0.032824575901031494,
-0.1756834238767624,
0.0490216426551342,
0.03857412934303284,
-0.11679597198963165,
0.10073336213827133,
0.044240228831768036,
0.14631077647209167,
0.03982788696885109,
-0.06636208295822144,
-0.1283167600631714,
0.034403931349515915,
-0.012848195619881153,
-0.06484833359718323,
0.0399547815322876,
0.0953272208571434,
-0.10738255828619003,
0.0691714733839035,
0.07824554294347763,
-0.0009729358134791255,
0.051042620092630386,
-0.13444896042346954,
-0.04049830511212349,
-0.027359316125512123,
0.032211143523454666,
0.06613613665103912,
0.08172227442264557,
-0.002032114891335368,
0.12712892889976501,
-0.09982932358980179,
0.129965141415596,
0.14824458956718445,
-0.30388808250427246,
-0.012859301641583443,
0.046737879514694214,
0.08478476107120514,
0.05252804234623909,
-0.024114781990647316,
0.03457428514957428,
0.02039201185107231,
0.030345473438501358,
-0.011317359283566475,
-0.08197411149740219,
-0.12215276807546616,
0.02166357822716236,
-0.0591701939702034,
-0.05171586200594902,
0.2085941880941391,
-0.08335429430007935,
0.052767593413591385,
-0.035482101142406464,
-0.09834597259759903,
-0.04572887346148491,
-0.030292777344584465,
0.012228906154632568,
-0.07027875632047653,
0.06717630475759506,
-0.024450715631246567,
-0.051333602517843246,
-0.139779195189476,
0.009453840553760529,
-0.19736739993095398,
0.1813373863697052,
0.003591995220631361,
0.05767586827278137,
-0.22959688305854797,
0.07853192090988159,
0.032813530415296555,
-0.11851274222135544,
0.05080040171742439,
-0.09867561608552933,
0.05972042679786682,
0.0018865568563342094,
-0.07992928475141525,
-0.09396862983703613,
0.07693824172019958,
0.15202535688877106,
0.0570392943918705,
0.03992437571287155,
-0.050025515258312225,
0.08682024478912354,
0.005244566593319178,
0.08907615393400192,
0.0664292722940445,
-0.08503730595111847,
0.05343325808644295,
-0.12424403429031372,
0.024643665179610252,
-0.07267257571220398,
-0.1592642068862915,
-0.049884188920259476,
0.04776405170559883,
0.07991299778223038,
0.05279557406902313,
0.05907239764928818,
-0.05020331218838692,
-0.017758851870894432,
0.07987207174301147,
-0.07385040819644928,
0.012242065742611885,
0.0059492760337889194,
0.024120714515447617,
0.1218082532286644,
0.002459706272929907,
0.013701106421649456,
-0.09565173089504242,
0.10911376029253006,
-0.04307014122605324,
0.0018043185118585825,
-0.05350656807422638,
-0.05434543266892433,
0.03374440222978592,
-0.09654112905263901,
0.023122891783714294,
-0.16352982819080353,
-0.16724829375743866,
0.007726968731731176,
0.012249041348695755,
-0.003505572210997343,
-0.033027712255716324,
-0.03621775656938553,
-0.005919867195188999,
0.05795317515730858,
-0.08099211752414703,
-0.01666862703859806,
-0.042124539613723755,
0.10648948699235916,
-0.0036009550094604492,
0.08183425664901733,
-0.16650360822677612,
0.06636875867843628,
-0.11494749039411545,
-0.03601225093007088,
-0.09505820274353027,
0.035676173865795135,
0.0012118915328755975,
0.15507112443447113,
0.018306683748960495,
-0.009874354116618633,
-0.10775242745876312,
0.05634631961584091,
-0.018046220764517784,
0.19111143052577972,
-0.10143165290355682,
-0.11551348119974136,
0.2726460099220276,
-0.08581715822219849,
-0.15655820071697235,
0.0745999738574028,
0.0038844537921249866,
0.03687083348631859,
0.10288292169570923,
0.17752613127231598,
0.06705204397439957,
-0.009362038224935532,
0.09385570883750916,
0.10423172265291214,
-0.08946622908115387,
-0.12407493591308594,
-0.005046447739005089,
-0.010976677760481834,
-0.10818316042423248,
0.06363586336374283,
0.10392338782548904,
0.08065568655729294,
-0.05380159243941307,
-0.03353259712457657,
-0.03641233593225479,
-0.013897283934056759,
0.10549046844244003,
0.026039913296699524,
0.1269293576478958,
-0.0887337177991867,
-0.012849004939198494,
-0.014335026033222675,
-0.01335445512086153,
0.003660373855382204,
0.047355301678180695,
-0.0394231453537941,
0.10426464676856995,
-0.0038209620397537947,
0.03976859524846077,
-0.20150139927864075,
-0.06227380782365799,
-0.014103719033300877,
0.13805876672267914,
0.0026925376150757074,
0.11038415879011154,
0.07020298391580582,
-0.037029724568128586,
0.0012128711678087711,
-0.02271571010351181,
0.1580611914396286,
-0.01027847919613123,
-0.07842051237821579,
-0.048077233135700226,
0.057122793048620224,
-0.07903748750686646,
-0.004407473839819431,
-0.04450428858399391,
0.023696700111031532,
0.03759979456663132,
0.13724929094314575,
0.016397079452872276,
0.04251628741621971,
-0.035238903015851974,
0.039600640535354614,
-0.08605601638555527,
0.029575467109680176,
0.10157769173383713,
0.014959105290472507,
-0.07233147323131561,
0.2013004571199417,
-0.1762475073337555,
0.2476811707019806,
0.212754026055336,
-0.27035385370254517,
0.024685995653271675,
-0.04099436104297638,
-0.01907818578183651,
0.011665256693959236,
0.04634355753660202,
-0.016739649698138237,
0.04964808002114296,
0.017727717757225037,
0.19336023926734924,
-0.03373531624674797,
-0.04479131102561951,
-0.01514318399131298,
-0.0669543519616127,
-0.010299740359187126,
0.053888946771621704,
0.030631281435489655,
-0.13001108169555664,
0.18424159288406372,
0.2280101478099823,
0.025706658139824867,
0.18376773595809937,
0.022436637431383133,
-0.0006371058989316225,
0.07107949256896973,
-0.019524546340107918,
-0.0311858169734478,
-0.06731158494949341,
-0.18211530148983002,
-0.03696368262171745,
0.0802861750125885,
0.012168895453214645,
0.09167364984750748,
-0.11742155998945236,
-0.02911762334406376,
-0.006805825047194958,
0.01033635064959526,
-0.0067903995513916016,
0.09016279131174088,
0.07038231194019318,
0.10586395859718323,
-0.02393309772014618,
-0.022413793951272964,
0.11036060005426407,
0.012995772063732147,
-0.08900920301675797,
0.15894490480422974,
-0.1406673640012741,
-0.35423746705055237,
-0.18273937702178955,
-0.16229088604450226,
-0.02076569013297558,
0.0609489381313324,
0.13703471422195435,
-0.07584594935178757,
-0.0263918898999691,
0.026742594316601753,
0.02490844950079918,
-0.04625730961561203,
0.02991250529885292,
-0.05131559073925018,
0.052110932767391205,
-0.05489020794630051,
-0.07748328149318695,
-0.0574861541390419,
-0.010487782768905163,
-0.02249021828174591,
0.1595280021429062,
-0.12328781187534332,
0.08628269284963608,
0.14141078293323517,
0.007060518022626638,
0.06670382618904114,
-0.019857754930853844,
0.16306842863559723,
-0.07192041724920273,
-0.01979277841746807,
0.20871739089488983,
-0.06324824690818787,
0.08185254782438278,
0.13290759921073914,
0.0014225946506485343,
-0.0659416913986206,
0.036012422293424606,
-0.06399042159318924,
-0.09558235108852386,
-0.2123410850763321,
-0.11684432625770569,
-0.1203472763299942,
0.08290492743253708,
0.05139313265681267,
0.05275741592049599,
0.13240981101989746,
0.08159936964511871,
-0.01733388565480709,
0.026433371007442474,
0.004888464231044054,
0.09375131875276566,
0.19021064043045044,
-0.019578177481889725,
0.1585603803396225,
-0.07485145330429077,
-0.12854962050914764,
0.09895215928554535,
0.041452351957559586,
0.10840454697608948,
0.08608803153038025,
0.03231623023748398,
0.014189885929226875,
0.09416954219341278,
0.154267817735672,
0.14471131563186646,
0.04096266254782677,
-0.01844504289329052,
-0.014848168008029461,
-0.019450439140200615,
-0.07458235323429108,
0.03915075212717056,
0.035822778940200806,
-0.1387520581483841,
-0.05972644314169884,
-0.12952616810798645,
0.07954489439725876,
0.08016408234834671,
0.05154338851571083,
-0.21658846735954285,
0.010716202668845654,
0.09096989780664444,
-0.032574914395809174,
-0.10637634992599487,
0.05862889811396599,
-0.020082173869013786,
-0.14357052743434906,
0.08368008583784103,
-0.041197724640369415,
0.13136643171310425,
-0.016771353781223297,
0.09348223358392715,
-0.07598451524972916,
-0.11635787039995193,
0.04214917868375778,
0.1051286980509758,
-0.3333568274974823,
0.19576585292816162,
-0.005010900087654591,
-0.049015872180461884,
-0.09523550420999527,
-0.010616874322295189,
0.01414541807025671,
0.1286686807870865,
0.06345471739768982,
-0.005398744251579046,
-0.07144735753536224,
-0.12392700463533401,
-0.01792708784341812,
0.022568373009562492,
0.13874293863773346,
-0.025139471516013145,
0.005983670707792044,
-0.03845648840069771,
-0.028553906828165054,
-0.035857707262039185,
-0.016320165246725082,
-0.00006297017534961924,
-0.17832933366298676,
0.07639492303133011,
0.05771424248814583,
0.06800124049186707,
0.0010011186823248863,
-0.021595092490315437,
-0.03323802351951599,
0.2092742919921875,
-0.058046482503414154,
-0.07962445914745331,
-0.11384638398885727,
-0.0835738554596901,
0.04440158233046532,
-0.09057649224996567,
0.050487905740737915,
-0.08124570548534393,
0.033525500446558,
-0.08009763807058334,
-0.21084435284137726,
0.11020998656749725,
-0.10891846567392349,
-0.03099050186574459,
-0.06553018093109131,
0.1588260680437088,
-0.08069963753223419,
0.012498589232563972,
0.03211408853530884,
0.008073501288890839,
-0.12820586562156677,
-0.0717388316988945,
-0.03555352985858917,
-0.005153812002390623,
0.040916040539741516,
0.009281206876039505,
-0.06748723983764648,
-0.048664286732673645,
-0.019427035003900528,
-0.015754178166389465,
0.28569287061691284,
0.1475352942943573,
-0.0609462708234787,
0.18389250338077545,
0.13568007946014404,
-0.0724790170788765,
-0.31663528084754944,
-0.12009056657552719,
-0.09762241691350937,
-0.018755966797471046,
-0.02749086543917656,
-0.1484328955411911,
0.09650276601314545,
-0.027682725340127945,
-0.03106766566634178,
0.11427687108516693,
-0.2001187950372696,
-0.09288447350263596,
0.16899827122688293,
-0.01052873209118843,
0.3756139874458313,
-0.12983417510986328,
-0.11189896613359451,
-0.08148515969514847,
-0.1828407496213913,
0.13390463590621948,
-0.048098571598529816,
0.08482028543949127,
-0.03219360485672951,
0.1280236542224884,
0.04634714499115944,
-0.05130390822887421,
0.07594503462314606,
-0.0059575652703642845,
0.0011326826643198729,
-0.12049131095409393,
-0.037878088653087616,
0.045219022780656815,
-0.019460856914520264,
0.02688322775065899,
-0.06330608576536179,
0.017958227545022964,
-0.15472637116909027,
-0.035788290202617645,
-0.09085410833358765,
0.06023867800831795,
0.02375026047229767,
-0.034585386514663696,
0.030826816335320473,
-0.07676052302122116,
-0.01754079759120941,
0.01558644324541092,
0.21057024598121643,
-0.045633990317583084,
0.17369864881038666,
0.1272992342710495,
0.10232851654291153,
-0.15472714602947235,
0.053716737776994705,
-0.06897317618131638,
-0.07662619650363922,
0.05559123307466507,
-0.06246177479624748,
0.06266094744205475,
0.1154441237449646,
-0.051855891942977905,
0.04971490427851677,
0.09952472150325775,
0.02619350329041481,
-0.0035122428089380264,
0.16450707614421844,
-0.2513583302497864,
0.056844647973775864,
-0.06753088533878326,
0.03402099013328552,
0.061399027705192566,
0.05922277644276619,
0.1611248403787613,
0.05132662132382393,
-0.05297247692942619,
-0.02526140585541725,
-0.00987333245575428,
-0.041071340441703796,
0.06454507261514664,
0.03129083290696144,
0.028051014989614487,
-0.13582447171211243,
0.03623811900615692,
0.014572393149137497,
-0.16184279322624207,
-0.008318443782627583,
0.18180175125598907,
-0.13040250539779663,
-0.11827728897333145,
-0.0003556807932909578,
0.13719442486763,
-0.1603710651397705,
-0.04452987387776375,
-0.07251504808664322,
-0.10844884812831879,
0.07334499061107635,
0.18042348325252533,
0.08402121067047119,
0.08780218660831451,
-0.03176552429795265,
-0.019938340410590172,
-0.007511130999773741,
-0.012712632305920124,
0.04139583930373192,
0.05297689884901047,
-0.062334027141332626,
0.07110132277011871,
-0.02785586751997471,
0.13658872246742249,
-0.09310124069452286,
-0.05491911247372627,
-0.14652810990810394,
0.036501798778772354,
-0.14568524062633514,
-0.05431540310382843,
-0.09395479410886765,
-0.05915261059999466,
-0.0031735983211547136,
-0.04335375502705574,
-0.036842718720436096,
-0.05164634436368942,
-0.11949747055768967,
0.01925506442785263,
-0.05278397724032402,
0.007503626402467489,
-0.09032180160284042,
-0.013594781048595905,
0.09462499618530273,
-0.039998870342969894,
0.08794531226158142,
0.15195296704769135,
-0.0913335531949997,
0.09685277938842773,
-0.14502272009849548,
-0.11198843270540237,
0.09822298586368561,
0.024706006050109863,
0.05699537321925163,
0.10741657018661499,
0.013904707506299019,
0.10026929527521133,
0.05376213416457176,
0.05014893040060997,
0.0676761269569397,
-0.12273658812046051,
0.03846529871225357,
-0.027500825002789497,
-0.17917148768901825,
-0.05761365592479706,
-0.03858879581093788,
0.07174218446016312,
0.021226610988378525,
0.13927356898784637,
-0.04824436455965042,
0.11497675627470016,
-0.046819183975458145,
0.021025141701102257,
-0.005755160469561815,
-0.1860162913799286,
-0.06618314236402512,
-0.09151892364025116,
0.013906091451644897,
0.01864214614033699,
0.2196902483701706,
0.011389349587261677,
0.05455735698342323,
0.03266080841422081,
0.06021501496434212,
0.01324552483856678,
0.002735121175646782,
0.19415953755378723,
0.09189482778310776,
-0.05106740444898605,
-0.10222998261451721,
0.07806974649429321,
0.015882208943367004,
-0.01169833354651928,
0.13383661210536957,
0.056038159877061844,
-0.0000010217938779533142,
0.11240266263484955,
-0.0205028485506773,
0.06780295819044113,
-0.13827838003635406,
-0.22069905698299408,
-0.03221917152404785,
0.04884885624051094,
-0.011768092401325703,
0.11001694947481155,
0.1406608521938324,
-0.029341092333197594,
0.027563219889998436,
-0.030912354588508606,
-0.044107139110565186,
-0.17532260715961456,
-0.11427092552185059,
-0.09254711121320724,
-0.1073429211974144,
0.004966165870428085,
-0.07934264838695526,
0.053909074515104294,
0.05231036990880966,
0.03997195512056351,
-0.06067992001771927,
0.09453555941581726,
0.05611402541399002,
-0.08242598921060562,
0.05563623085618019,
-0.039030808955430984,
0.057145651429891586,
0.0187069084495306,
-0.01721620187163353,
-0.13299258053302765,
0.006979918107390404,
-0.02085346356034279,
0.06046636775135994,
-0.059781454503536224,
-0.0046447101049125195,
-0.1300465315580368,
-0.11646021157503128,
-0.040615372359752655,
0.05290810763835907,
-0.010315419174730778,
0.1543513685464859,
0.010002214461565018,
0.006278361659497023,
0.02952582575380802,
0.21361204981803894,
-0.08304668962955475,
-0.0997757539153099,
-0.027381369844079018,
0.1809464693069458,
0.06438671052455902,
0.09963551163673401,
-0.03316444158554077,
0.008969796821475029,
-0.09206332266330719,
0.34561264514923096,
0.2768198251724243,
-0.05752769485116005,
0.03757641464471817,
0.027008935809135437,
0.04348360002040863,
0.12249194085597992,
0.13782788813114166,
0.08802539855241776,
0.2891320586204529,
-0.06964948028326035,
-0.02117864601314068,
-0.014709130860865116,
-0.022582873702049255,
-0.12718559801578522,
0.0898301973938942,
0.0016211067559197545,
-0.06520198285579681,
-0.04728702828288078,
0.09117545187473297,
-0.19861075282096863,
0.15790048241615295,
-0.055438071489334106,
-0.19179017841815948,
-0.04403429105877876,
0.006371563300490379,
0.1934935450553894,
-0.006737882271409035,
0.08999232947826385,
-0.0039511388167738914,
-0.07897204160690308,
0.06476317346096039,
0.0007307027699425817,
-0.21401961147785187,
0.006611082702875137,
0.05997241288423538,
-0.15226763486862183,
-0.009286170825362206,
-0.02284272201359272,
0.0413786917924881,
0.0732390508055687,
0.07314230501651764,
-0.038510482758283615,
0.04344063252210617,
-0.002147699473425746,
-0.03637602925300598,
0.019572056829929352,
0.05892649292945862,
0.0019930254202336073,
-0.09937852621078491,
0.06121731922030449,
-0.1571817398071289,
0.051574014127254486,
-0.013654936105012894,
-0.015761863440275192,
-0.003717645537108183,
0.01800469495356083,
-0.046465542167425156,
0.06301630288362503,
0.06507781893014908,
-0.007565658539533615,
-0.017934875562787056,
-0.041166454553604126,
-0.026228422299027443,
0.003031977917999029,
-0.08298347890377045,
-0.11668974161148071,
-0.1263510286808014,
-0.11484553664922714,
0.14168350398540497,
0.008285743184387684,
-0.22244790196418762,
0.00014119630213826895,
-0.11391746252775192,
0.04563592001795769,
-0.1792900711297989,
0.10082010924816132,
0.0644376203417778,
-0.0009037127019837499,
0.005460427142679691,
-0.07608924806118011,
0.044657569378614426,
0.08494101464748383,
-0.11684994399547577,
-0.10534628480672836
] |
null | null | transformers |
Propaganda Techniques Analysis BERT
----
This is a BERT-based model that predicts propaganda techniques in
English-language news articles. The model is described in
[this paper](https://propaganda.qcri.org/papers/EMNLP_2019__Fine_Grained_Propaganda_Detection.pdf).
## Model description
Please find propaganda definition here:
https://propaganda.qcri.org/annotations/definitions.html
You can also try the model in action here: https://www.tanbih.org/prta
### How to use
```python
>>> import torch
>>> from transformers import BertTokenizerFast
>>> from .model import BertForTokenAndSequenceJointClassification
>>>
>>> tokenizer = BertTokenizerFast.from_pretrained('bert-base-cased')
>>> model = BertForTokenAndSequenceJointClassification.from_pretrained(
>>> "QCRI/PropagandaTechniquesAnalysis-en-BERT",
>>> revision="v0.1.0",
>>> )
>>>
>>> inputs = tokenizer.encode_plus("Hello, my dog is cute", return_tensors="pt")
>>> outputs = model(**inputs)
>>> sequence_class_index = torch.argmax(outputs.sequence_logits, dim=-1)
>>> sequence_class = model.sequence_tags[sequence_class_index[0]]
>>> token_class_index = torch.argmax(outputs.token_logits, dim=-1)
>>> tokens = tokenizer.convert_ids_to_tokens(inputs.input_ids[0][1:-1])
>>> tags = [model.token_tags[i] for i in token_class_index[0].tolist()[1:-1]]
```
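The token-level predictions can then be paired back with their tokens to inspect which fragments were flagged, continuing from the `sequence_class`, `tokens`, and `tags` variables defined above:
```python
>>> print(sequence_class)
>>> for token, tag in zip(tokens, tags):
...     print(f"{token}\t{tag}")
```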
### BibTeX entry and citation info
```bibtex
@inproceedings{da-san-martino-etal-2019-fine,
title = "Fine-Grained Analysis of Propaganda in News Article",
author = "Da San Martino, Giovanni and
Yu, Seunghak and
Barr{\'o}n-Cede{\~n}o, Alberto and
Petrov, Rostislav and
Nakov, Preslav",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D19-1565",
doi = "10.18653/v1/D19-1565",
pages = "5636--5646",
abstract = "Propaganda aims at influencing people{'}s mindset with the purpose of advancing a specific agenda. Previous work has addressed propaganda detection at document level, typically labelling all articles from a propagandistic news outlet as propaganda. Such noisy gold labels inevitably affect the quality of any learning system trained on them. A further issue with most existing systems is the lack of explainability. To overcome these limitations, we propose a novel task: performing fine-grained analysis of texts by detecting all fragments that contain propaganda techniques as well as their type. In particular, we create a corpus of news articles manually annotated at fragment level with eighteen propaganda techniques and propose a suitable evaluation measure. We further design a novel multi-granularity neural network, and we show that it outperforms several strong BERT-based baselines.",
}
```
| {"language": "en", "license": "MIT", "tags": ["propaganda", "bert"], "datasets": [], "metrics": [], "thumbnail": "https://pbs.twimg.com/profile_images/1092721745994440704/d6R-AHzj_400x400.jpg"} | null | QCRI/PropagandaTechniquesAnalysis-en-BERT | [
"transformers",
"pytorch",
"bert",
"propaganda",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #bert #propaganda #en #endpoints_compatible #has_space #region-us
|
Propaganda Techniques Analysis BERT
----
This model is a BERT based model to make predictions of propaganda techniques in
news articles in English. The model is described in
this paper.
## Model description
Please find propaganda definition here:
URL
You can also try the model in action here: URL
### How to use
### BibTeX entry and citation info
| [
"## Model description\n\nPlease find propaganda definition here:\nURL\n\nYou can also try the model in action here: URL",
"### How to use",
"### BibTeX entry and citation info"
] | [
"TAGS\n#transformers #pytorch #bert #propaganda #en #endpoints_compatible #has_space #region-us \n",
"## Model description\n\nPlease find propaganda definition here:\nURL\n\nYou can also try the model in action here: URL",
"### How to use",
"### BibTeX entry and citation info"
] | [
33,
21,
5,
11
] | [
"passage: TAGS\n#transformers #pytorch #bert #propaganda #en #endpoints_compatible #has_space #region-us \n## Model description\n\nPlease find propaganda definition here:\nURL\n\nYou can also try the model in action here: URL### How to use### BibTeX entry and citation info"
] | [
0.02404068224132061,
0.05809054151177406,
-0.0032792044803500175,
-0.06676968932151794,
0.038533709943294525,
0.02980879321694374,
0.16362974047660828,
0.02404295653104782,
0.10234043747186661,
-0.021297894418239594,
0.17476743459701538,
-0.010271008126437664,
0.0034545378293842077,
0.07022044062614441,
0.04443640634417534,
-0.2839575707912445,
0.04445207864046097,
0.095156729221344,
0.0037270192988216877,
0.13350224494934082,
0.03779754042625427,
-0.10363474488258362,
0.06445261090993881,
0.08805884420871735,
-0.08027485013008118,
0.04868725314736366,
-0.044932954013347626,
-0.07907094061374664,
0.11381060630083084,
-0.024238664656877518,
0.1346893012523651,
0.027201058343052864,
0.05443604663014412,
-0.12980477511882782,
0.043688978999853134,
0.00868130847811699,
-0.09026722609996796,
0.10677096247673035,
-0.02159445732831955,
-0.04684463143348694,
0.367392361164093,
0.0651099756360054,
-0.03636455535888672,
0.035937998443841934,
-0.15552251040935516,
0.029685594141483307,
0.043510448187589645,
0.20683948695659637,
-0.04419700801372528,
0.029191911220550537,
-0.020323090255260468,
0.13439100980758667,
-0.11105424910783768,
0.013026965782046318,
0.18409287929534912,
-0.08182878792285919,
0.002357826568186283,
0.0830838531255722,
0.08406323194503784,
0.07599060982465744,
-0.017453836277127266,
0.17055095732212067,
0.07439723610877991,
0.06931373476982117,
-0.06240420043468475,
-0.07520479708909988,
0.16509535908699036,
0.03458036854863167,
-0.11894045770168304,
-0.058761171996593475,
0.009801201522350311,
0.011562753468751907,
0.017607515677809715,
-0.021867938339710236,
-0.022320300340652466,
0.03403507173061371,
-0.026696451008319855,
-0.053307924419641495,
0.04133937880396843,
0.02019910328090191,
0.08109256625175476,
-0.1426040530204773,
-0.08318742364645004,
-0.04989401996135712,
-0.07836978137493134,
0.1695149540901184,
-0.02414153702557087,
0.04551751911640167,
-0.11203569918870926,
0.01749202236533165,
-0.0563785620033741,
-0.09157076478004456,
-0.025629552081227303,
-0.14644460380077362,
0.16003498435020447,
0.043794531375169754,
-0.09869307279586792,
0.07557836174964905,
0.09032057225704193,
-0.006413896568119526,
-0.010564730502665043,
-0.06969822943210602,
-0.024877391755580902,
0.13625305891036987,
0.1145389974117279,
0.0579415038228035,
-0.14802296459674835,
0.07340478152036667,
0.012440355494618416,
0.04717255011200905,
0.05512333661317825,
0.00569143844768405,
-0.19499529898166656,
-0.12677952647209167,
0.005160060245543718,
0.018052684143185616,
-0.014638770371675491,
0.00664796307682991,
-0.07011435180902481,
0.02880156971514225,
0.05773315206170082,
0.021412812173366547,
0.0014452319592237473,
-0.04181263968348503,
-0.013902071863412857,
0.0888669341802597,
0.09148286283016205,
0.030512608587741852,
0.030142562463879585,
0.020326562225818634,
-0.1215762197971344,
0.09820911288261414,
-0.06699788570404053,
-0.10527123510837555,
0.02323647029697895,
-0.09763448685407639,
0.07328448444604874,
-0.15581367909908295,
-0.03129320591688156,
0.027624433860182762,
0.027471549808979034,
-0.03808354213833809,
0.055426884442567825,
0.015190905891358852,
0.013544362038373947,
-0.013506650924682617,
-0.012089290656149387,
-0.041122883558273315,
-0.09959224611520767,
0.00563401635736227,
-0.08583004027605057,
0.13421863317489624,
-0.014424961060285568,
0.0037119430489838123,
-0.10150676965713501,
0.06574781984090805,
-0.09928366541862488,
0.08063304424285889,
-0.0959048792719841,
0.16798660159111023,
-0.015066372230648994,
-0.04335067421197891,
-0.11256762593984604,
0.08254672586917877,
0.05388127639889717,
0.26531264185905457,
-0.1340128481388092,
-0.07020217180252075,
0.16187213361263275,
-0.06459244340658188,
-0.12997561693191528,
-0.001890780869871378,
-0.043486371636390686,
0.1263020932674408,
0.03188936412334442,
0.09145698696374893,
0.032759830355644226,
-0.2508690655231476,
0.050334397703409195,
0.12060586363077164,
-0.09279462695121765,
0.06442895531654358,
0.017174478620290756,
0.006447388790547848,
-0.052854400128126144,
-0.04925929009914398,
0.034597303718328476,
0.08751251548528671,
-0.06978055834770203,
-0.026659032329916954,
0.023117711767554283,
-0.07030048966407776,
0.1605290025472641,
0.020418982952833176,
0.057524316012859344,
-0.011305559426546097,
-0.0818207636475563,
0.08191800117492676,
0.034414470195770264,
0.14191699028015137,
-0.00192946195602417,
-0.07410262525081635,
0.007995294407010078,
-0.18960227072238922,
0.010307786986231804,
-0.16364586353302002,
0.1047171875834465,
-0.11021095514297485,
0.1652638167142868,
0.15152651071548462,
0.2059401571750641,
0.03544820100069046,
-0.1024046465754509,
-0.0046420348808169365,
-0.020152658224105835,
-0.07736936211585999,
0.045288216322660446,
-0.0334029458463192,
-0.022058574482798576,
0.005789152346551418,
-0.04454334080219269,
-0.09874650835990906,
-0.13249467313289642,
0.005596923176199198,
0.10286425799131393,
0.04929191246628761,
-0.04231568053364754,
0.0877244770526886,
0.058980684727430344,
-0.0028301572892814875,
-0.07901446521282196,
-0.004466289654374123,
0.05796636641025543,
-0.03908589482307434,
-0.0797058492898941,
0.06947364658117294,
0.1461380124092102,
0.1500922590494156,
0.17738854885101318,
-0.20096242427825928,
-0.06060289591550827,
0.03737468644976616,
-0.031355589628219604,
0.06386254727840424,
0.10568995773792267,
-0.06720849871635437,
0.055392976850271225,
0.025139831006526947,
0.032791461795568466,
-0.07859624177217484,
0.015389123000204563,
0.01396841462701559,
-0.07748648524284363,
-0.0993824154138565,
0.06100296974182129,
0.16852954030036926,
-0.2700941264629364,
0.07394412159919739,
0.26087602972984314,
-0.020894255489110947,
0.18394576013088226,
0.06683595478534698,
-0.013323646038770676,
-0.024605628103017807,
-0.15893018245697021,
-0.044086355715990067,
0.09014781564474106,
-0.10415950417518616,
-0.027977313846349716,
0.01123962365090847,
-0.008022771216928959,
0.09605327993631363,
-0.11293191462755203,
-0.08318885415792465,
0.03267814964056015,
0.008087115362286568,
-0.08700203895568848,
0.03932662680745125,
-0.02128841169178486,
0.11520075798034668,
0.08375725895166397,
-0.05305376648902893,
0.029007818549871445,
0.008111669681966305,
-0.06265512108802795,
0.08185195177793503,
-0.06837578862905502,
-0.22869783639907837,
-0.09538227319717407,
-0.09785497933626175,
0.05036304146051407,
0.034417226910591125,
-0.009357006289064884,
-0.12119727581739426,
0.011540059931576252,
0.050228919833898544,
0.07588327676057816,
-0.07680833339691162,
0.009202827699482441,
0.09639438986778259,
0.08735319972038269,
-0.004607051145285368,
-0.1338251531124115,
-0.08837218582630157,
-0.06933583319187164,
0.00392555259168148,
-0.04381631314754486,
-0.10430966317653656,
0.08900827169418335,
0.07903845608234406,
-0.003968013450503349,
0.05783158540725708,
0.04956956207752228,
0.2725658714771271,
-0.10061261057853699,
0.02044108137488365,
0.07254786044359207,
0.007951107807457447,
0.035302095115184784,
0.10617801547050476,
0.05990283191204071,
-0.002696855226531625,
-0.04217705503106117,
-0.0359715111553669,
-0.11649077385663986,
-0.06552241742610931,
-0.1358553171157837,
-0.0064566656947135925,
-0.16051119565963745,
0.040432728826999664,
0.011177349835634232,
0.07440520823001862,
0.1277555376291275,
0.036592863500118256,
-0.018043074756860733,
-0.05376804620027542,
0.01406088937073946,
-0.01170544233173132,
-0.0175679549574852,
0.008810501545667648,
-0.052126556634902954,
-0.02844902127981186,
0.0572112537920475,
-0.006842184346169233,
0.11014273762702942,
0.07082070410251617,
-0.06652447581291199,
0.14335192739963531,
-0.05356216058135033,
0.13785813748836517,
0.1806974709033966,
-0.05501841753721237,
-0.057709746062755585,
0.025674335658550262,
-0.035476721823215485,
0.008942834101617336,
0.13669130206108093,
0.02908998355269432,
-0.010052940808236599,
-0.017239484935998917,
-0.25703075528144836,
0.07305184751749039,
-0.08138610422611237,
0.0812792107462883,
-0.2262231707572937,
-0.05115276575088501,
0.05607922375202179,
0.005236713215708733,
-0.027023665606975555,
0.0381685271859169,
0.05775478854775429,
-0.10606858879327774,
0.1265040636062622,
0.004201714415103197,
0.08623922616243362,
0.03199402242898941,
0.0355917364358902,
-0.11587034165859222,
-0.10244566202163696,
-0.05273645371198654,
0.057487983256578445,
-0.23661334812641144,
0.2777107357978821,
-0.0048861075192689896,
-0.0874347984790802,
-0.06194097548723221,
-0.07113354653120041,
0.09659724682569504,
0.2521769404411316,
0.11393732577562332,
0.019253158941864967,
0.16073539853096008,
-0.0847301185131073,
-0.02304268628358841,
-0.025549419224262238,
0.016288889572024345,
-0.04987170919775963,
0.04445398598909378,
0.033282507210969925,
-0.005813345778733492,
-0.04965744540095329,
0.1339344084262848,
-0.1789259910583496,
-0.11462267488241196,
-0.013343721628189087,
0.00380537542514503,
-0.031056497246026993,
-0.002056375378742814,
-0.08482999354600906,
0.015388455241918564,
0.10655809938907623,
-0.04802677407860756,
-0.09208071231842041,
-0.05510595068335533,
0.04006688669323921,
0.08401811867952347,
-0.04769586771726608,
0.04548902437090874,
-0.0678647980093956,
0.0564068965613842,
-0.11472172290086746,
-0.10922907292842865,
0.041595108807086945,
-0.0918857604265213,
-0.04287494346499443,
-0.023805133998394012,
0.10383405536413193,
0.09752815216779709,
0.047724563628435135,
0.0544714592397213,
0.04140102490782738,
-0.0954534038901329,
-0.11124879866838455,
0.049993790686130524,
-0.04871585592627525,
0.0500442311167717,
-0.008085167966783047,
-0.03408823162317276,
-0.0049418010748922825,
0.019384896382689476,
0.018487339839339256,
0.1156897246837616,
0.2208201289176941,
-0.04981112480163574,
0.11169750243425369,
0.05732187256217003,
-0.03868391737341881,
-0.21964409947395325,
0.0029544569551944733,
-0.06086497753858566,
0.02301662229001522,
0.059703145176172256,
-0.14629603922367096,
-0.10609197616577148,
-0.009552303701639175,
-0.05371579900383949,
0.06130576133728027,
-0.15173286199569702,
-0.08527350425720215,
0.1093977689743042,
-0.027691662311553955,
0.1750616729259491,
-0.06774090230464935,
-0.061304375529289246,
-0.049945514649152756,
-0.09955283999443054,
0.24146941304206848,
0.024268662557005882,
0.005592418368905783,
0.01733056642115116,
0.1598820686340332,
0.04785235971212387,
-0.013057769276201725,
0.18987971544265747,
-0.10067495703697205,
-0.03248852118849754,
-0.11695817112922668,
-0.30084410309791565,
0.1338101625442505,
-0.02223362773656845,
-0.0009560272446833551,
-0.02839553728699684,
-0.023277850821614265,
-0.13571326434612274,
-0.027572499588131905,
-0.14693982899188995,
0.02464335784316063,
-0.032316647469997406,
-0.0716058760881424,
-0.08181880414485931,
0.06141792982816696,
-0.07710926979780197,
0.03892486169934273,
0.21284157037734985,
-0.12277846783399582,
0.12009233981370926,
0.18586663901805878,
0.16595321893692017,
-0.12081008404493332,
0.1924583613872528,
0.0071349600329995155,
-0.030586477369070053,
0.09181836247444153,
-0.16982881724834442,
-0.053205706179142,
0.03414143994450569,
0.025848936289548874,
0.017532482743263245,
0.08235391974449158,
-0.08624553680419922,
0.03838832676410675,
0.15392333269119263,
-0.10963649302721024,
-0.18414755165576935,
-0.07498113065958023,
0.01808399334549904,
0.06835572421550751,
0.06173194199800491,
0.12989267706871033,
-0.039846085011959076,
-0.08824833482503891,
0.01220821961760521,
-0.0029438072815537453,
-0.17570842802524567,
-0.04139560088515282,
0.1891198456287384,
0.016307184472680092,
-0.010853100568056107,
-0.08194329589605331,
-0.025125205516815186,
-0.015230272896587849,
-0.021323880180716515,
0.08581801503896713,
-0.06617391854524612,
-0.12878219783306122,
-0.13903871178627014,
0.07835577428340912,
-0.06162373349070549,
-0.001344490796327591,
0.07485364377498627,
-0.038506895303726196,
0.012231675907969475,
0.16485647857189178,
0.09668754786252975,
-0.011555595323443413,
-0.05357099324464798,
-0.019282780587673187,
0.029631393030285835,
-0.06993406265974045,
-0.021508388221263885,
-0.06699183583259583,
-0.04499039798974991,
0.005626986734569073,
0.0112221147865057,
0.17731308937072754,
-0.12768599390983582,
-0.008344794623553753,
-0.1448773890733719,
0.07574726641178131,
-0.27142050862312317,
-0.02171642892062664,
-0.14365577697753906,
-0.04240390285849571,
-0.03966761380434036,
-0.09724410623311996,
-0.0957818478345871,
-0.03535228967666626,
-0.13990360498428345,
0.06844934821128845,
-0.005559669807553291,
0.05375884100794792,
-0.062116317451000214,
0.03833574801683426,
0.174875408411026,
0.0035164919681847095,
0.07147045433521271,
0.037527330219745636,
-0.07712528109550476,
0.05389079079031944,
-0.1398705393075943,
-0.019472187384963036,
0.11483088880777359,
-0.030587676912546158,
0.03940717875957489,
0.08979400247335434,
0.029575608670711517,
-0.004005106166005135,
-0.024546341970562935,
0.04505901038646698,
0.010945003479719162,
0.015598124824464321,
0.10358744114637375,
0.1582067757844925,
-0.0819825828075409,
0.008919011801481247,
0.039488986134529114,
0.05405454710125923,
0.043578267097473145,
0.003571760840713978,
0.012829680927097797,
0.039850443601608276,
-0.03494713082909584,
0.04061952605843544,
-0.092311792075634,
-0.10411152243614197,
0.10661669820547104,
-0.12732504308223724,
0.008198786526918411,
0.013798710890114307,
0.3344947099685669,
0.06307891756296158,
0.0352337621152401,
0.02719443291425705,
0.17190782725811005,
-0.04144526273012161,
0.02614075317978859,
0.12528805434703827,
0.0130632808431983,
-0.008807872422039509,
-0.13128796219825745,
0.016993004828691483,
-0.0009519474115222692,
-0.1726844757795334,
0.03884505480527878,
0.11911533772945404,
0.052631910890340805,
0.04950377345085144,
0.014206258580088615,
0.07785216718912125,
0.025511329993605614,
-0.17863506078720093,
0.04509992524981499,
0.02068803459405899,
-0.06829190254211426,
0.12344346940517426,
0.14122168719768524,
-0.08167414367198944,
0.04027066379785538,
0.009623752906918526,
0.021203365176916122,
-0.09344169497489929,
-0.17753848433494568,
0.002705678576603532,
-0.13414686918258667,
0.03508502244949341,
0.000505817006342113,
-0.03530890867114067,
0.10436492413282394,
0.0034935614094138145,
-0.042733196169137955,
0.13249032199382782,
0.03290876746177673,
-0.10045794397592545,
0.06085139513015747,
0.02693822793662548,
0.019090646877884865,
-0.14777623116970062,
0.01754976063966751,
-0.07688463479280472,
-0.005327656399458647,
-0.028943177312612534,
0.03399113565683365,
-0.14466170966625214,
-0.06399662792682648,
-0.09679147601127625,
-0.027766665443778038,
-0.009108573198318481,
0.0038189617916941643,
-0.010453673079609871,
-0.03786991164088249,
-0.003129425924271345,
-0.004939202684909105,
-0.014066376723349094,
0.1667269915342331,
-0.03608149290084839,
-0.10124112665653229,
-0.008810057304799557,
0.10626273602247238,
-0.13312885165214539,
0.13550487160682678,
-0.08839058876037598,
-0.025524333119392395,
-0.11591482162475586,
0.2711458206176758,
0.33878639340400696,
-0.25021860003471375,
0.05880655348300934,
-0.048438623547554016,
0.04982103779911995,
0.11938150972127914,
0.0021484699100255966,
0.051959626376628876,
0.26725396513938904,
-0.1286793351173401,
0.027850091457366943,
-0.06140502914786339,
-0.017141692340373993,
0.004475828725844622,
-0.04066090285778046,
0.013676872476935387,
-0.03710604086518288,
-0.12172506749629974,
0.07541082054376602,
-0.12642253935337067,
-0.017434827983379364,
0.017827868461608887,
-0.1633412092924118,
-0.05998261272907257,
0.013153393752872944,
-0.017455315217375755,
-0.013565469533205032,
0.0937826931476593,
-0.03483660891652107,
-0.12017408758401871,
-0.029877550899982452,
0.0030992142856121063,
-0.06522780656814575,
-0.05599122494459152,
0.1337004154920578,
0.05202193930745125,
0.0923381894826889,
-0.040334828197956085,
-0.044774118810892105,
0.10645637661218643,
0.011705394834280014,
-0.08609110862016678,
0.028961658477783203,
0.08927638828754425,
-0.03342263400554657,
-0.11814350634813309,
-0.07116081565618515,
0.0181263480335474,
-0.21519915759563446,
0.0694885179400444,
-0.21758906543254852,
0.10655233263969421,
-0.06119227409362793,
-0.029655668884515762,
-0.012223146855831146,
0.06544996798038483,
-0.10417578369379044,
0.036460474133491516,
0.023119205608963966,
-0.08160173147916794,
-0.03549475967884064,
0.020506642758846283,
0.08273991197347641,
0.08168288320302963,
0.02816755324602127,
-0.0962219387292862,
-0.0050356131978333,
-0.085517019033432,
0.06632056832313538,
-0.014799879863858223,
-0.1936732530593872,
-0.021077774465084076,
-0.04327160865068436,
0.0982925072312355,
-0.03930439054965973,
0.051913801580667496,
0.08421922475099564,
0.0024118563160300255,
-0.019117766991257668,
-0.11532608419656754,
0.0537104606628418,
0.028888586908578873,
-0.11127030849456787,
-0.04138414189219475
] |
null | null | transformers |
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 36769078
- CO2 Emissions (in grams): 23.42719853096565
## Validation Metrics
- Loss: 0.15959647297859192
- Accuracy: 0.9817757009345794
- Precision: 0.980411361410382
- Recall: 0.9813725490196078
- AUC: 0.9982379201680672
- F1: 0.9808917197452229
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/Qinghui/autonlp-fake-covid-news-36769078
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("Qinghui/autonlp-fake-covid-news-36769078", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("Qinghui/autonlp-fake-covid-news-36769078", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
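# Minimal sketch (an addition, not part of the original card) of reading off the
# prediction; the label names come from the model config rather than being hard-coded.
probs = outputs.logits.softmax(dim=-1)
predicted_id = probs.argmax(dim=-1).item()
print(model.config.id2label[predicted_id], probs.max().item())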
``` | {"language": "unk", "tags": "autonlp", "datasets": ["Qinghui/autonlp-data-fake-covid-news"], "widget": [{"text": "I love AutoNLP \ud83e\udd17"}], "co2_eq_emissions": 23.42719853096565} | text-classification | Qinghui/autonlp-fake-covid-news-36769078 | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"autonlp",
"unk",
"dataset:Qinghui/autonlp-data-fake-covid-news",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"unk"
] | TAGS
#transformers #pytorch #roberta #text-classification #autonlp #unk #dataset-Qinghui/autonlp-data-fake-covid-news #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us
|
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 36769078
- CO2 Emissions (in grams): 23.42719853096565
## Validation Metrics
- Loss: 0.15959647297859192
- Accuracy: 0.9817757009345794
- Precision: 0.980411361410382
- Recall: 0.9813725490196078
- AUC: 0.9982379201680672
- F1: 0.9808917197452229
## Usage
You can use cURL to access this model:
Or Python API:
| [
"# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 36769078\n- CO2 Emissions (in grams): 23.42719853096565",
"## Validation Metrics\n\n- Loss: 0.15959647297859192\n- Accuracy: 0.9817757009345794\n- Precision: 0.980411361410382\n- Recall: 0.9813725490196078\n- AUC: 0.9982379201680672\n- F1: 0.9808917197452229",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] | [
"TAGS\n#transformers #pytorch #roberta #text-classification #autonlp #unk #dataset-Qinghui/autonlp-data-fake-covid-news #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 36769078\n- CO2 Emissions (in grams): 23.42719853096565",
"## Validation Metrics\n\n- Loss: 0.15959647297859192\n- Accuracy: 0.9817757009345794\n- Precision: 0.980411361410382\n- Recall: 0.9813725490196078\n- AUC: 0.9982379201680672\n- F1: 0.9808917197452229",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] | [
73,
42,
79,
17
] | [
"passage: TAGS\n#transformers #pytorch #roberta #text-classification #autonlp #unk #dataset-Qinghui/autonlp-data-fake-covid-news #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 36769078\n- CO2 Emissions (in grams): 23.42719853096565## Validation Metrics\n\n- Loss: 0.15959647297859192\n- Accuracy: 0.9817757009345794\n- Precision: 0.980411361410382\n- Recall: 0.9813725490196078\n- AUC: 0.9982379201680672\n- F1: 0.9808917197452229## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] | [
-0.1685558259487152,
0.16381238400936127,
0.0001235377276316285,
0.056201279163360596,
0.10237830132246017,
0.0054432121105492115,
0.05481581389904022,
0.08567805588245392,
0.019713472574949265,
0.07865533977746964,
0.15727598965168,
0.17137975990772247,
0.04567375406622887,
0.14124222099781036,
-0.12217796593904495,
-0.1433172971010208,
0.0817541629076004,
0.0557309128344059,
0.1226399689912796,
0.12593591213226318,
0.07103701680898666,
-0.11621890217065811,
0.1442156732082367,
0.0669570043683052,
-0.1574873924255371,
-0.020574573427438736,
0.07826047390699387,
-0.11272621899843216,
0.08334740251302719,
0.09056981652975082,
0.15109792351722717,
0.01710687391459942,
0.1004972830414772,
-0.07358276844024658,
-0.015521641820669174,
-0.00995545368641615,
-0.019189415499567986,
0.09785088151693344,
0.052379436790943146,
-0.03428633138537407,
0.015843883156776428,
-0.01390621718019247,
0.07350561022758484,
0.04176056385040283,
-0.08866439014673233,
-0.032695211470127106,
-0.05201641470193863,
0.0555809922516346,
0.12962836027145386,
0.14005252718925476,
-0.017413198947906494,
0.29663097858428955,
-0.11436247825622559,
0.075532928109169,
0.04391488805413246,
-0.24406211078166962,
-0.027808716520667076,
0.12584155797958374,
-0.032161712646484375,
-0.08967862278223038,
-0.029826562851667404,
0.013938238844275475,
0.10482146590948105,
0.023320285603404045,
0.016041787341237068,
-0.04103509709239006,
-0.03663162887096405,
0.0010293921222910285,
-0.10956913977861404,
-0.06648578494787216,
0.18398559093475342,
0.04209040105342865,
-0.08911793678998947,
-0.037589751183986664,
-0.08093184232711792,
-0.13026097416877747,
-0.06596741825342178,
-0.06241273134946823,
-0.026981040835380554,
-0.04748673737049103,
-0.05421077832579613,
0.09260520339012146,
-0.11486071348190308,
-0.06385974586009979,
-0.1818695217370987,
0.10284204035997391,
0.01823589578270912,
0.05543883889913559,
-0.05204208195209503,
0.11055062711238861,
-0.09079611301422119,
-0.11087144166231155,
-0.028496280312538147,
-0.03702586516737938,
-0.05917854607105255,
-0.03756193071603775,
-0.027263877913355827,
0.08934330940246582,
0.010701409541070461,
0.15951615571975708,
0.005541559308767319,
0.02284999005496502,
0.06477433443069458,
-0.011258573271334171,
0.002499890746548772,
0.19073405861854553,
-0.14274504780769348,
0.0007405175128951669,
0.05182289704680443,
-0.0422823429107666,
0.03616009280085564,
-0.023699672892689705,
-0.06824108213186264,
-0.11560884118080139,
0.1205766499042511,
0.026070410385727882,
0.01610393635928631,
0.03599200397729874,
-0.08557181805372238,
-0.019081836566329002,
0.11185789108276367,
-0.045819103717803955,
0.002430316526442766,
-0.023550383746623993,
-0.1011592298746109,
0.09534105658531189,
0.07752461731433868,
0.05255448818206787,
-0.06328269839286804,
0.05463607609272003,
-0.1375080645084381,
0.01803063601255417,
-0.03140237182378769,
-0.09565524756908417,
0.04518904164433479,
-0.10153999924659729,
0.04313843324780464,
-0.19759215414524078,
-0.16554005444049835,
-0.024917369708418846,
-0.03330974653363228,
-0.06226695701479912,
-0.033382777124643326,
-0.03574998676776886,
-0.06349782645702362,
0.0421418733894825,
-0.01332738995552063,
-0.004893406294286251,
-0.04972384124994278,
0.035707246512174606,
0.035237591713666916,
0.03534139320254326,
-0.10385900735855103,
0.023093553259968758,
-0.11196660995483398,
0.0003635651373770088,
-0.07072386145591736,
0.03629807382822037,
-0.011577087454497814,
0.033705923706293106,
-0.13613027334213257,
-0.07317939400672913,
0.09186878055334091,
-0.03361768275499344,
0.08346707373857498,
0.1641440987586975,
-0.05647178366780281,
-0.011014138348400593,
0.05726596340537071,
-0.05464695394039154,
-0.10293401032686234,
0.10042676329612732,
-0.04642413556575775,
0.010415468364953995,
0.07134431600570679,
-0.040423862636089325,
0.13362745940685272,
-0.13104292750358582,
-0.053027570247650146,
0.025848722085356712,
-0.025626257061958313,
-0.08790181577205658,
0.061307668685913086,
0.02141072414815426,
-0.16733287274837494,
0.04640447720885277,
0.045932065695524216,
0.03036327473819256,
-0.07698294520378113,
-0.0985276997089386,
-0.05964718386530876,
-0.04795049503445625,
0.02078351005911827,
0.0005598742282018065,
0.08232101052999496,
-0.023974386975169182,
-0.08262845873832703,
-0.04901294410228729,
0.1471894234418869,
0.005628415383398533,
-0.03437742218375206,
-0.1591857522726059,
0.13460101187229156,
-0.18999576568603516,
-0.06504600495100021,
-0.1885906159877777,
-0.008536160923540592,
-0.00879251305013895,
0.013777872547507286,
-0.0342731773853302,
-0.024866797029972076,
0.04769635945558548,
0.024977387860417366,
0.01842813938856125,
-0.01683177426457405,
0.09717770665884018,
0.0018752055475488305,
-0.11520703136920929,
-0.06122303009033203,
0.007788578514009714,
-0.018192656338214874,
0.24262385070323944,
-0.1306576281785965,
-0.03243836760520935,
-0.04508180171251297,
0.10422330349683762,
-0.029099034145474434,
0.026614967733621597,
-0.006662886124104261,
0.03418273478746414,
-0.05079253017902374,
0.0040905955247581005,
0.026155754923820496,
-0.013238977640867233,
-0.12174281477928162,
0.04695015028119087,
-0.1750849485397339,
0.18172062933444977,
0.16667191684246063,
-0.04407649487257004,
-0.08571036159992218,
0.02690720185637474,
0.029988177120685577,
-0.018221022561192513,
-0.04567530378699303,
0.02063210867345333,
0.11064014583826065,
-0.0024501467123627663,
0.10544645041227341,
-0.09420982003211975,
-0.01237805001437664,
0.08461466431617737,
-0.08010091632604599,
-0.026348542422056198,
0.18150869011878967,
0.053440161049366,
-0.18912506103515625,
0.08295682072639465,
0.04092072322964668,
-0.09533660113811493,
0.013402157463133335,
0.024505972862243652,
-0.07807623594999313,
-0.03862367197871208,
-0.05273552983999252,
0.005679688882082701,
0.08266840130090714,
-0.025693736970424652,
0.06301017850637436,
0.09083107858896255,
-0.01923203654587269,
0.011406734585762024,
-0.14129137992858887,
-0.01008788961917162,
0.008157027885317802,
0.0137995770201087,
-0.09269391745328903,
0.0311262309551239,
0.03535572439432144,
0.14314298331737518,
0.018671968951821327,
-0.12504087388515472,
0.0374239981174469,
0.025486242026090622,
-0.14339739084243774,
0.24520394206047058,
-0.08042609691619873,
-0.21337014436721802,
-0.15356005728244781,
-0.09040406346321106,
-0.0468718521296978,
0.00447655888274312,
0.013945159502327442,
-0.05709189176559448,
-0.12369813024997711,
-0.04660073295235634,
-0.0870964303612709,
-0.01048264279961586,
-0.002139007905498147,
-0.021420132368803024,
-0.03757765516638756,
0.0458344966173172,
-0.0732845589518547,
-0.04538511857390404,
-0.03884027525782585,
-0.02660437300801277,
0.13118650019168854,
-0.07523618638515472,
0.11723146587610245,
0.1718500703573227,
-0.01325102336704731,
0.014914629980921745,
0.029596766456961632,
0.21890480816364288,
-0.025499004870653152,
-0.024645939469337463,
0.19380971789360046,
0.03367719426751137,
0.02521548420190811,
0.13468436896800995,
0.009594359435141087,
-0.06769877672195435,
-0.0025532077997922897,
-0.030847257003188133,
-0.035820942372083664,
-0.18935957551002502,
-0.17313382029533386,
0.011034952476620674,
-0.041436780244112015,
0.11418254673480988,
0.004616291727870703,
0.10501549392938614,
0.1657574623823166,
-0.005294010974466801,
0.03209434822201729,
-0.08529660105705261,
0.0848860964179039,
0.13867038488388062,
0.030175477266311646,
0.15564163029193878,
-0.0646822452545166,
-0.06396465748548508,
0.05738447606563568,
-0.05334457382559776,
0.070844866335392,
0.03966508060693741,
-0.04423009231686592,
-0.01901838928461075,
0.17108874022960663,
0.08464308083057404,
0.11610549688339233,
0.09207102656364441,
-0.052218854427337646,
0.02589336223900318,
-0.04642995819449425,
-0.1047496348619461,
0.04604574292898178,
0.06640693545341492,
0.04823824390769005,
-0.10652957111597061,
-0.00549812288954854,
-0.013013690710067749,
0.0629470944404602,
0.19416971504688263,
-0.4933422803878784,
-0.11069022119045258,
0.0024310804437845945,
-0.023033704608678818,
-0.1368325799703598,
-0.0007222809945233166,
-0.08206414431333542,
-0.17354658246040344,
0.030708851292729378,
-0.011594191193580627,
0.1052074134349823,
-0.004809732083231211,
-0.010323517955839634,
-0.09638109803199768,
0.019065072759985924,
-0.022756418213248253,
0.08918981999158859,
-0.24068091809749603,
0.24932609498500824,
0.046365901827812195,
0.046399008482694626,
-0.08738698065280914,
-0.01390210259705782,
0.018624095246195793,
0.080551378428936,
0.15342669188976288,
-0.0006774224457331002,
0.07220693677663803,
-0.2892662286758423,
-0.19068211317062378,
0.05968628451228142,
-0.03592142090201378,
0.017795715481042862,
0.09295665472745895,
0.025207700207829475,
-0.024535367265343666,
0.0010990815935656428,
-0.06407669186592102,
-0.10139265656471252,
-0.03136739879846573,
0.024109244346618652,
0.09633684158325195,
-0.027959395200014114,
0.004826897289603949,
-0.09473937749862671,
-0.06348428130149841,
0.0827966183423996,
-0.020505782216787338,
-0.08273874223232269,
-0.12290192395448685,
0.03599629923701286,
0.1308654397726059,
-0.11627399176359177,
0.04746779054403305,
-0.05527424439787865,
0.0702262818813324,
0.021321918815374374,
-0.11062277853488922,
0.09158802777528763,
-0.08141408860683441,
-0.08490972965955734,
0.023090017959475517,
0.09018819779157639,
0.06421196460723877,
0.048137933015823364,
0.07249417155981064,
0.04548490792512894,
-0.09812677651643753,
-0.11754447966814041,
-0.020105019211769104,
0.09056155383586884,
0.15286363661289215,
0.10577064752578735,
0.043176185339689255,
-0.1587880551815033,
-0.08592488616704941,
0.05988098680973053,
0.14108939468860626,
0.1984054297208786,
-0.09258514642715454,
-0.028248054906725883,
0.14787471294403076,
0.011806776747107506,
-0.20700092613697052,
-0.017690494656562805,
0.0010871029226109385,
0.08198259770870209,
-0.13702042400836945,
-0.006933329161256552,
0.11763087660074234,
0.0862981528043747,
-0.04632272571325302,
-0.024281786754727364,
-0.21176157891750336,
-0.1287827044725418,
0.2907413840293884,
0.04115832597017288,
0.18148255348205566,
-0.04786472022533417,
-0.01703665591776371,
-0.08799498528242111,
-0.2647145390510559,
0.1453835517168045,
-0.015047596767544746,
0.07359114289283752,
-0.054502055048942566,
0.13105155527591705,
0.0536612942814827,
-0.06812413781881332,
0.17405053973197937,
0.0036928532645106316,
0.0227663591504097,
-0.02809019573032856,
-0.05432701110839844,
-0.03407846763730049,
-0.05231449380517006,
0.15557026863098145,
0.08008304983377457,
0.09262482076883316,
-0.17136648297309875,
-0.0398896187543869,
-0.04840541258454323,
0.08910638839006424,
-0.029329992830753326,
-0.05293714255094528,
-0.03753910958766937,
-0.0009074126137420535,
-0.015938768163323402,
-0.07075575739145279,
0.02913287840783596,
-0.020364075899124146,
0.052484169602394104,
0.12258731573820114,
0.12519779801368713,
-0.06330655515193939,
0.0030413465574383736,
0.019956795498728752,
-0.08895912766456604,
0.10027050971984863,
-0.14829199016094208,
0.05643530189990997,
0.137201726436615,
-0.0005988344200886786,
0.08283410966396332,
0.03115028887987137,
-0.07436592876911163,
0.003523510415107012,
0.029298508539795876,
-0.16229787468910217,
0.06848174333572388,
-0.01144495140761137,
0.009182926267385483,
-0.044729188084602356,
0.04450138285756111,
0.1201215535402298,
-0.07713353633880615,
-0.03895972669124603,
0.002678799210116267,
0.011904161423444748,
-0.02105407416820526,
0.2221776247024536,
0.04179399460554123,
0.05092322826385498,
-0.14595070481300354,
0.045972634106874466,
0.02224348857998848,
-0.06289289891719818,
0.02755192294716835,
-0.05015093833208084,
-0.14481507241725922,
-0.08440037816762924,
-0.03802109137177467,
0.1307929903268814,
-0.2781497538089752,
-0.09541736543178558,
-0.04128482565283775,
-0.09343577176332474,
0.07088088244199753,
0.224111869931221,
0.12796469032764435,
0.03407346457242966,
-0.026309538632631302,
-0.12779958546161652,
-0.11402688920497894,
0.015290658921003342,
0.14436392486095428,
0.035547446459531784,
-0.13886111974716187,
0.11956318467855453,
-0.009821759536862373,
0.08607800304889679,
-0.05387214198708534,
-0.01758827082812786,
-0.1383884996175766,
0.02334456704556942,
-0.19476687908172607,
0.06656429171562195,
-0.08728726953268051,
0.010840863920748234,
-0.0035013158340007067,
-0.051930103451013565,
-0.07488124072551727,
0.026228928938508034,
-0.06253192573785782,
0.0012769087916240096,
0.02214074321091175,
0.016245342791080475,
-0.059610411524772644,
-0.04342716932296753,
0.06914958357810974,
-0.01250655297189951,
0.04553587734699249,
0.17004267871379852,
0.03225727751851082,
0.07532535493373871,
-0.0977231115102768,
-0.002010157098993659,
0.10092071443796158,
0.03611958026885986,
0.11351126432418823,
-0.16485723853111267,
0.06616050750017166,
0.0699872225522995,
0.0025349599309265614,
0.055119194090366364,
0.09793845564126968,
-0.09722451120615005,
-0.026389634236693382,
-0.05596195533871651,
-0.049612049013376236,
-0.1463763415813446,
0.010702826082706451,
0.11184268444776535,
0.0702485665678978,
0.11101029813289642,
-0.04267139360308647,
0.05579743534326553,
-0.11431790143251419,
0.0031752504874020815,
-0.08967311680316925,
-0.0572303831577301,
-0.03584933653473854,
-0.033165741711854935,
0.0791005939245224,
-0.009712755680084229,
0.10439971834421158,
-0.07780800759792328,
0.10995566099882126,
-0.007252143230289221,
0.08782303333282471,
0.041360221803188324,
-0.0179365873336792,
0.15099920332431793,
0.12019623816013336,
-0.006738651543855667,
0.035125359892845154,
0.09266101568937302,
0.07822370529174805,
-0.03164290264248848,
0.026102382689714432,
0.004991237539798021,
-0.010405373759567738,
0.15099869668483734,
0.019965901970863342,
-0.07322646677494049,
0.02712772972881794,
-0.06645430624485016,
-0.17751842737197876,
0.00785013847053051,
0.02685142494738102,
0.04532986879348755,
0.1417175531387329,
-0.08655713498592377,
-0.04201272875070572,
-0.05574284493923187,
-0.07296910881996155,
-0.19197101891040802,
-0.07650841772556305,
-0.14119654893875122,
-0.05676974356174469,
0.002377029974013567,
-0.08071279525756836,
-0.05121329054236412,
0.11800495535135269,
0.043674398213624954,
-0.0450056828558445,
0.07946331799030304,
-0.0709996148943901,
-0.029952557757496834,
0.0038862216752022505,
0.023220200091600418,
0.02870812453329563,
-0.01782439462840557,
-0.008263514377176762,
0.026492318138480186,
0.013433786109089851,
0.04821347817778587,
-0.008829406462609768,
0.029986055567860603,
0.10718588531017303,
-0.007078954018652439,
-0.09405801445245743,
-0.03711185231804848,
0.047864604741334915,
0.060201868414878845,
0.050065234303474426,
0.042434293776750565,
0.057971034198999405,
-0.018794924020767212,
0.22133086621761322,
-0.09272795915603638,
0.014999308623373508,
-0.1493390053510666,
0.31432172656059265,
-0.012304567731916904,
0.03142785653471947,
0.05814582481980324,
-0.05417720973491669,
0.009378170594573021,
0.1957596093416214,
0.11100134253501892,
-0.03502070531249046,
0.0005991543876007199,
-0.02044020965695381,
-0.013704821467399597,
-0.039038125425577164,
0.011280336417257786,
0.07517290115356445,
0.20624251663684845,
-0.11714604496955872,
0.006094764452427626,
-0.019512968137860298,
-0.004976282361894846,
-0.023377981036901474,
-0.00015341273683588952,
-0.017971428111195564,
-0.03807720169425011,
-0.06884992122650146,
0.08434952050447464,
-0.09511888772249222,
0.07895816117525101,
0.05830841138958931,
-0.07989077270030975,
-0.12651728093624115,
0.028149422258138657,
-0.08250056952238083,
0.00020917522488161922,
0.1050368994474411,
-0.09423910081386566,
0.001553711830638349,
0.045813869684934616,
0.016592737287282944,
-0.13559162616729736,
-0.07926519960165024,
0.05769823491573334,
0.19520722329616547,
0.16954849660396576,
0.042937733232975006,
0.1741182953119278,
0.13978973031044006,
0.03747917711734772,
-0.11644133180379868,
0.11688660830259323,
0.0252562053501606,
-0.07811266928911209,
0.12125039845705032,
0.012511193752288818,
0.0304050762206316,
0.02180001512169838,
0.05744108557701111,
-0.1500784158706665,
0.025691848248243332,
-0.1373053938150406,
0.06673432886600494,
-0.08591455221176147,
0.025389859452843666,
-0.07340498268604279,
0.12034249305725098,
0.10937380790710449,
-0.07513489574193954,
-0.025944460183382034,
-0.0260149072855711,
0.0863630473613739,
0.018361015245318413,
-0.11451556533575058,
0.012315315194427967,
-0.1310846358537674,
0.09015239775180817,
-0.0520760752260685,
0.023329168558120728,
-0.22177278995513916,
-0.014925656840205193,
-0.024487359449267387,
-0.08745843172073364,
-0.013652508147060871,
0.06871271878480911,
-0.002433168236166239,
0.045940157026052475,
-0.043327365070581436,
-0.03803606331348419,
0.005387534387409687,
0.11679268628358841,
-0.08437370508909225,
-0.17219750583171844
] |
null | null | transformers | # Punctuator for Uncased English
The model is fine-tuned with `DistilBertForTokenClassification` to add punctuation to plain, uncased English text.
## Usage
```python
from transformers import DistilBertForTokenClassification, DistilBertTokenizerFast
model = DistilBertForTokenClassification.from_pretrained("Qishuai/distilbert_punctuator_en")
tokenizer = DistilBertTokenizerFast.from_pretrained("Qishuai/distilbert_punctuator_en")
```
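
The loading snippet above stops short of actually tagging text; the following is a minimal inference sketch added for illustration. The example sentence, the `torch.no_grad()` wrapper, and reading label names from `model.config.id2label` are assumptions on top of the original card, not the authors' reference pipeline.
```python
import torch

text = "hello how are you today i am fine thank you"  # uncased, unpunctuated input
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, num_labels)

label_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label_id in zip(tokens, label_ids):
    # print each token together with its predicted punctuation label
    print(token, model.config.id2label[label_id.item()])
```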
## Model Overview
### Training data
A combination of the following three datasets:
- BBC News: articles from the BBC News website covering stories in five topical areas from 2004-2005. [Reference](https://www.kaggle.com/hgultekin/bbcnewsarchive)
- News articles: 20,000 short news articles scraped from the Hindu, Indian Times and Guardian between Feb 2017 and Aug 2017. [Reference](https://www.kaggle.com/sunnysai12345/news-summary?select=news_summary_more.csv)
- TED talks: transcripts of over 4,000 TED talks given between 2004 and 2019. [Reference](https://www.kaggle.com/miguelcorraljr/ted-ultimate-dataset)
### Model Performance
- Validation on 500 samples of a dataset scraped from the https://www.thenews.com.pk website. [Reference](https://www.kaggle.com/asad1m9a9h6mood/news-articles)
- Metrics Report:
| | precision | recall | f1-score | support |
|:--------------:|:---------:|:------:|:--------:|:-------:|
| COMMA | 0.66 | 0.55 | 0.60 | 7064 |
| EXLAMATIONMARK | 1.00 | 0.00 | 0.00 | 5 |
| PERIOD | 0.73 | 0.63 | 0.68 | 6573 |
| QUESTIONMARK | 0.54 | 0.41 | 0.47 | 17 |
| micro avg | 0.69 | 0.59 | 0.64 | 13659 |
| macro avg | 0.73 | 0.40 | 0.44 | 13659 |
| weighted avg | 0.69 | 0.59 | 0.64 | 13659 |
- Validation on 86 TED talks from 2020 that are not included in the training dataset. [Reference](https://www.kaggle.com/thegupta/ted-talk)
- Metrics Report:
| | precision | recall | f1-score | support |
|:--------------:|:---------:|:------:|:--------:|:-------:|
| COMMA | 0.71 | 0.56 | 0.63 | 10712 |
| EXLAMATIONMARK | 0.45 | 0.07 | 0.12 | 75 |
| PERIOD | 0.75 | 0.65 | 0.70 | 7921 |
| QUESTIONMARK | 0.73 | 0.67 | 0.70 | 827 |
| micro avg | 0.73 | 0.60 | 0.66 | 19535 |
| macro avg | 0.66 | 0.49 | 0.53 | 19535 |
| weighted avg | 0.73 | 0.60 | 0.66 | 19535 |
| {} | token-classification | Qishuai/distilbert_punctuator_en | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #distilbert #token-classification #autotrain_compatible #endpoints_compatible #has_space #region-us
| Punctuator for Uncased English
==============================
The model is fine-tuned based on 'DistilBertForTokenClassification' for adding punctuations to plain text (uncased English)
Usage
-----
Model Overview
--------------
### Training data
Combination of following three dataset:
* BBC news: From BBC news website corresponding to stories in five topical areas from 2004-2005. Reference
* News articles: 20000 samples of short news articles scraped from Hindu, Indian times and Guardian between Feb 2017 and Aug 2017 Reference
* Ted talks: transcripts of over 4,000 TED talks between 2004 and 2019 Reference
### Model Performance
* Validation with 500 samples of dataset scraped from URL website. Reference
* Metrics Report:
* Validation with 86 news ted talks of 2020 which are not included in training dataset Reference
* Metrics Report:
| [
"### Training data\n\n\nCombination of following three dataset:\n\n\n* BBC news: From BBC news website corresponding to stories in five topical areas from 2004-2005. Reference\n* News articles: 20000 samples of short news articles scraped from Hindu, Indian times and Guardian between Feb 2017 and Aug 2017 Reference\n* Ted talks: transcripts of over 4,000 TED talks between 2004 and 2019 Reference",
"### Model Performance\n\n\n* Validation with 500 samples of dataset scraped from URL website. Reference\n* Metrics Report:\n* Validation with 86 news ted talks of 2020 which are not included in training dataset Reference\n* Metrics Report:"
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #token-classification #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### Training data\n\n\nCombination of following three dataset:\n\n\n* BBC news: From BBC news website corresponding to stories in five topical areas from 2004-2005. Reference\n* News articles: 20000 samples of short news articles scraped from Hindu, Indian times and Guardian between Feb 2017 and Aug 2017 Reference\n* Ted talks: transcripts of over 4,000 TED talks between 2004 and 2019 Reference",
"### Model Performance\n\n\n* Validation with 500 samples of dataset scraped from URL website. Reference\n* Metrics Report:\n* Validation with 86 news ted talks of 2020 which are not included in training dataset Reference\n* Metrics Report:"
] | [
48,
84,
56
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #distilbert #token-classification #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training data\n\n\nCombination of following three dataset:\n\n\n* BBC news: From BBC news website corresponding to stories in five topical areas from 2004-2005. Reference\n* News articles: 20000 samples of short news articles scraped from Hindu, Indian times and Guardian between Feb 2017 and Aug 2017 Reference\n* Ted talks: transcripts of over 4,000 TED talks between 2004 and 2019 Reference### Model Performance\n\n\n* Validation with 500 samples of dataset scraped from URL website. Reference\n* Metrics Report:\n* Validation with 86 news ted talks of 2020 which are not included in training dataset Reference\n* Metrics Report:"
] | [
-0.13766977190971375,
0.13224855065345764,
0.0027009404730051756,
0.03881433606147766,
0.12099363654851913,
-0.006505096331238747,
0.09835421293973923,
0.08403422683477402,
-0.1038932204246521,
-0.013245031237602234,
0.12844446301460266,
0.11040006577968597,
-0.008928002789616585,
0.18581007421016693,
-0.042766883969306946,
-0.23250256478786469,
0.08358456939458847,
0.02399390935897827,
-0.04854973033070564,
0.10867345333099365,
0.11551533639431,
-0.17640216648578644,
0.023254521191120148,
-0.00011458571680122986,
-0.13269712030887604,
0.06727034598588943,
-0.00430267071351409,
-0.11291997134685516,
0.13014274835586548,
0.04298792779445648,
0.08722877502441406,
0.05916428565979004,
0.0438733734190464,
-0.10474096238613129,
0.04831365495920181,
0.034815020859241486,
0.0023608319461345673,
0.08046771585941315,
0.05576503276824951,
-0.012288819998502731,
0.08654564619064331,
0.0014436685014516115,
0.039492905139923096,
0.031392041593790054,
-0.12087443470954895,
-0.16983357071876526,
-0.11774764955043793,
0.07209128886461258,
0.08044777810573578,
0.1243208572268486,
-0.052201371639966965,
0.26958075165748596,
-0.15624815225601196,
0.09222806245088577,
0.061165716499090195,
-0.277441143989563,
-0.03302985429763794,
0.09710054099559784,
0.011040237732231617,
0.04640120267868042,
-0.11144021898508072,
0.04294174909591675,
0.08829446136951447,
0.004683770705014467,
0.02086169458925724,
-0.020124997943639755,
-0.29459282755851746,
0.030798165127635002,
-0.13729935884475708,
-0.022928478196263313,
0.19193460047245026,
0.009193265810608864,
-0.028692636638879776,
-0.1349385529756546,
-0.00748081412166357,
-0.1184016689658165,
-0.00038714558468200266,
-0.03483467549085617,
-0.0056600370444357395,
-0.05441499501466751,
-0.04557090252637863,
0.014085011556744576,
-0.14183978736400604,
-0.0012954962439835072,
-0.10060926526784897,
0.07645417749881744,
0.05106734856963158,
0.06267935782670975,
-0.07762803882360458,
0.09548360854387283,
0.07145640254020691,
-0.09529437124729156,
0.009904062375426292,
-0.05820576101541519,
-0.0024410421028733253,
-0.02472444623708725,
-0.03888870030641556,
-0.08944754302501678,
0.0047658393159508705,
0.03804037347435951,
-0.16945196688175201,
-0.037570662796497345,
0.011092065833508968,
0.01909785531461239,
0.05232773348689079,
0.12480349093675613,
-0.19423165917396545,
-0.04007960110902786,
0.04393015801906586,
0.06531956791877747,
0.044923536479473114,
0.02075149677693844,
-0.030481385067105293,
0.010361401364207268,
0.03483784943819046,
0.06947910040616989,
-0.073148213326931,
0.09272455424070358,
-0.04359184205532074,
0.01720675453543663,
0.06473728269338608,
-0.06986241042613983,
-0.06563727557659149,
-0.029056323692202568,
-0.04285595566034317,
0.028988484293222427,
0.027316806837916374,
0.10488373041152954,
0.030490465462207794,
0.014858786016702652,
-0.1313864141702652,
-0.006900979671627283,
-0.01990111917257309,
-0.08225224167108536,
0.016864154487848282,
-0.07722951471805573,
0.009047401137650013,
-0.12840275466442108,
-0.16090892255306244,
-0.04230184853076935,
-0.008641555905342102,
-0.023017311468720436,
0.014508594758808613,
-0.05750269070267677,
-0.051365356892347336,
-0.03964132070541382,
0.013575656339526176,
0.11995353549718857,
-0.06602270156145096,
0.07968710362911224,
-0.018567757681012154,
0.06477237492799759,
-0.06398549675941467,
0.04366908222436905,
-0.1938028782606125,
-0.04869557544589043,
-0.04467257484793663,
0.0218854658305645,
-0.0758683979511261,
0.09351762384176254,
-0.048112764954566956,
-0.05734879896044731,
-0.02700934186577797,
0.03650146350264549,
0.0484604611992836,
0.14040052890777588,
-0.10143368691205978,
-0.04408973827958107,
0.14637699723243713,
-0.11551316827535629,
-0.0998838022351265,
0.09102941304445267,
-0.09287706017494202,
0.16812431812286377,
0.0676104947924614,
0.06155581399798393,
0.015156731940805912,
-0.08048155158758163,
-0.027210434898734093,
-0.06913807988166809,
-0.02100198343396187,
-0.010701688006520271,
0.045106709003448486,
0.016018571332097054,
-0.03541466221213341,
0.05306333303451538,
0.09478595852851868,
-0.007366551086306572,
-0.10116467624902725,
-0.06838089972734451,
0.017436373978853226,
-0.10026394575834274,
0.13309720158576965,
0.04573003202676773,
0.144912987947464,
-0.09675686806440353,
-0.08102429658174515,
0.04989815130829811,
0.08730801194906235,
-0.006620905362069607,
-0.014248620718717575,
-0.048030778765678406,
0.1053948625922203,
-0.1830396056175232,
-0.04926549643278122,
-0.12311582267284393,
-0.0527639240026474,
0.0026725835632532835,
0.10843663662672043,
0.016035322099924088,
0.11727097630500793,
0.03489050269126892,
0.04252428188920021,
-0.04841972142457962,
-0.005678441375494003,
0.02247045747935772,
0.016178583726286888,
-0.12717214226722717,
-0.1000867486000061,
0.04504469409584999,
-0.06053047627210617,
0.12406537681818008,
-0.25694721937179565,
-0.008524920791387558,
-0.020588554441928864,
0.1922173798084259,
0.047543685883283615,
-0.0037784227170050144,
0.12778402864933014,
0.023770222440361977,
-0.025211255997419357,
0.008775436319410801,
0.0336090624332428,
-0.017051538452506065,
-0.028914349153637886,
0.13646233081817627,
-0.059598907828330994,
0.15239885449409485,
0.14088129997253418,
-0.11244270205497742,
-0.06318876892328262,
0.0517810620367527,
-0.06582504510879517,
-0.009898917749524117,
-0.11280673742294312,
-0.0007578914519399405,
0.07481186836957932,
-0.03675096854567528,
0.12420868128538132,
-0.06811167299747467,
-0.07821119576692581,
0.021776141598820686,
-0.03603243827819824,
-0.049638450145721436,
0.15098975598812103,
-0.006282791495323181,
-0.1686697006225586,
0.13668689131736755,
0.05207018181681633,
0.00980162899941206,
0.20185723900794983,
-0.027347197756171227,
-0.08850272744894028,
0.04624210298061371,
-0.08056607842445374,
-0.08939780294895172,
0.11805282533168793,
-0.10988695919513702,
0.024675462394952774,
0.061624202877283096,
0.005826367065310478,
0.04588098078966141,
-0.15483760833740234,
-0.04015616327524185,
-0.021566124632954597,
0.004572635050863028,
-0.11618047207593918,
0.12232328951358795,
0.025866718962788582,
0.12951317429542542,
-0.03781933709979057,
0.014392250217497349,
0.03560411185026169,
-0.011134011670947075,
-0.12507452070713043,
0.1622939556837082,
-0.08461326360702515,
-0.27017468214035034,
-0.12452034652233124,
0.05514541268348694,
-0.019444972276687622,
-0.007737994659692049,
0.060019299387931824,
-0.08797390013933182,
-0.07415498048067093,
-0.090025395154953,
0.0038796812295913696,
0.0068579306825995445,
0.000623681175056845,
-0.09380097687244415,
0.018452206626534462,
-0.03484068065881729,
-0.14441758394241333,
0.009272580035030842,
-0.07389489561319351,
0.025124566629529,
0.07299648970365524,
-0.012136954814195633,
0.057632602751255035,
0.04033401235938072,
-0.04627300798892975,
0.02685673162341118,
-0.06823226064443588,
0.20522452890872955,
-0.07637583464384079,
0.014313783496618271,
0.15363997220993042,
-0.023027315735816956,
0.002877025166526437,
0.1839054375886917,
0.008119706064462662,
-0.10237940400838852,
0.01938946731388569,
-0.03705735132098198,
-0.011012362316250801,
-0.25195246934890747,
-0.0535980686545372,
-0.04671681672334671,
-0.0449206680059433,
-0.006942226551473141,
0.006483024917542934,
0.050422362983226776,
0.08014023303985596,
-0.026887115091085434,
0.01517497282475233,
-0.05459695681929588,
0.012386542744934559,
0.11238748580217361,
-0.012973873876035213,
0.15423858165740967,
-0.09380380809307098,
-0.025194332003593445,
0.05173423886299133,
-0.13374947011470795,
0.19594568014144897,
-0.016976915299892426,
-0.12378159165382385,
0.08899832516908646,
0.10496842861175537,
0.06643953174352646,
0.07907194644212723,
-0.007516892161220312,
-0.038250986486673355,
0.015004569664597511,
-0.026490528136491776,
-0.025031287223100662,
0.07711037993431091,
0.015817200765013695,
0.027264917269349098,
-0.08562967926263809,
-0.12011270225048065,
0.02399873174726963,
0.17568150162696838,
0.12571744620800018,
-0.2724773585796356,
-0.11486869305372238,
0.004709496162831783,
-0.10057128965854645,
-0.057838451117277145,
0.05552026256918907,
0.08720928430557251,
-0.10759744793176651,
-0.02362617291510105,
-0.015051841735839844,
0.05370977893471718,
-0.014109612442553043,
0.0011928562307730317,
-0.06624068319797516,
-0.07308129221200943,
-0.07081657648086548,
0.09309153258800507,
-0.2239171713590622,
0.3306727707386017,
-0.00711068045347929,
0.054341915994882584,
-0.06847803294658661,
-0.0553230419754982,
0.0034040699247270823,
0.08631481230258942,
0.09817232936620712,
-0.01316652912646532,
-0.060743097215890884,
-0.14043277502059937,
-0.05035948008298874,
0.04122748225927353,
0.03906644880771637,
-0.019760310649871826,
0.11355967074632645,
-0.012451265938580036,
-0.0010337065905332565,
0.03062325529754162,
-0.06606727838516235,
-0.1322408765554428,
-0.04820854961872101,
0.03031999059021473,
0.08216089755296707,
0.04744811728596687,
-0.07110028713941574,
-0.16041070222854614,
-0.17584192752838135,
0.05038733780384064,
0.04421846196055412,
-0.07833434641361237,
-0.10799313336610794,
0.13638368248939514,
0.10587647557258606,
-0.022825224325060844,
0.06034937873482704,
0.014082049950957298,
0.04463433101773262,
0.033390846103429794,
-0.08510188013315201,
0.10373739898204803,
-0.07361910492181778,
-0.2131277322769165,
-0.06983330845832825,
0.18092109262943268,
0.02217090129852295,
0.038076065480709076,
-0.00400644401088357,
0.1040046364068985,
-0.013408835977315903,
-0.038139890879392624,
0.0031451599206775427,
0.06862366944551468,
0.13541831076145172,
0.06455749273300171,
-0.0735238790512085,
-0.17365871369838715,
-0.019598105922341347,
-0.08627867698669434,
0.19228781759738922,
0.1746920496225357,
-0.10713692009449005,
0.07538098841905594,
0.12383387237787247,
-0.02397049218416214,
-0.26928767561912537,
0.07691416889429092,
-0.022634103894233704,
0.07564602792263031,
-0.0014807789120823145,
-0.0007966240518726408,
0.1630970686674118,
0.07625721395015717,
-0.042393140494823456,
0.09934401512145996,
-0.14118137955665588,
-0.15300050377845764,
0.1206347644329071,
0.07673957943916321,
0.2811138331890106,
-0.07901701331138611,
-0.01942698284983635,
0.047876060009002686,
-0.11692537367343903,
0.12784108519554138,
-0.093899205327034,
0.054247528314590454,
-0.021577242761850357,
0.06842246651649475,
0.018977785483002663,
-0.06476708501577377,
0.06966859847307205,
-0.06871400028467178,
0.04744880273938179,
-0.03184929117560387,
-0.08408281952142715,
0.13572277128696442,
0.01762673445045948,
0.08960163593292236,
0.05105728656053543,
0.0736553817987442,
-0.10973901301622391,
-0.012626527808606625,
-0.07783009856939316,
0.04293421283364296,
-0.03391842171549797,
-0.08921555429697037,
-0.07139337062835693,
0.07109636068344116,
0.00874406099319458,
-0.042468536645174026,
0.1193406954407692,
-0.09075696021318436,
0.19762161374092102,
0.03819924592971802,
0.10994474589824677,
-0.03835257142782211,
0.06623262912034988,
-0.025441424921154976,
-0.08930334448814392,
0.037753429263830185,
-0.19306965172290802,
0.023769209161400795,
0.12576885521411896,
0.014727236703038216,
0.0761476680636406,
0.02993452362716198,
-0.04379235580563545,
0.036680854856967926,
0.06569893658161163,
-0.17006495594978333,
-0.048968356102705,
-0.04906174913048744,
-0.13076473772525787,
-0.06561834365129471,
0.09338721632957458,
0.10613010078668594,
-0.08025112748146057,
-0.037725117057561874,
-0.016432903707027435,
0.010953187942504883,
0.007735850289463997,
0.16255533695220947,
0.04856070131063461,
0.02912924624979496,
-0.10347983241081238,
0.08077598363161087,
0.07804863899946213,
-0.06683454662561417,
-0.012674343772232533,
0.0489119328558445,
-0.11963530629873276,
-0.0909687951207161,
-0.08832813054323196,
0.05212675407528877,
-0.004995307885110378,
-0.04729567468166351,
-0.12664635479450226,
-0.05188107118010521,
0.03163990378379822,
0.23400339484214783,
0.056640900671482086,
0.0500616654753685,
-0.03508109971880913,
-0.07127431035041809,
-0.0720396563410759,
0.14937357604503632,
0.08137588948011398,
0.046124398708343506,
-0.12570428848266602,
0.11421186476945877,
0.012455632910132408,
0.06459937989711761,
-0.0774327963590622,
-0.012095208279788494,
-0.07440625131130219,
0.019167479127645493,
-0.07962904870510101,
-0.04119551554322243,
-0.0026163184083998203,
-0.045488446950912476,
-0.04940207675099373,
-0.1427878588438034,
-0.07152420282363892,
0.02834191732108593,
-0.07844851911067963,
0.08373557776212692,
0.01574300415813923,
0.08314473927021027,
-0.0872575119137764,
-0.03247590363025665,
0.042197827249765396,
-0.003909395541995764,
0.09231150150299072,
0.05539027974009514,
-0.014372637495398521,
0.09558378159999847,
-0.11806897819042206,
0.0329681858420372,
0.04753752052783966,
0.0029841007199138403,
0.0682402029633522,
-0.021526042371988297,
-0.01702157035470009,
0.06593973189592361,
0.07949372380971909,
0.06993655115365982,
0.008922472596168518,
-0.1051390990614891,
-0.039425428956747055,
-0.03953103721141815,
0.002287397626787424,
-0.04219967499375343,
-0.03659770265221596,
0.052117157727479935,
0.030445627868175507,
0.20488159358501434,
-0.08606986701488495,
0.018065953627228737,
-0.10212133079767227,
0.008100150153040886,
-0.045451752841472626,
-0.11931216716766357,
-0.15600064396858215,
0.0020911837927997112,
0.07501547783613205,
-0.07334667444229126,
0.2815004289150238,
-0.04553459957242012,
0.03903699666261673,
0.04567539319396019,
0.04124423861503601,
0.04822985827922821,
0.019246641546487808,
0.160547137260437,
0.08318363130092621,
-0.013969819992780685,
-0.11878123134374619,
0.013774626888334751,
0.02315821871161461,
-0.11553497612476349,
0.09214159846305847,
0.1637452393770218,
0.02156304009258747,
0.0601455494761467,
0.004196106921881437,
-0.02392812818288803,
-0.03807102143764496,
-0.07343229651451111,
-0.07119204103946686,
0.01842549443244934,
-0.037008706480264664,
0.17404121160507202,
0.23521985113620758,
-0.1112554743885994,
0.02886425331234932,
-0.03333357349038124,
-0.02130427211523056,
-0.07286778092384338,
-0.06728648394346237,
-0.09194635599851608,
-0.11145972460508347,
0.023466523736715317,
-0.07467406988143921,
0.0031242717523127794,
0.11050235480070114,
0.07688413560390472,
-0.03270595893263817,
0.1652684211730957,
0.00389352859929204,
-0.0072808959521353245,
0.061579275876283646,
0.0005144384340383112,
0.025190286338329315,
-0.08326007425785065,
-0.01055080071091652,
0.03059368208050728,
0.01631765253841877,
0.018430594354867935,
-0.04820519685745239,
-0.08301153033971786,
0.0034246942959725857,
-0.06170005723834038,
-0.09385719895362854,
-0.012228326871991158,
0.06347250193357468,
0.01306913048028946,
-0.0038476199842989445,
0.08074761182069778,
0.06644441187381744,
-0.026238054037094116,
0.22254803776741028,
-0.06395707279443741,
-0.04176299273967743,
-0.1312708556652069,
0.1702234297990799,
0.06153112277388573,
0.030489282682538033,
0.016196388751268387,
-0.1370341032743454,
0.029631484299898148,
0.2900559902191162,
0.29269522428512573,
-0.0020764367654919624,
0.01796402782201767,
0.010242127813398838,
0.009389455430209637,
-0.06429986655712128,
0.011274575255811214,
0.07612147182226181,
0.14381125569343567,
-0.06398438662290573,
-0.07511916011571884,
-0.02577926218509674,
-0.014614079147577286,
-0.012191073037683964,
0.07127659767866135,
-0.01811295561492443,
-0.05381619185209274,
-0.04924282804131508,
0.09926837682723999,
-0.07070775330066681,
-0.1677362471818924,
-0.07203317433595657,
-0.16050788760185242,
-0.15884491801261902,
-0.04908716306090355,
-0.04355396330356598,
0.024141496047377586,
0.01974976621568203,
0.009642079472541809,
0.038587890565395355,
0.027670202776789665,
0.03047601506114006,
-0.042201925069093704,
-0.12045258283615112,
0.09857316315174103,
0.02285936288535595,
0.13242711126804352,
-0.00972227193415165,
0.0905727967619896,
0.09101562947034836,
0.03675561025738716,
-0.05272644758224487,
0.08584032207727432,
0.07295100390911102,
-0.04484694451093674,
0.015048722736537457,
0.06972772628068924,
0.03974542394280434,
0.09129516035318375,
0.05695374310016632,
-0.10556700080633163,
0.08652973920106888,
-0.17392875254154205,
-0.11462552100419998,
-0.07194177061319351,
0.0632614716887474,
0.018380582332611084,
0.1231134831905365,
0.15143713355064392,
-0.06440417468547821,
0.004620410967618227,
-0.05141986161470413,
0.04192345216870308,
0.05047626420855522,
-0.10383966565132141,
0.012943055480718613,
-0.23630104959011078,
0.06266094744205475,
-0.021445266902446747,
0.007136266212910414,
-0.26712146401405334,
0.023261375725269318,
-0.06803353130817413,
-0.03932271897792816,
0.003023012075573206,
0.05744839832186699,
0.1583227813243866,
0.0237725917249918,
0.018875321373343468,
-0.06475339084863663,
-0.019328784197568893,
0.10522951185703278,
-0.12529686093330383,
-0.08056601881980896
] |
null | null | transformers | # Punctuator for Simplified Chinese
The model is fine-tuned with `DistilBertForTokenClassification` to add punctuation to plain simplified Chinese text. It is fine-tuned from a distilled version of `bert-base-chinese`.
## Usage
```python
from transformers import DistilBertForTokenClassification, DistilBertTokenizerFast
model = DistilBertForTokenClassification.from_pretrained("Qishuai/distilbert_punctuator_zh")
tokenizer = DistilBertTokenizerFast.from_pretrained("Qishuai/distilbert_punctuator_zh")
```
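
The sketch below is an illustrative addition, not part of the original card: it shows one way to turn the token-level predictions back into punctuated text. The example sentence and the label-to-character mapping are assumptions based on the label names listed in the metrics report.
```python
import torch

# Assumed mapping from the label names in the metrics report to punctuation marks.
PUNCT = {"C_COMMA": ",", "C_DUNHAO": "、", "C_PERIOD": "。",
         "C_QUESTIONMARK": "?", "C_EXLAMATIONMARK": "!"}

text = "今天天气很好我们去公园散步吧"  # unpunctuated simplified Chinese
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    label_ids = model(**inputs).logits.argmax(dim=-1)[0]

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
pieces = []
for token, label_id in zip(tokens, label_ids):
    if token in ("[CLS]", "[SEP]"):  # skip special tokens
        continue
    # append the character followed by its predicted punctuation mark, if any
    pieces.append(token + PUNCT.get(model.config.id2label[label_id.item()], ""))
print("".join(pieces))
```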
## Model Overview
### Training data
The model is trained on the following dataset:
- News articles from People's Daily (2014). [Reference](https://github.com/InsaneLife/ChineseNLPCorpus)
### Model Performance
- Validation on the MSRA training set. [Reference](https://github.com/InsaneLife/ChineseNLPCorpus/tree/master/NER/MSRA)
- Metrics Report:
| | precision | recall | f1-score | support |
|:----------------:|:---------:|:------:|:--------:|:-------:|
| C_COMMA | 0.67 | 0.59 | 0.63 | 91566 |
| C_DUNHAO | 0.50 | 0.37 | 0.42 | 21013 |
| C_EXLAMATIONMARK | 0.23 | 0.06 | 0.09 | 399 |
| C_PERIOD | 0.84 | 0.99 | 0.91 | 44258 |
| C_QUESTIONMARK | 0.00 | 1.00 | 0.00 | 0 |
| micro avg | 0.71 | 0.67 | 0.69 | 157236 |
| macro avg | 0.45 | 0.60 | 0.41 | 157236 |
| weighted avg | 0.69 | 0.67 | 0.68 | 157236 |
| {} | token-classification | Qishuai/distilbert_punctuator_zh | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #distilbert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| Punctuator for Simplified Chinese
=================================
The model is fine-tuned based on 'DistilBertForTokenClassification' for adding punctuations to plain text (simplified Chinese). The model is fine-tuned based on distilled model 'bert-base-chinese'.
Usage
-----
Model Overview
--------------
### Training data
Combination of following three dataset:
* News articles of People's Daily 2014. Reference
### Model Performance
* Validation with MSRA training dataset. Reference
* Metrics Report:
| [
"### Training data\n\n\nCombination of following three dataset:\n\n\n* News articles of People's Daily 2014. Reference",
"### Model Performance\n\n\n* Validation with MSRA training dataset. Reference\n* Metrics Report:"
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training data\n\n\nCombination of following three dataset:\n\n\n* News articles of People's Daily 2014. Reference",
"### Model Performance\n\n\n* Validation with MSRA training dataset. Reference\n* Metrics Report:"
] | [
44,
23,
22
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #distilbert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n### Training data\n\n\nCombination of following three dataset:\n\n\n* News articles of People's Daily 2014. Reference### Model Performance\n\n\n* Validation with MSRA training dataset. Reference\n* Metrics Report:"
] | [
-0.1196964904665947,
0.10833539068698883,
-0.0011771655408665538,
0.021840320900082588,
0.24066433310508728,
0.03344782069325447,
0.06731154769659042,
0.0450604073703289,
-0.14201955497264862,
-0.0637652575969696,
0.15947268903255463,
0.18848659098148346,
-0.03034067340195179,
0.16603414714336395,
-0.0281695369631052,
-0.24676623940467834,
0.05325138941407204,
0.030265100300312042,
-0.08914073556661606,
0.1624341458082199,
0.11486916989088058,
-0.11017432063817978,
0.06247760355472565,
-0.014925414696335793,
-0.13477009534835815,
0.018927648663520813,
0.04647066444158554,
-0.09658224880695343,
0.14917825162410736,
-0.02050572820007801,
0.17390652000904083,
0.043791189789772034,
0.1016654521226883,
-0.10032414644956589,
0.027836108580231667,
0.02984277531504631,
0.04256061464548111,
0.08503233641386032,
0.056964341551065445,
-0.06566723436117172,
0.05787821486592293,
-0.001311472849920392,
0.074310801923275,
0.03877314180135727,
-0.10444553196430206,
-0.10372602194547653,
-0.09883345663547516,
0.000022707101379637606,
0.12890943884849548,
0.13131746649742126,
-0.012746913358569145,
0.21269574761390686,
-0.2768901586532593,
0.049057669937610626,
-0.04640176147222519,
-0.17596106231212616,
-0.055388957262039185,
0.12383370101451874,
-0.0036832159385085106,
-0.012822423130273819,
-0.07766152173280716,
0.016988104209303856,
0.040774036198854446,
0.04611721262335777,
0.06400836259126663,
-0.012838632799685001,
-0.07805527001619339,
0.0565006323158741,
-0.13528604805469513,
0.02728968858718872,
0.2604646384716034,
0.04540938884019852,
-0.004732940811663866,
-0.08036510646343231,
0.04155299812555313,
-0.01569940336048603,
-0.0024592236150056124,
-0.0829300656914711,
-0.03025742992758751,
-0.0485260896384716,
-0.16637876629829407,
0.0790947824716568,
-0.12017331272363663,
-0.07644034922122955,
-0.12596635520458221,
0.06943672895431519,
0.029525533318519592,
0.02012845315039158,
-0.021482551470398903,
0.13514050841331482,
0.12044652551412582,
-0.06724949181079865,
0.019577717408537865,
-0.12304502725601196,
-0.05904216691851616,
-0.09241140633821487,
-0.05201789736747742,
-0.07463502138853073,
0.0689229890704155,
0.0662713423371315,
0.005279097240418196,
0.0038524256087839603,
0.07010757923126221,
0.016713567078113556,
-0.03650466725230217,
0.09952298551797867,
-0.15389734506607056,
-0.02891433984041214,
-0.023404095321893692,
0.03203081712126732,
-0.055101584643125534,
0.049068812280893326,
-0.017196645960211754,
0.015825984999537468,
0.04930289462208748,
0.05766458436846733,
-0.13906697928905487,
0.09373138099908829,
-0.06676818430423737,
0.022464321926236153,
-0.0202511977404356,
-0.09528220444917679,
-0.0540156364440918,
-0.01985151134431362,
-0.08225329220294952,
0.0008585109608247876,
0.03229830786585808,
0.09615214914083481,
0.020256198942661285,
0.11005108058452606,
-0.08676350861787796,
-0.024117130786180496,
-0.06870444118976593,
-0.01022020261734724,
-0.018114512786269188,
-0.06766822189092636,
0.0423082634806633,
-0.13659434020519257,
-0.21636565029621124,
-0.03789130970835686,
0.033354707062244415,
-0.023067761212587357,
0.006592286750674248,
-0.07907532900571823,
-0.01898276060819626,
-0.014162284322082996,
-0.011432601138949394,
0.10642194747924805,
-0.030927816405892372,
0.10059937089681625,
-0.006582698319107294,
0.01763036474585533,
-0.00109051913022995,
0.03500887751579285,
-0.1202007457613945,
-0.03198347985744476,
-0.009597452357411385,
0.054279204457998276,
-0.05788910388946533,
0.13289876282215118,
-0.0664387196302414,
-0.06402890384197235,
-0.09812253713607788,
0.032187141478061676,
0.017331266775727272,
0.18152374029159546,
-0.07503389567136765,
-0.06008939817547798,
0.11069726943969727,
-0.061958197504282,
-0.1254543960094452,
0.06441310793161392,
-0.10251187533140182,
0.1984008550643921,
0.11080388724803925,
0.054822031408548355,
0.08088913559913635,
-0.06294411420822144,
0.028570983558893204,
-0.012687609530985355,
-0.08103940635919571,
-0.11515558511018753,
0.07625964283943176,
0.029841100797057152,
-0.026952162384986877,
0.06439802050590515,
0.019617389887571335,
0.11686684936285019,
-0.138154074549675,
-0.07790135592222214,
0.012354109436273575,
-0.12307994067668915,
0.05699935182929039,
0.06635741889476776,
0.16344134509563446,
-0.09407885372638702,
-0.04029737785458565,
0.06606033444404602,
0.08752506971359253,
0.013855884782969952,
-0.06057469919323921,
-0.11327454447746277,
-0.045000504702329636,
-0.08140131831169128,
-0.04833989217877388,
-0.12671403586864471,
-0.0425681434571743,
-0.0843440368771553,
0.15069080889225006,
-0.05398979410529137,
0.06206987053155899,
0.08437002450227737,
0.01665608026087284,
-0.06319983303546906,
-0.013515966944396496,
0.05504205450415611,
0.0485503263771534,
-0.027763785794377327,
-0.11568133533000946,
0.04538431391119957,
-0.05465099960565567,
0.18496869504451752,
-0.17419402301311493,
0.030314208939671516,
-0.023271363228559494,
0.2103855311870575,
0.036452196538448334,
-0.011467360891401768,
0.042785029858350754,
0.06294390559196472,
-0.001033303327858448,
0.00035534703056328,
0.05918559804558754,
0.0012136957375332713,
-0.07072632014751434,
0.08551492542028427,
0.003687950549647212,
0.247464120388031,
0.11409373581409454,
-0.09550713747739792,
-0.10048078745603561,
0.04368671029806137,
-0.05096680670976639,
-0.023215506225824356,
-0.14133386313915253,
0.0004684905579779297,
0.08297549188137054,
-0.017630187794566154,
0.14159922301769257,
0.02114167809486389,
-0.049760233610868454,
-0.021102655678987503,
-0.05253692343831062,
-0.011839380487799644,
0.061578962951898575,
0.021854067221283913,
-0.16434046626091003,
0.09655187278985977,
0.053736697882413864,
-0.02642211690545082,
0.18717613816261292,
-0.049048952758312225,
-0.05782514065504074,
0.01540454849600792,
-0.04716772958636284,
-0.04676874727010727,
0.08883308619260788,
-0.0053139799274504185,
0.02646862156689167,
0.03302016481757164,
0.07930095493793488,
0.03558642044663429,
-0.13282589614391327,
-0.08476654440164566,
0.003032810054719448,
0.032346416264772415,
-0.08273499459028244,
0.07494545727968216,
0.013547221198678017,
0.1192619726061821,
-0.023594781756401062,
-0.07052493840456009,
0.09785358607769012,
-0.014652161858975887,
-0.10826065391302109,
0.18196143209934235,
-0.049976203590631485,
-0.2089347243309021,
-0.14412200450897217,
-0.04097035527229309,
0.044615283608436584,
0.011776020750403404,
0.012093587778508663,
-0.14731131494045258,
-0.004913535434752703,
0.03665247559547424,
0.05554768815636635,
0.0553399957716465,
-0.008865817449986935,
-0.0230117030441761,
0.0848618820309639,
0.04027091711759567,
-0.08397485315799713,
-0.00398342264816165,
-0.06214753910899162,
-0.035732973366975784,
0.07095504552125931,
-0.08134734630584717,
0.10068405419588089,
0.09336169809103012,
0.009160241112112999,
0.05510259047150612,
-0.04349524900317192,
0.1340450644493103,
-0.08763670921325684,
-0.06025588884949684,
0.14604516327381134,
0.012620853260159492,
-0.033548690378665924,
0.11465950310230255,
0.031074171885848045,
-0.07684598118066788,
0.0072630541399121284,
0.016178159043192863,
-0.060153521597385406,
-0.21372666954994202,
-0.07985689491033554,
-0.040060609579086304,
-0.12314129620790482,
0.010444048792123795,
-0.0076878913678228855,
0.03873500972986221,
0.1076691597700119,
0.0748416855931282,
-0.09849077463150024,
-0.018751665949821472,
-0.00624513765797019,
0.02798137627542019,
-0.003516513155773282,
0.15294067561626434,
-0.10349833220243454,
-0.11917214840650558,
0.03296569734811783,
-0.04184446111321449,
0.21254794299602509,
-0.0171874538064003,
-0.14511898159980774,
0.07827018201351166,
0.08880235999822617,
0.02132803201675415,
0.10450837016105652,
0.07222721725702286,
-0.06345703452825546,
0.016141599044203758,
0.004210005048662424,
-0.06823442876338959,
0.05216435715556145,
0.01672462373971939,
-0.0042360094375908375,
-0.1483655869960785,
-0.024217898026108742,
0.07533381134271622,
0.20080053806304932,
0.0772872343659401,
-0.3325328528881073,
-0.04974469915032387,
0.013655075803399086,
-0.005467554088681936,
-0.02657240815460682,
0.06072049215435982,
0.023642614483833313,
-0.12450806051492691,
0.04149654880166054,
-0.058092571794986725,
0.08145938068628311,
-0.006000985857099295,
-0.0053366078063845634,
0.02249762788414955,
-0.03136292099952698,
-0.021979058161377907,
0.12141478806734085,
-0.24777880311012268,
0.30717727541923523,
-0.03334152698516846,
0.12358726561069489,
-0.04411277547478676,
-0.04579392820596695,
0.05505519360303879,
0.14377924799919128,
0.12140481173992157,
-0.03541979566216469,
-0.09839725494384766,
-0.15755151212215424,
-0.0418441966176033,
0.06559443473815918,
0.017057696357369423,
-0.0299017783254385,
0.023226002231240273,
-0.008984445594251156,
0.031491898000240326,
0.039674654603004456,
-0.09383301436901093,
-0.15479375422000885,
-0.07067100703716278,
-0.046336427330970764,
0.0501205213367939,
0.08974506705999374,
-0.08875680714845657,
-0.12305966764688492,
-0.05985233187675476,
0.07335352897644043,
0.03343683108687401,
-0.029063522815704346,
-0.1161995530128479,
0.07177378982305527,
0.04659184068441391,
-0.02076093852519989,
-0.0027797294314950705,
0.030901286751031876,
0.00966725591570139,
0.04806777089834213,
-0.04223120957612991,
0.045418672263622284,
-0.08806914836168289,
-0.0804627537727356,
-0.0819026455283165,
0.1271882951259613,
0.02809774875640869,
0.04121058061718941,
0.012603050097823143,
0.03775613382458687,
0.02043810673058033,
-0.05163367837667465,
0.019251681864261627,
-0.02525666542351246,
0.12415625900030136,
0.0744432881474495,
-0.1035931184887886,
-0.05614010617136955,
-0.03165041282773018,
-0.09030363708734512,
0.2183711975812912,
0.11013057082891464,
-0.11967934668064117,
0.04919065535068512,
0.11914705485105515,
-0.028901200741529465,
-0.2639460861682892,
0.02757466398179531,
0.046891678124666214,
0.06728942692279816,
-0.005154137499630451,
-0.10419957339763641,
0.12496940046548843,
0.11223745346069336,
-0.042190421372652054,
-0.11497261375188828,
-0.2351241558790207,
-0.09158062189817429,
0.18243327736854553,
0.09720330685377121,
0.20671208202838898,
-0.08766793459653854,
0.00642099604010582,
-0.014646768569946289,
-0.11281007528305054,
0.05185608193278313,
-0.07871603965759277,
0.09130048006772995,
-0.012716728262603283,
0.04590174928307533,
0.004462428856641054,
-0.08624523878097534,
0.13150273263454437,
0.002643963787704706,
0.049322132021188736,
-0.07861887663602829,
0.0014320388436317444,
0.05248506739735603,
-0.004054225981235504,
0.07069067656993866,
0.059602417051792145,
0.0569210983812809,
-0.13943138718605042,
-0.026499152183532715,
-0.0861063301563263,
0.020291130989789963,
0.02801484428346157,
-0.10094163566827774,
-0.018033629283308983,
0.08487902581691742,
0.02917787991464138,
-0.06921804696321487,
0.04975394532084465,
-0.0653538778424263,
0.18025615811347961,
-0.0475878044962883,
0.13029693067073822,
-0.07627160847187042,
0.004078310448676348,
-0.07488817721605301,
-0.08053582906723022,
0.04085247591137886,
-0.13548068702220917,
0.022809837013483047,
0.13205942511558533,
-0.0018506422638893127,
0.11921495944261551,
0.07470924407243729,
0.010637319646775723,
0.018691839650273323,
0.1606166511774063,
-0.1455022096633911,
0.04164823144674301,
-0.044414933770895004,
-0.04586214944720268,
-0.0824730321764946,
0.09265823662281036,
0.09233976155519485,
-0.029721174389123917,
-0.013956228271126747,
-0.025012468919157982,
-0.0027293595485389233,
-0.0995747447013855,
0.2787809669971466,
0.09703760594129562,
0.06943981349468231,
-0.09131389111280441,
0.033416181802749634,
0.009302851743996143,
-0.04976804181933403,
-0.045640427619218826,
-0.005268794484436512,
-0.12434302270412445,
-0.061849888414144516,
0.05010223016142845,
0.19097867608070374,
-0.044216688722372055,
-0.09789200127124786,
-0.1809832751750946,
-0.131153866648674,
0.07009949535131454,
0.13484029471874237,
0.13216538727283478,
0.11412614583969116,
-0.028532372787594795,
-0.11543019860982895,
-0.12131407111883163,
0.06960004568099976,
0.07424771040678024,
0.0020605584140866995,
-0.19690130650997162,
0.07717770338058472,
-0.010113484226167202,
0.0867062509059906,
-0.0893651470541954,
-0.035232629626989365,
-0.10912209004163742,
0.07336430251598358,
-0.08264251053333282,
-0.0832795798778534,
-0.015470942482352257,
-0.04782329872250557,
-0.007483220659196377,
-0.09340423345565796,
-0.08103810995817184,
0.010432426817715168,
-0.06973379105329514,
0.06593132764101028,
0.012577299028635025,
0.08265691995620728,
-0.022329166531562805,
-0.04506104066967964,
0.035242680460214615,
-0.022974589839577675,
0.038658034056425095,
0.04715714231133461,
-0.018109004944562912,
0.06014455482363701,
-0.11626238375902176,
-0.017812734469771385,
0.07545336335897446,
-0.052235882729291916,
0.03345846012234688,
-0.09632181376218796,
0.03805828094482422,
0.039448659867048264,
0.05247795581817627,
0.09028296172618866,
0.010333417914807796,
-0.09446888417005539,
0.029575997963547707,
-0.05826197937130928,
-0.03574897721409798,
-0.0938718244433403,
-0.036025095731019974,
0.09236142039299011,
0.08020144701004028,
0.22484926879405975,
-0.09187892079353333,
0.04560524970293045,
-0.12032150477170944,
-0.007719262968748808,
-0.03381359949707985,
-0.10914893448352814,
-0.09871793538331985,
-0.009891762398183346,
0.0722271129488945,
-0.037368107587099075,
0.25976040959358215,
-0.021593257784843445,
0.006374713033437729,
0.05115598067641258,
0.04948627948760986,
0.06311257928609848,
0.03390882536768913,
0.1359846293926239,
0.07047989964485168,
-0.020707368850708008,
-0.010188533924520016,
0.05148792639374733,
-0.02115180902183056,
-0.09585399925708771,
0.04524446651339531,
0.17923800647258759,
-0.021137557923793793,
0.0573924221098423,
0.032451462000608444,
-0.012183355167508125,
0.03134419396519661,
-0.07499610632658005,
-0.042428165674209595,
-0.05469655618071556,
-0.01066706981509924,
0.030056418851017952,
0.14890849590301514,
-0.1205979511141777,
0.008293062448501587,
-0.11911755800247192,
-0.05017280578613281,
-0.09888564050197601,
0.04181532561779022,
-0.08162190765142441,
-0.09435315430164337,
0.05762556195259094,
-0.04526267945766449,
-0.09045238047838211,
0.2107585221529007,
0.056528039276599884,
-0.014070943929255009,
0.15743087232112885,
-0.035142719745635986,
0.028158167377114296,
0.0034950091503560543,
-0.06709904968738556,
0.017690548673272133,
-0.1303148716688156,
-0.005876683164387941,
-0.02651030942797661,
0.020787976682186127,
0.01051587238907814,
-0.09550218284130096,
-0.08998095244169235,
-0.012500978074967861,
-0.06452231854200363,
-0.058125339448451996,
0.012911839410662651,
0.0501917339861393,
0.04009746015071869,
-0.016096575185656548,
0.040419623255729675,
0.003878582501783967,
0.007310613989830017,
0.26951029896736145,
-0.03312093764543533,
-0.14903181791305542,
-0.2271062433719635,
0.21910810470581055,
0.08669944107532501,
0.023839693516492844,
0.019546950235962868,
-0.051894307136535645,
0.03877320513129234,
0.24637718498706818,
0.293031245470047,
-0.04391777515411377,
-0.007189983036369085,
-0.01674542762339115,
-0.00966036319732666,
0.0011443988187238574,
0.07023081928491592,
0.06300829350948334,
0.09168373048305511,
-0.09516347944736481,
-0.028804291039705276,
-0.07104770839214325,
-0.02059352584183216,
0.032998133450746536,
0.11778750270605087,
0.060442160815000534,
-0.002278088591992855,
-0.07296153903007507,
0.1616470068693161,
-0.005881907418370247,
0.0104107316583395,
0.020164722576737404,
-0.13629604876041412,
-0.1486297845840454,
-0.04308534786105156,
0.061090972274541855,
0.0018273955211043358,
0.057424306869506836,
-0.025425687432289124,
-0.013249174691736698,
-0.029361629858613014,
0.031161582097411156,
-0.08660399168729782,
-0.1343328207731247,
0.06596730649471283,
0.13125868141651154,
0.14901483058929443,
-0.03872983902692795,
0.06799915432929993,
0.0931335836648941,
0.003129296936094761,
-0.06445155292749405,
0.11983518302440643,
0.010435805656015873,
0.013381639495491982,
0.05467434972524643,
0.06225152686238289,
-0.012226151302456856,
0.08363673090934753,
0.007311137393116951,
-0.22404012084007263,
0.09884395450353622,
-0.22115467488765717,
-0.09340789914131165,
-0.06013369932770729,
0.09498608857393265,
0.029241004958748817,
0.10887694358825684,
0.14320102334022522,
-0.03600771725177765,
0.0484192855656147,
-0.0909096896648407,
0.09606633335351944,
0.06805021315813065,
-0.11493369936943054,
-0.028512127697467804,
-0.2332720309495926,
-0.028661981225013733,
-0.04288791865110397,
-0.07541327178478241,
-0.2270471751689911,
-0.07349040359258652,
-0.07211479544639587,
-0.08829421550035477,
-0.015991801396012306,
0.10360193252563477,
0.09243454039096832,
0.08232651650905609,
-0.008947101421654224,
-0.0548706017434597,
0.0036029077600687742,
0.07025740295648575,
-0.127031609416008,
-0.0958137959241867
] |
null | null | transformers | Testing PPO-trainer
| {} | text2text-generation | QuickRead/PPO_training | [
"transformers",
"pytorch",
"pegasus",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #pegasus #text2text-generation #autotrain_compatible #endpoints_compatible #region-us
| Testing PPO-trainer
| [] | [
"TAGS\n#transformers #pytorch #pegasus #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
40
] | [
"passage: TAGS\n#transformers #pytorch #pegasus #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.027585696429014206,
0.007313489448279142,
-0.007987958379089832,
0.028276974335312843,
0.1623227447271347,
0.029489101842045784,
0.14388062059879303,
0.1242440864443779,
0.009850045666098595,
-0.03792005777359009,
0.13300150632858276,
0.18938398361206055,
-0.010199892334640026,
0.11770597100257874,
-0.06683412194252014,
-0.2868051826953888,
0.05056614801287651,
0.0602492094039917,
0.013369484804570675,
0.11579764634370804,
0.08074385672807693,
-0.05475851520895958,
0.0842554122209549,
-0.022040201351046562,
-0.1553604155778885,
0.04147952422499657,
0.04605214670300484,
-0.10927402973175049,
0.11837435513734818,
0.05578752979636192,
0.1367022842168808,
0.02636982686817646,
-0.04433154687285423,
-0.1407824158668518,
0.03448762744665146,
-0.014720061793923378,
-0.061376310884952545,
0.027768997475504875,
0.09636995941400528,
-0.1085098460316658,
0.08680911362171173,
0.07155270874500275,
-0.0015284063993021846,
0.028706638142466545,
-0.15333975851535797,
-0.040771156549453735,
-0.029247241094708443,
0.05326661095023155,
0.07017087191343307,
0.0938696637749672,
-0.0023286195937544107,
0.12940660119056702,
-0.06973444670438766,
0.12064070254564285,
0.14831306040287018,
-0.3069976568222046,
-0.023601431399583817,
0.051164355129003525,
0.08196950703859329,
0.060703326016664505,
-0.013138356618583202,
0.038508057594299316,
0.004770491272211075,
0.03309858217835426,
-0.010194709524512291,
-0.09564581513404846,
-0.11341304332017899,
0.010270420461893082,
-0.0789228230714798,
-0.049713026732206345,
0.19981756806373596,
-0.08309976011514664,
0.07265247404575348,
-0.028500063344836235,
-0.10695131123065948,
-0.03853067383170128,
-0.027682418003678322,
-0.003380002686753869,
-0.06605763733386993,
0.069659523665905,
0.060532331466674805,
-0.07793574035167694,
-0.14536599814891815,
0.022297315299510956,
-0.20593146979808807,
0.16288544237613678,
0.018109871074557304,
0.05828370898962021,
-0.20405270159244537,
0.06471621245145798,
-0.000674026960041374,
-0.11155436187982559,
0.062426161020994186,
-0.1034383475780487,
0.056312501430511475,
-0.014963206835091114,
-0.06676226109266281,
-0.07738035917282104,
0.06127918139100075,
0.13003574311733246,
0.01805858500301838,
0.0425294004380703,
0.000416925351601094,
0.08716429769992828,
0.031563982367515564,
0.07732700556516647,
0.038204751908779144,
-0.043173011392354965,
0.05311184003949165,
-0.11223647743463516,
0.026160402223467827,
-0.06356550008058548,
-0.1620456576347351,
-0.04866127669811249,
0.03333169221878052,
0.0648343563079834,
0.030341170728206635,
0.04835530370473862,
-0.05115034803748131,
-0.02135358192026615,
0.042705047875642776,
-0.07428907603025436,
0.012977183796465397,
-0.011686811223626137,
0.027593163773417473,
0.1733904331922531,
0.01275497768074274,
0.015933217480778694,
-0.11262805014848709,
0.10365206748247147,
-0.04144426807761192,
0.022225968539714813,
-0.031255170702934265,
-0.06193317472934723,
0.032064761966466904,
-0.10890514403581619,
0.019574658945202827,
-0.15642012655735016,
-0.1485527753829956,
-0.002933168550953269,
0.03374786674976349,
-0.012360223568975925,
-0.04730048030614853,
-0.03417011722922325,
-0.013707662001252174,
0.052129972726106644,
-0.05791617929935455,
-0.07272955775260925,
-0.04702608659863472,
0.09800393879413605,
-0.02281559631228447,
0.09270476549863815,
-0.150568425655365,
0.06550474464893341,
-0.11241579800844193,
-0.03566019609570503,
-0.09736364334821701,
0.03307387977838516,
0.006077549885958433,
0.12585744261741638,
0.017013506963849068,
-0.016991665586829185,
-0.08182661980390549,
0.06388788670301437,
-0.02339104935526848,
0.20010937750339508,
-0.07624473422765732,
-0.10946335643529892,
0.2802630066871643,
-0.09066344052553177,
-0.17479628324508667,
0.099631667137146,
0.01105901412665844,
0.06156383827328682,
0.08827709406614304,
0.1586672067642212,
0.07052227854728699,
-0.03904104232788086,
0.10744263976812363,
0.10079006850719452,
-0.07359275221824646,
-0.1151127740740776,
-0.002174675464630127,
-0.012823573313653469,
-0.06286858022212982,
0.04804248735308647,
0.10530757158994675,
0.0724087730050087,
-0.0541774146258831,
-0.02969217859208584,
-0.02882094867527485,
-0.010560687631368637,
0.09563767910003662,
0.027810651808977127,
0.12510286271572113,
-0.08220651745796204,
-0.017728282138705254,
0.02057051472365856,
-0.024810077622532845,
0.003369437763467431,
0.05307779088616371,
-0.02282854914665222,
0.11395441740751266,
0.004360213875770569,
0.03882364556193352,
-0.19840337336063385,
-0.09772457927465439,
-0.03439540043473244,
0.13961806893348694,
0.015888620167970657,
0.10937141627073288,
0.06904613226652145,
-0.02631063014268875,
-0.014846011064946651,
0.009900660254061222,
0.13852858543395996,
-0.0032476484775543213,
-0.08019904792308807,
-0.05285057798027992,
0.07773507386445999,
-0.0647035464644432,
0.0158902108669281,
-0.032376978546381,
0.032091882079839706,
0.005679324734956026,
0.126190647482872,
0.008325958624482155,
0.04936828464269638,
-0.03833828866481781,
0.048992108553647995,
-0.07879862934350967,
0.033698808401823044,
0.10792797058820724,
0.012190357781946659,
-0.03352833539247513,
0.20314250886440277,
-0.18690161406993866,
0.27732783555984497,
0.21701301634311676,
-0.28866350650787354,
-0.0028289752081036568,
-0.031127361580729485,
-0.0049080392345786095,
0.012028669938445091,
0.017860518768429756,
0.005422690417617559,
0.06336858868598938,
0.011650816537439823,
0.19027258455753326,
-0.033916812390089035,
-0.029085587710142136,
-0.0008950633346103132,
-0.09117268770933151,
-0.019828008487820625,
0.06359892338514328,
0.0754622146487236,
-0.11167893558740616,
0.18867111206054688,
0.22761639952659607,
-0.005664023570716381,
0.15937748551368713,
0.01669486053287983,
0.0002934074145741761,
0.06447967886924744,
-0.02251943200826645,
-0.03158864378929138,
-0.07360739260911942,
-0.17277352511882782,
-0.0323944091796875,
0.06492999196052551,
0.02103135548532009,
0.09078200906515121,
-0.13177452981472015,
-0.03473881632089615,
0.006408968474715948,
0.016965724527835846,
-0.02021351270377636,
0.09830889105796814,
0.06264516711235046,
0.10635320842266083,
-0.03223403915762901,
-0.026686690747737885,
0.08994961529970169,
0.031702350825071335,
-0.08927229791879654,
0.15803951025009155,
-0.13032759726047516,
-0.33472248911857605,
-0.1961967945098877,
-0.17081882059574127,
-0.02966659516096115,
0.058547187596559525,
0.13498379290103912,
-0.06335697323083878,
-0.01770716719329357,
0.01830151490867138,
0.038136716932058334,
-0.059371933341026306,
0.02125345915555954,
-0.059107840061187744,
0.056364163756370544,
-0.08093783259391785,
-0.04274408519268036,
-0.06570637226104736,
-0.016855569556355476,
-0.0026700443122535944,
0.14938177168369293,
-0.13856683671474457,
0.0711100772023201,
0.15621811151504517,
-0.005724868271499872,
0.0581824816763401,
-0.0204726941883564,
0.1929926723241806,
-0.07750783115625381,
0.009875580668449402,
0.23746556043624878,
-0.043749213218688965,
0.09421402961015701,
0.138494074344635,
0.008542743511497974,
-0.069229356944561,
0.024806618690490723,
-0.054768387228250504,
-0.09452919661998749,
-0.2286616861820221,
-0.10746050626039505,
-0.13038843870162964,
0.06827724725008011,
0.04387153312563896,
0.03257381543517113,
0.11580502241849899,
0.08189541846513748,
-0.001768745481967926,
0.04645032808184624,
0.016050761565566063,
0.08648503571748734,
0.2514629065990448,
0.00127126753795892,
0.14783121645450592,
-0.06828110665082932,
-0.1300143152475357,
0.0998758003115654,
0.054202642291784286,
0.08561010658740997,
0.08967795222997665,
0.02371787466108799,
0.002894588513299823,
0.06688637286424637,
0.13068251311779022,
0.11806440353393555,
0.05838426202535629,
-0.015465357340872288,
-0.012761455029249191,
-0.010601499117910862,
-0.062138259410858154,
0.04551485925912857,
0.05189701169729233,
-0.14786280691623688,
-0.06467288732528687,
-0.07705499231815338,
0.06037766858935356,
0.08525983989238739,
0.06848956644535065,
-0.2065671980381012,
-0.0016404568450525403,
0.08934172987937927,
-0.018262570723891258,
-0.10992186516523361,
0.06580743193626404,
-0.024901799857616425,
-0.1476857215166092,
0.07907038927078247,
-0.04183332622051239,
0.12713627517223358,
-0.0483415462076664,
0.0881928950548172,
-0.06425303965806961,
-0.1288832277059555,
0.03440796956419945,
0.09157112240791321,
-0.2844795882701874,
0.21620750427246094,
-0.0030165596399456263,
-0.057867277413606644,
-0.07066868990659714,
-0.006100798025727272,
0.031237421557307243,
0.15481482446193695,
0.0766729786992073,
-0.0012382868444547057,
-0.09282297641038895,
-0.11645262688398361,
-0.018441256135702133,
0.009011243470013142,
0.13744759559631348,
-0.02134038135409355,
0.006231438834220171,
-0.052070535719394684,
-0.024937773123383522,
-0.04402768611907959,
-0.015024320222437382,
0.005306920036673546,
-0.18535037338733673,
0.07847286760807037,
0.051017697900533676,
0.036091018468141556,
0.030454594641923904,
-0.04382651671767235,
-0.09519557654857635,
0.21587572991847992,
-0.03829314559698105,
-0.10007729381322861,
-0.11453396826982498,
-0.09901604056358337,
0.04248863831162453,
-0.09575305134057999,
0.057071566581726074,
-0.09200248122215271,
0.021657638251781464,
-0.08639182895421982,
-0.18853840231895447,
0.1293722689151764,
-0.10336440056562424,
-0.012767946347594261,
-0.06509127467870712,
0.17806433141231537,
-0.07667209208011627,
0.016387995332479477,
0.022154074162244797,
0.016217214986681938,
-0.12289292365312576,
-0.07649209350347519,
-0.01373227871954441,
0.0025731856003403664,
0.03233199566602707,
0.015193344093859196,
-0.07542119175195694,
-0.047841135412454605,
-0.02482183650135994,
-0.026332199573516846,
0.29312369227409363,
0.17331668734550476,
-0.05373896285891533,
0.159318745136261,
0.09618104249238968,
-0.08316412568092346,
-0.33121350407600403,
-0.115946464240551,
-0.08833862096071243,
-0.017579875886440277,
-0.030331043526530266,
-0.15357118844985962,
0.0726977288722992,
-0.008525566197931767,
-0.02361355535686016,
0.10449516028165817,
-0.25195446610450745,
-0.10325172543525696,
0.15061473846435547,
-0.001681105000898242,
0.38637664914131165,
-0.12854944169521332,
-0.09129916876554489,
-0.06323869526386261,
-0.18781696259975433,
0.12713280320167542,
-0.02424454875290394,
0.08665256947278976,
-0.025318730622529984,
0.1189311295747757,
0.03689003363251686,
-0.03186932951211929,
0.09087345004081726,
0.0002512325590942055,
-0.008919776417315006,
-0.11167082190513611,
-0.006444161757826805,
0.06078359857201576,
-0.01989692822098732,
0.027989648282527924,
-0.018387574702501297,
0.003936428111046553,
-0.16014201939105988,
-0.037804458290338516,
-0.09336313605308533,
0.04309866577386856,
0.026587700471282005,
-0.03350997343659401,
0.0279763825237751,
-0.07552820444107056,
-0.014507034793496132,
0.011567247100174427,
0.20935572683811188,
-0.05274742469191551,
0.1701357215642929,
0.1241312101483345,
0.10410413891077042,
-0.135208860039711,
0.02291138656437397,
-0.07015801221132278,
-0.07691501080989838,
0.04072810336947441,
-0.08244165778160095,
0.05210975185036659,
0.12575723230838776,
-0.05860447883605957,
0.03527315706014633,
0.1000896468758583,
0.027045993134379387,
-0.009989666752517223,
0.14556513726711273,
-0.25419098138809204,
0.01899345964193344,
-0.06604690104722977,
0.007670600898563862,
0.06569205969572067,
0.05836820974946022,
0.17258399724960327,
0.03277072310447693,
-0.06186176836490631,
-0.013284128159284592,
-0.0066999453119933605,
-0.026414699852466583,
0.06506989896297455,
0.020127547904849052,
0.01753195747733116,
-0.11713390797376633,
0.046958230435848236,
0.01666394993662834,
-0.2020876258611679,
-0.007182493805885315,
0.20513387024402618,
-0.1323099434375763,
-0.12713544070720673,
0.025219053030014038,
0.06982459872961044,
-0.17893916368484497,
-0.03580104187130928,
-0.07586657255887985,
-0.12027594447135925,
0.06571675091981888,
0.1958850920200348,
0.10050273686647415,
0.06479695439338684,
-0.03090684860944748,
-0.035288020968437195,
-0.01921747624874115,
-0.0017136972164735198,
0.05709562450647354,
0.04122164472937584,
-0.07452833652496338,
0.03299440070986748,
-0.0369621179997921,
0.14526019990444183,
-0.09112148731946945,
-0.055046677589416504,
-0.14336557686328888,
0.037185199558734894,
-0.1270676851272583,
-0.05180307477712631,
-0.09207983314990997,
-0.06043447554111481,
-0.005520303267985582,
-0.03323851525783539,
-0.040147148072719574,
-0.057040657848119736,
-0.11917484551668167,
0.01771133951842785,
-0.05767437070608139,
0.010284320451319218,
-0.08568691462278366,
-0.015235191211104393,
0.1039268895983696,
-0.042803216725587845,
0.09270195662975311,
0.14184673130512238,
-0.08834710717201233,
0.09435155242681503,
-0.11113779991865158,
-0.09970304369926453,
0.10261854529380798,
0.02926977165043354,
0.06631676852703094,
0.07538145035505295,
0.02120576798915863,
0.0845770463347435,
0.05438769981265068,
0.06188758835196495,
0.05556515231728554,
-0.10286565124988556,
0.05364862456917763,
-0.035131338983774185,
-0.18136291205883026,
-0.03889384865760803,
-0.03900923952460289,
0.09032551199197769,
0.008460118435323238,
0.14533020555973053,
-0.058411095291376114,
0.12033247947692871,
-0.03810322657227516,
0.026920504868030548,
-0.0022480199113488197,
-0.18228110671043396,
-0.04332048445940018,
-0.09787475317716599,
0.012237093411386013,
0.020677054300904274,
0.2113143354654312,
0.02447398751974106,
0.09441017359495163,
0.0420667938888073,
0.058334771543741226,
-0.0022263480350375175,
0.018762605264782906,
0.16429509222507477,
0.10088097304105759,
-0.06769854575395584,
-0.09952136874198914,
0.07575821131467819,
0.027615059167146683,
0.020410293713212013,
0.1669931709766388,
0.01749906688928604,
-0.007070209365338087,
0.1254429966211319,
-0.02630334533751011,
0.08790914714336395,
-0.13574211299419403,
-0.21889838576316833,
-0.019804028794169426,
0.05489349737763405,
-0.009880405850708485,
0.08519899100065231,
0.1385723054409027,
-0.01635821722447872,
0.024940647184848785,
-0.05623963102698326,
-0.060569655150175095,
-0.19362306594848633,
-0.09277856349945068,
-0.08268585056066513,
-0.10180290043354034,
-0.014712493866682053,
-0.07669606059789658,
0.034582480788230896,
0.07411330193281174,
0.04844081401824951,
-0.06099122762680054,
0.08698969334363937,
0.03972983360290527,
-0.08971982449293137,
0.05367860570549965,
-0.03972978517413139,
0.06400992721319199,
0.0155027499422431,
-0.01754131354391575,
-0.14581698179244995,
-0.011364766396582127,
-0.022526832297444344,
0.061385203152894974,
-0.07153992354869843,
-0.010531934909522533,
-0.15179623663425446,
-0.11668801307678223,
-0.044254496693611145,
0.06920216232538223,
-0.03137056529521942,
0.11045622825622559,
0.01599840074777603,
-0.0007468123221769929,
0.03391840308904648,
0.20826643705368042,
-0.07851770520210266,
-0.09784555435180664,
-0.03908612206578255,
0.21046151220798492,
0.06415607780218124,
0.10688471049070358,
-0.035213347524404526,
0.007704419083893299,
-0.0994245782494545,
0.3409350514411926,
0.2867956757545471,
-0.07978424429893494,
0.02958511747419834,
0.03648075833916664,
0.04699551686644554,
0.12769319117069244,
0.12596286833286285,
0.07956866919994354,
0.2516948878765106,
-0.08430477976799011,
-0.03634202107787132,
-0.03634164482355118,
-0.018995359539985657,
-0.13541501760482788,
0.0857229009270668,
0.023966001346707344,
-0.05911729112267494,
-0.03720174729824066,
0.11271587014198303,
-0.2227962613105774,
0.1707015484571457,
-0.051902566105127335,
-0.19755250215530396,
-0.054956935346126556,
0.00896263774484396,
0.1479974389076233,
-0.001425327849574387,
0.09814303368330002,
-0.0043182880617678165,
-0.10310788452625275,
0.09938514977693558,
0.011675244197249413,
-0.22907690703868866,
0.006209614686667919,
0.0485866405069828,
-0.12346213310956955,
-0.019764546304941177,
-0.013896752148866653,
0.04705127701163292,
0.06802719086408615,
0.06532581150531769,
-0.015458318404853344,
0.02425820752978325,
0.015196547843515873,
0.001029869425110519,
0.006488078739494085,
0.039524778723716736,
0.0029825230594724417,
-0.11641877144575119,
0.06432265788316727,
-0.14936137199401855,
0.04568292945623398,
-0.05962099879980087,
-0.020651979371905327,
0.021721867844462395,
0.024487338960170746,
-0.050546206533908844,
0.044824473559856415,
0.07680115103721619,
0.0020370136480778456,
-0.009929941035807133,
-0.029844362288713455,
-0.04111442714929581,
0.016692133620381355,
-0.06962182372808456,
-0.12554603815078735,
-0.11211634427309036,
-0.12078654021024704,
0.08791758120059967,
0.016070706769824028,
-0.19791674613952637,
-0.013232938013970852,
-0.11572209000587463,
0.056993354111909866,
-0.1938692033290863,
0.10010140389204025,
0.05595893785357475,
-0.0007451901328749955,
-0.005284659098833799,
-0.041861411184072495,
0.052014973014593124,
0.08704550564289093,
-0.10205619037151337,
-0.0989319309592247
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tune-Pegasus
This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on the xsum dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3242
- Rouge1: 17.993
- Rouge2: 2.9392
- Rougel: 12.313
- Rougelsum: 13.3091
- Gen Len: 67.0552
## Model description
More information needed
## Intended uses & limitations
More information needed
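
As a starting point, the checkpoint can be loaded with the `transformers` summarization pipeline. This is a minimal sketch rather than an official usage recipe: the checkpoint id is taken from this card's metadata, and the example article and `max_length` value are placeholders.

```python
from transformers import pipeline

# Sketch only: load this fine-tuned checkpoint for abstractive summarization.
summarizer = pipeline("summarization", model="QuickRead/fine-tune-Pegasus")

article = "Paste the article you want to summarize here."  # placeholder input
# max_length is a placeholder roughly matching the reported average generation length.
print(summarizer(article, max_length=70)[0]["summary_text"])
```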
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.35e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 1.0
- mixed_precision_training: Native AMP
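
For reference, the hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as follows. This is a hedged sketch, not the exact training script: `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they are not set explicitly.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the settings listed above (transformers 4.16-era argument names).
training_args = Seq2SeqTrainingArguments(
    output_dir="fine-tune-Pegasus",   # placeholder output directory
    learning_rate=6.35e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=1.0,
    fp16=True,                        # mixed precision training (native AMP)
    predict_with_generate=True,       # assumption: needed to compute ROUGE during eval
)
```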
### Training results
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.1
- Datasets 1.17.0
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "datasets": ["xsum"], "metrics": ["rouge"], "model-index": [{"name": "fine-tune-Pegasus", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "xsum", "type": "xsum", "args": "default"}, "metrics": [{"type": "rouge", "value": 17.993, "name": "Rouge1"}]}]}]} | text2text-generation | QuickRead/fine-tune-Pegasus | [
"transformers",
"pytorch",
"pegasus",
"text2text-generation",
"generated_from_trainer",
"dataset:xsum",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #dataset-xsum #model-index #autotrain_compatible #endpoints_compatible #region-us
|
# fine-tune-Pegasus
This model is a fine-tuned version of google/pegasus-large on the xsum dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3242
- Rouge1: 17.993
- Rouge2: 2.9392
- Rougel: 12.313
- Rougelsum: 13.3091
- Gen Len: 67.0552
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.35e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 1.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.1
- Datasets 1.17.0
- Tokenizers 0.10.3
| [
"# fine-tune-Pegasus\n\nThis model is a fine-tuned version of google/pegasus-large on the xsum dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 2.3242\n- Rouge1: 17.993\n- Rouge2: 2.9392\n- Rougel: 12.313\n- Rougelsum: 13.3091\n- Gen Len: 67.0552",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 6.35e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.1\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #dataset-xsum #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"# fine-tune-Pegasus\n\nThis model is a fine-tuned version of google/pegasus-large on the xsum dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 2.3242\n- Rouge1: 17.993\n- Rouge2: 2.9392\n- Rougel: 12.313\n- Rougelsum: 13.3091\n- Gen Len: 67.0552",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 6.35e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.1\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
57,
85,
6,
12,
8,
3,
120,
4,
30
] | [
"passage: TAGS\n#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #dataset-xsum #model-index #autotrain_compatible #endpoints_compatible #region-us \n# fine-tune-Pegasus\n\nThis model is a fine-tuned version of google/pegasus-large on the xsum dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 2.3242\n- Rouge1: 17.993\n- Rouge2: 2.9392\n- Rougel: 12.313\n- Rougelsum: 13.3091\n- Gen Len: 67.0552## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 6.35e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1.0\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.1\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
-0.10061316937208176,
0.15815170109272003,
-0.0034844817128032446,
0.07043377310037613,
0.1372310072183609,
0.029794372618198395,
0.064967580139637,
0.13718903064727783,
-0.033152926713228226,
0.11948690563440323,
0.09085584431886673,
0.0347362756729126,
0.08892984688282013,
0.1560259461402893,
0.03881567344069481,
-0.2738052010536194,
0.018619369715452194,
-0.01216825284063816,
-0.06002155691385269,
0.090980663895607,
0.11033497750759125,
-0.0750255435705185,
0.06360232084989548,
0.017770318314433098,
-0.09853816032409668,
0.001857395633123815,
-0.04348276928067207,
-0.04227868840098381,
0.08334581553936005,
0.02976926788687706,
0.036411065608263016,
0.0038328629452735186,
0.11345318704843521,
-0.26666226983070374,
-0.00005143953239894472,
0.059133436530828476,
0.034531086683273315,
0.09120310842990875,
0.07888078689575195,
0.018032221123576164,
0.11110585182905197,
-0.1674867570400238,
0.09531368315219879,
0.05783881992101669,
-0.07656048983335495,
-0.18889231979846954,
-0.09432417899370193,
0.08291500806808472,
0.09335088729858398,
0.0980784222483635,
-0.00691567175090313,
0.129710391163826,
-0.04838830605149269,
0.05397409200668335,
0.24405686557292938,
-0.23653319478034973,
-0.07054589688777924,
0.029399322345852852,
0.09852366894483566,
0.036641817539930344,
-0.10439680516719818,
0.013140399940311909,
0.04105152562260628,
0.0035450407303869724,
0.08113197237253189,
0.01252006646245718,
0.06483914703130722,
-0.027801774442195892,
-0.11648597568273544,
-0.06923243403434753,
0.16551490128040314,
0.09423497319221497,
-0.04110443964600563,
-0.15558049082756042,
-0.016209140419960022,
-0.10912865400314331,
-0.056953735649585724,
-0.03721816465258598,
0.03251517191529274,
-0.04592382162809372,
-0.022243071347475052,
-0.053081534802913666,
-0.05214088782668114,
-0.06042756885290146,
0.06681506335735321,
0.10717763006687164,
0.04478337615728378,
-0.022225119173526764,
0.008187619037926197,
0.08035790175199509,
0.0313214436173439,
-0.11034294962882996,
-0.059748001396656036,
-0.012654053047299385,
-0.09659358114004135,
-0.04199395701289177,
-0.02327246591448784,
0.08352576941251755,
0.02744162827730179,
0.19664128124713898,
-0.03495727851986885,
0.09112346172332764,
0.053159188479185104,
-0.007673005573451519,
-0.005298202857375145,
0.15069980919361115,
-0.037662066519260406,
-0.05086507648229599,
-0.02283601090312004,
0.08818605542182922,
-0.009097549133002758,
-0.021698055788874626,
-0.051510389894247055,
-0.024239087477326393,
0.08278064429759979,
0.07549144327640533,
0.024165907874703407,
0.003779095131903887,
-0.08014090359210968,
-0.037668101489543915,
0.04122218117117882,
-0.12033257633447647,
0.059602946043014526,
-0.000628003675956279,
-0.0607575960457325,
-0.010817945934832096,
0.02461468055844307,
0.00007457895844709128,
-0.06206390634179115,
0.017739132046699524,
-0.07348156720399857,
-0.024567006155848503,
-0.05665788799524307,
-0.031121401116251945,
0.025557061657309532,
-0.04216616600751877,
-0.001572715467773378,
-0.07348611205816269,
-0.1180456206202507,
-0.06636244803667068,
0.04005051776766777,
-0.08214753866195679,
-0.11295861750841141,
-0.02681857906281948,
-0.02741369418799877,
0.040621738880872726,
-0.007164912298321724,
0.11719068884849548,
-0.03155144676566124,
0.06061693653464317,
-0.003217389341443777,
0.030203862115740776,
0.06680544465780258,
0.04787450283765793,
-0.04897044971585274,
0.057106345891952515,
-0.05885150656104088,
0.09971010684967041,
-0.11633904278278351,
0.012308655306696892,
-0.15971258282661438,
-0.10553311556577682,
-0.03809456154704094,
-0.013643529266119003,
0.08585663139820099,
0.1493309736251831,
-0.10337794572114944,
-0.03933116793632507,
0.17075404524803162,
-0.040550149977207184,
-0.09735486656427383,
0.10106933116912842,
-0.00801109243184328,
-0.00399235263466835,
0.04168437048792839,
0.12780188024044037,
0.11989350616931915,
-0.1143137738108635,
-0.011472647078335285,
0.024066876620054245,
0.09042587876319885,
0.027653690427541733,
0.09931459277868271,
-0.04852481558918953,
-0.028184182941913605,
0.020209025591611862,
-0.06995152682065964,
0.004796943627297878,
-0.08054524660110474,
-0.08055314421653748,
-0.03825651481747627,
-0.04516056552529335,
0.03521246835589409,
0.022284099832177162,
0.0048514194786548615,
-0.048215821385383606,
-0.12410198897123337,
0.025241248309612274,
0.12379304319620132,
-0.04836905002593994,
0.013041266240179539,
-0.050819702446460724,
0.022119006142020226,
0.02740013226866722,
0.003625914454460144,
-0.17192061245441437,
-0.14514894783496857,
0.06627791374921799,
-0.15043103694915771,
0.011727061122655869,
-0.03457453474402428,
0.05051190406084061,
0.04344610869884491,
-0.027007097378373146,
-0.030714573338627815,
-0.11019793152809143,
-0.018826719373464584,
-0.09748062491416931,
-0.1494821459054947,
-0.06345159560441971,
-0.021204961463809013,
0.18915317952632904,
-0.234903484582901,
0.0026958913076668978,
-0.03344330936670303,
0.12385082989931107,
-0.010712433606386185,
-0.07428733259439468,
0.005166323855519295,
0.010112244635820389,
-0.0013255976373329759,
-0.10544443875551224,
0.036681078374385834,
0.002580104861408472,
-0.10380180925130844,
-0.016299953684210777,
-0.13506603240966797,
0.01694049872457981,
0.06029855087399483,
0.088348887860775,
-0.08897754549980164,
-0.07049448043107986,
-0.06654214859008789,
-0.03902237489819527,
-0.08006013184785843,
-0.013223424553871155,
0.13993076980113983,
0.01629306562244892,
0.10647813230752945,
-0.05747094005346298,
-0.07508855313062668,
0.03249555453658104,
0.01988052949309349,
-0.041610319167375565,
0.11423397809267044,
0.09485642611980438,
-0.07246895879507065,
0.06997207552194595,
0.06129208207130432,
0.009662782773375511,
0.1035214364528656,
-0.03877291455864906,
-0.09103848785161972,
-0.024003636091947556,
0.027334580197930336,
0.0016725730383768678,
0.0962279662489891,
-0.1015457734465599,
0.011106951162219048,
0.05080219730734825,
0.014559871517121792,
0.01869216002523899,
-0.12648265063762665,
-0.0031214456539601088,
0.0313575454056263,
-0.03836839273571968,
-0.00638685142621398,
-0.01596001721918583,
-0.014268486760556698,
0.08405999839305878,
0.05452360585331917,
0.009095659479498863,
-0.004867760464549065,
-0.013586065731942654,
-0.10292194038629532,
0.18252235651016235,
-0.08633619546890259,
-0.17904487252235413,
-0.1129603162407875,
0.053568024188280106,
-0.0385855995118618,
-0.02340056747198105,
0.01955604739487171,
-0.08757615834474564,
-0.05134030058979988,
-0.0860920175909996,
-0.02314414456486702,
-0.08120923489332199,
0.0007653001812286675,
0.06959032267332077,
0.013871570117771626,
0.08572110533714294,
-0.10943249613046646,
-0.0018265352118760347,
-0.008944493718445301,
-0.05743703618645668,
-0.020990537479519844,
0.01062515564262867,
0.10335815697908401,
0.08837065100669861,
0.027548065409064293,
0.016881518065929413,
-0.031082183122634888,
0.23189474642276764,
-0.09479721635580063,
0.004169331397861242,
0.1224791407585144,
0.06510550528764725,
0.0699533000588417,
0.10010559856891632,
0.024771712720394135,
-0.08620240539312363,
0.04378737136721611,
0.051339052617549896,
-0.018324799835681915,
-0.25185781717300415,
-0.050903141498565674,
-0.04399576038122177,
-0.05445545166730881,
0.16161103546619415,
0.06547164916992188,
-0.012450255453586578,
0.058632608503103256,
-0.06582164764404297,
0.05986195057630539,
0.012762205675244331,
0.09233204275369644,
0.07952124625444412,
0.04040774703025818,
0.08293136209249496,
-0.017049064859747887,
-0.01641087792813778,
0.06773947179317474,
-0.00746057229116559,
0.2567217946052551,
0.018540389835834503,
0.16203883290290833,
0.013338400050997734,
0.14688313007354736,
-0.028973443433642387,
0.032962389290332794,
0.07269551604986191,
0.01456490159034729,
0.005976460874080658,
-0.058629319071769714,
-0.034857019782066345,
0.03668325021862984,
0.05148661509156227,
0.0052458723075687885,
-0.10098235309123993,
0.03251827135682106,
-0.010832205414772034,
0.25512275099754333,
0.047698039561510086,
-0.2886562943458557,
-0.07974226772785187,
0.009120175614953041,
-0.022498417645692825,
-0.09841233491897583,
0.0011387703707441688,
0.044280022382736206,
-0.17229847609996796,
0.07051956653594971,
-0.04834364727139473,
0.09891103208065033,
-0.04192237928509712,
-0.009852509014308453,
0.051753368228673935,
0.09172451496124268,
0.008699637837707996,
0.09643864631652832,
-0.18719545006752014,
0.22753563523292542,
-0.01679151877760887,
0.0715232864022255,
-0.055273085832595825,
0.05680226534605026,
0.006190773099660873,
0.019571399316191673,
0.15239796042442322,
0.020770162343978882,
-0.07912988215684891,
-0.17082101106643677,
-0.11506948620080948,
0.03860515356063843,
0.11675144731998444,
-0.1191335991024971,
0.07629452645778656,
-0.060601793229579926,
-0.018233269453048706,
0.018941141664981842,
-0.05803840979933739,
-0.19867685437202454,
-0.16556715965270996,
0.04883774369955063,
-0.025656122714281082,
0.02251533977687359,
-0.08098044991493225,
-0.09938772767782211,
-0.02133755013346672,
0.18879815936088562,
-0.0315263457596302,
-0.045729659497737885,
-0.16699708998203278,
0.10044741630554199,
0.14396630227565765,
-0.08637034893035889,
0.05140841007232666,
-0.008956712670624256,
0.17343944311141968,
0.031188257038593292,
-0.045418061316013336,
0.06724472343921661,
-0.08392287790775299,
-0.16172540187835693,
-0.036118049174547195,
0.16303925216197968,
0.03172474727034569,
0.05793127045035362,
0.021494172513484955,
0.028446460142731667,
-0.00795019418001175,
-0.11201154440641403,
0.04631209000945091,
0.08667560666799545,
0.017285626381635666,
0.053914330899715424,
-0.07183374464511871,
0.03606174513697624,
-0.05898713693022728,
-0.02320852316915989,
0.11966168880462646,
0.25143498182296753,
-0.0887228325009346,
0.07939184457063675,
0.04875709116458893,
-0.0858810767531395,
-0.1734989434480667,
-0.011730821803212166,
0.13903455436229706,
0.008336402475833893,
0.07039978355169296,
-0.22798267006874084,
0.08486325293779373,
0.09864825755357742,
-0.0303387101739645,
-0.0033241095952689648,
-0.28402912616729736,
-0.11956422030925751,
0.05506046116352081,
0.07505028694868088,
0.01601710543036461,
-0.12215697765350342,
-0.0709402933716774,
-0.046434566378593445,
-0.11275999248027802,
0.09108585119247437,
0.015771765261888504,
0.09486738592386246,
-0.013321706093847752,
0.022318152710795403,
0.03705056384205818,
-0.021661775186657906,
0.1707410216331482,
0.02925276942551136,
0.026293223723769188,
-0.040769197046756744,
0.06209336966276169,
0.011609240435063839,
-0.07708913087844849,
0.0803341269493103,
-0.029260462149977684,
0.06853615492582321,
-0.17434963583946228,
-0.03510648384690285,
-0.04698943346738815,
0.07935076951980591,
-0.06754086911678314,
-0.026236843317747116,
-0.031096862629055977,
0.051444899290800095,
0.08321273326873779,
0.0005117722903378308,
0.0670870915055275,
0.0332757905125618,
0.05648118257522583,
0.022824211046099663,
0.10350753366947174,
0.04677414894104004,
-0.15581496059894562,
-0.02744937315583229,
-0.014239869080483913,
0.04726475104689598,
-0.11195048689842224,
0.019656172022223473,
0.11317168921232224,
0.04777051880955696,
0.1082257479429245,
0.023217609152197838,
-0.07273634523153305,
-0.0068783811293542385,
0.03547283634543419,
-0.054671525955200195,
-0.22869715094566345,
-0.04005212336778641,
0.018213365226984024,
-0.17533430457115173,
-0.024546844884753227,
0.08525240421295166,
-0.05141156539320946,
-0.03766542673110962,
-0.024375401437282562,
0.032324012368917465,
0.017020534723997116,
0.15159370005130768,
0.019651377573609352,
0.08442471921443939,
-0.07179498672485352,
0.08968358486890793,
0.08543165028095245,
-0.07725226879119873,
0.060516346246004105,
0.08294323831796646,
-0.0729118138551712,
-0.03039185144007206,
0.05959118530154228,
0.045769356191158295,
-0.019738174974918365,
-0.01981898583471775,
-0.03845306113362312,
-0.08265308290719986,
0.06119903177022934,
-0.03688865900039673,
0.029066452756524086,
-0.019204627722501755,
0.014052401296794415,
0.018483199179172516,
-0.12202715128660202,
0.08238160610198975,
0.06487209349870682,
0.06246785819530487,
-0.1107621043920517,
0.0008947040187194943,
0.03280454874038696,
0.05988286808133125,
-0.004801150411367416,
-0.006211536470800638,
-0.0966930240392685,
-0.029592707753181458,
-0.0590488538146019,
-0.009704070165753365,
-0.04663060978055,
0.0015920717269182205,
-0.03643287718296051,
-0.053638655692338943,
-0.02522503212094307,
0.05332757160067558,
-0.05905749648809433,
-0.11134494096040726,
-0.013224232010543346,
0.10433702170848846,
-0.11342528462409973,
0.01196574978530407,
0.06511479616165161,
-0.12395496666431427,
0.09622328728437424,
0.03537704795598984,
0.049915656447410583,
0.006677395664155483,
-0.09307999163866043,
0.00604398176074028,
-0.006643862463533878,
0.029174935072660446,
0.025840235874056816,
-0.14022840559482574,
0.005609665997326374,
-0.0312945581972599,
0.029113255441188812,
-0.00592737877741456,
-0.017497045919299126,
-0.14032989740371704,
-0.05447809770703316,
-0.08116480708122253,
-0.06069856137037277,
-0.046980638056993484,
0.04950498417019844,
0.04647595435380936,
0.022665593773126602,
0.12238052487373352,
-0.05098127946257591,
0.04995573312044144,
-0.22535207867622375,
-0.02077472023665905,
-0.027658162638545036,
-0.0011346759274601936,
-0.057203806936740875,
-0.04482191801071167,
0.0910373106598854,
-0.01072672102600336,
0.10234609991312027,
-0.009102347306907177,
0.10576323419809341,
0.03277386352419853,
-0.021475285291671753,
0.008939682506024837,
0.009004208259284496,
0.1736088991165161,
0.09575159847736359,
0.0018441890133544803,
0.12347099930047989,
-0.015174640342593193,
0.050918255001306534,
0.09151175618171692,
0.12984208762645721,
0.16399721801280975,
0.011392434127628803,
0.0595601461827755,
0.038697969168424606,
-0.12008821964263916,
-0.1568850427865982,
0.11707211285829544,
-0.026628369465470314,
0.11941322684288025,
-0.043937090784311295,
0.12696896493434906,
0.08136649429798126,
-0.16282762587070465,
0.04677383974194527,
-0.05095916613936424,
-0.09988824278116226,
-0.11505881696939468,
-0.10367502272129059,
-0.09298289567232132,
-0.1229071095585823,
0.014823737554252148,
-0.10261496156454086,
0.014591846615076065,
0.08466435223817825,
0.0067206500098109245,
0.014986216090619564,
0.11083517223596573,
-0.00164752546697855,
-0.019796140491962433,
0.07728631049394608,
0.004187094047665596,
-0.005728156305849552,
-0.04848272353410721,
-0.06579247117042542,
0.045565053820610046,
0.011690312065184116,
0.09960438311100006,
-0.027989501133561134,
-0.027117813006043434,
0.06131066009402275,
-0.005730742122977972,
-0.09272467344999313,
0.012171629816293716,
0.000034307890018681064,
0.006656685844063759,
0.045980218797922134,
0.041829049587249756,
-0.007696264423429966,
-0.04012349620461464,
0.27170443534851074,
-0.02237432450056076,
-0.018791040405631065,
-0.1276574730873108,
0.1402890980243683,
0.04574042186141014,
-0.015187488868832588,
0.04454350471496582,
-0.08294616639614105,
-0.005810543429106474,
0.13933365046977997,
0.08777441829442978,
0.002517652465030551,
-0.026956578716635704,
0.011070233769714832,
-0.01519967895001173,
-0.01554005965590477,
0.1135931983590126,
0.10458242148160934,
0.00729888491332531,
-0.053600773215293884,
0.01152398344129324,
-0.000048811598389875144,
-0.06379333883523941,
-0.10060020536184311,
0.06451824307441711,
0.026674684137105942,
0.01849384233355522,
-0.031595416367053986,
0.10603413730859756,
-0.0018828301690518856,
-0.13953711092472076,
0.02170061320066452,
-0.13032659888267517,
-0.18025042116641998,
-0.02578802965581417,
0.03225167095661163,
0.02892783284187317,
0.06264916807413101,
0.008424978703260422,
-0.021821659058332443,
0.16799230873584747,
-0.007006514351814985,
-0.06651980429887772,
-0.09686381369829178,
0.06508315354585648,
-0.016886334866285324,
0.2717937231063843,
0.0030435388907790184,
0.050394102931022644,
0.10335534065961838,
0.006537869572639465,
-0.16864262521266937,
-0.01214334461838007,
0.08263809233903885,
-0.02168571949005127,
0.0825362503528595,
0.15154649317264557,
-0.0361432209610939,
0.04282673820853233,
0.04345299303531647,
-0.08944200724363327,
-0.03070763871073723,
-0.05887206643819809,
0.008314360864460468,
-0.08320944011211395,
0.04427030682563782,
-0.06876496970653534,
0.1506478488445282,
0.18313437700271606,
-0.07276605814695358,
-0.04664347693324089,
-0.0673842504620552,
0.028496118262410164,
0.02633637562394142,
0.13933828473091125,
-0.009788802824914455,
-0.19899041950702667,
0.011003036051988602,
-0.012247322127223015,
0.04236380755901337,
-0.19340020418167114,
-0.09380073100328445,
0.022490279749035835,
-0.04121583700180054,
-0.037244945764541626,
0.11576896160840988,
0.03484043851494789,
-0.005396642256528139,
-0.039536308497190475,
-0.1538889855146408,
-0.045844968408346176,
0.15309089422225952,
-0.1396285742521286,
-0.04439801722764969
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# pegasus-reddit
This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on the reddit dataset.
It achieves the following results on the evaluation set:
- Loss: 3.3329
- Rouge1: 23.967
- Rouge2: 5.0032
- Rougel: 15.3267
- Rougelsum: 18.5905
- Gen Len: 69.2193
## Model description
More information needed
## Intended uses & limitations
More information needed
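
A minimal inference sketch, assuming the checkpoint id from this card's metadata and default generation settings, would look like the following; the input post and `max_length` are placeholders.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Sketch only: summarize a Reddit-style post with this checkpoint.
tokenizer = AutoTokenizer.from_pretrained("QuickRead/pegasus-reddit")
model = AutoModelForSeq2SeqLM.from_pretrained("QuickRead/pegasus-reddit")

post = "Paste a Reddit post here."  # placeholder input
inputs = tokenizer(post, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=70)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```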
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.35e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 1.0
- mixed_precision_training: Native AMP
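
The cosine schedule with 500 warmup steps that the Trainer builds from these settings can also be reproduced by hand. The sketch below is an approximation, not the original script: the Trainer actually uses AdamW, `num_training_steps` is a placeholder (in practice roughly `len(train_dataloader) * num_epochs`), and the base model id is the pre-fine-tuning checkpoint named above.

```python
import torch
from transformers import AutoModelForSeq2SeqLM, get_cosine_schedule_with_warmup

model = AutoModelForSeq2SeqLM.from_pretrained("google/pegasus-large")
optimizer = torch.optim.AdamW(
    model.parameters(), lr=6.35e-5, betas=(0.9, 0.999), eps=1e-8
)
# Learning rate rises linearly for 500 steps, then decays following a cosine curve.
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=500, num_training_steps=10_000  # placeholder total
)
```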
### Training results
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.1
- Datasets 1.17.0
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "datasets": ["reddit"], "metrics": ["rouge"], "model-index": [{"name": "pegasus-reddit", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "reddit", "type": "reddit", "args": "default"}, "metrics": [{"type": "rouge", "value": 23.967, "name": "Rouge1"}]}]}]} | text2text-generation | QuickRead/pegasus-reddit | [
"transformers",
"pytorch",
"pegasus",
"text2text-generation",
"generated_from_trainer",
"dataset:reddit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #dataset-reddit #model-index #autotrain_compatible #endpoints_compatible #region-us
|
# pegasus-reddit
This model is a fine-tuned version of google/pegasus-large on the reddit dataset.
It achieves the following results on the evaluation set:
- Loss: 3.3329
- Rouge1: 23.967
- Rouge2: 5.0032
- Rougel: 15.3267
- Rougelsum: 18.5905
- Gen Len: 69.2193
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.35e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 1.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.1
- Datasets 1.17.0
- Tokenizers 0.10.3
| [
"# pegasus-reddit\n\nThis model is a fine-tuned version of google/pegasus-large on the reddit dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 3.3329\n- Rouge1: 23.967\n- Rouge2: 5.0032\n- Rougel: 15.3267\n- Rougelsum: 18.5905\n- Gen Len: 69.2193",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 6.35e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.1\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #dataset-reddit #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"# pegasus-reddit\n\nThis model is a fine-tuned version of google/pegasus-large on the reddit dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 3.3329\n- Rouge1: 23.967\n- Rouge2: 5.0032\n- Rougel: 15.3267\n- Rougelsum: 18.5905\n- Gen Len: 69.2193",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 6.35e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.1\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
57,
84,
6,
12,
8,
3,
120,
4,
30
] | [
"passage: TAGS\n#transformers #pytorch #pegasus #text2text-generation #generated_from_trainer #dataset-reddit #model-index #autotrain_compatible #endpoints_compatible #region-us \n# pegasus-reddit\n\nThis model is a fine-tuned version of google/pegasus-large on the reddit dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 3.3329\n- Rouge1: 23.967\n- Rouge2: 5.0032\n- Rougel: 15.3267\n- Rougelsum: 18.5905\n- Gen Len: 69.2193## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 6.35e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 1.0\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.1\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
-0.08968821167945862,
0.15167908370494843,
-0.003516512457281351,
0.06106335669755936,
0.14183315634727478,
0.03170726075768471,
0.08100544661283493,
0.1316315233707428,
-0.021527603268623352,
0.11805922538042068,
0.08950934559106827,
0.04142456129193306,
0.08842755109071732,
0.13110029697418213,
0.027014320716261864,
-0.2692709267139435,
0.030815619975328445,
-0.016563160344958305,
-0.0603005476295948,
0.09752406924962997,
0.1139836385846138,
-0.0751785933971405,
0.06863624602556229,
0.023689933121204376,
-0.11308914422988892,
0.0079001784324646,
-0.04650792479515076,
-0.043663959950208664,
0.08051196485757828,
0.031913720071315765,
0.04121198505163193,
0.01609088107943535,
0.10330123454332352,
-0.2753234803676605,
0.005207553971558809,
0.06580130010843277,
0.030986391007900238,
0.08946718275547028,
0.08902406692504883,
0.008291716687381268,
0.11882001906633377,
-0.15995125472545624,
0.09597288072109222,
0.05993639677762985,
-0.07512760907411575,
-0.18685808777809143,
-0.09035815298557281,
0.07763803750276566,
0.09335639327764511,
0.10553792119026184,
-0.010761680081486702,
0.14264996349811554,
-0.05579586699604988,
0.04425932839512825,
0.24077142775058746,
-0.23054008185863495,
-0.06495789438486099,
0.015189851634204388,
0.08127938956022263,
0.044972267001867294,
-0.10415686666965485,
0.006466568447649479,
0.03240470215678215,
0.0036744452081620693,
0.05853479728102684,
0.0173137579113245,
0.07249701023101807,
-0.04340649023652077,
-0.11764705181121826,
-0.06816496700048447,
0.17020627856254578,
0.10459877550601959,
-0.04196939244866371,
-0.14316771924495697,
-0.020694361999630928,
-0.09359202533960342,
-0.053423795849084854,
-0.03094497136771679,
0.02643042802810669,
-0.03423697501420975,
-0.03574373200535774,
-0.04266224801540375,
-0.054150067269802094,
-0.05528702214360237,
0.060022417455911636,
0.120236337184906,
0.04709843918681145,
-0.015442660078406334,
-0.0034174781758338213,
0.06685525178909302,
0.044589653611183167,
-0.11779356002807617,
-0.055707234889268875,
-0.022021055221557617,
-0.09278777241706848,
-0.03135838732123375,
-0.03406143933534622,
0.041823163628578186,
0.0429268479347229,
0.19010072946548462,
-0.024144228547811508,
0.09502696245908737,
0.04826764017343521,
-0.004034700337797403,
-0.004189849365502596,
0.1488054245710373,
-0.04322829842567444,
-0.042091574519872665,
-0.017419302836060524,
0.07333669811487198,
-0.008118375204503536,
-0.020239830017089844,
-0.053029824048280716,
-0.012930816039443016,
0.08831080049276352,
0.0704089105129242,
0.024741580709815025,
0.010767584666609764,
-0.0732404813170433,
-0.04249957948923111,
0.05750960484147072,
-0.11314204335212708,
0.0470484122633934,
0.004062669351696968,
-0.05601460859179497,
-0.0023295104037970304,
0.010047645308077335,
-0.0034566360991448164,
-0.06013191491365433,
0.020816057920455933,
-0.0782255008816719,
-0.029798582196235657,
-0.055698636919260025,
-0.04162154719233513,
0.02410547435283661,
-0.03587363287806511,
0.007134658750146627,
-0.08576062321662903,
-0.10328665375709534,
-0.07105863094329834,
0.03575188294053078,
-0.07807400077581406,
-0.10521672666072845,
-0.025720790028572083,
-0.026754170656204224,
0.04706720635294914,
-0.0049618082121014595,
0.101163849234581,
-0.03185078874230385,
0.053172554820775986,
-0.01151077076792717,
0.023497790098190308,
0.05323735997080803,
0.050484586507081985,
-0.0645211711525917,
0.053342316299676895,
-0.06328260898590088,
0.10561539977788925,
-0.11499584466218948,
0.01600515842437744,
-0.16448204219341278,
-0.10688454657793045,
-0.03965431824326515,
-0.01770881563425064,
0.08283507078886032,
0.1469000279903412,
-0.09961782395839691,
-0.03949122130870819,
0.1609618216753006,
-0.04945317283272743,
-0.09997954964637756,
0.10153627395629883,
-0.010874487459659576,
-0.014137789607048035,
0.03328835219144821,
0.12764997780323029,
0.1296473890542984,
-0.09065331518650055,
-0.02657773718237877,
0.007535285782068968,
0.08131248503923416,
0.02038644440472126,
0.09648428857326508,
-0.04232867807149887,
-0.012640419416129589,
0.013117036782205105,
-0.05547591298818588,
0.00954885222017765,
-0.07745182514190674,
-0.07150810211896896,
-0.04344373568892479,
-0.048310521990060806,
0.003861777251586318,
0.019885512068867683,
0.006213241256773472,
-0.05687074735760689,
-0.13112184405326843,
0.016489004716277122,
0.11960139125585556,
-0.04832479730248451,
0.013988909311592579,
-0.060376547276973724,
0.0166033748537302,
0.0168666522949934,
0.006647311616688967,
-0.18171274662017822,
-0.15178513526916504,
0.06720758229494095,
-0.14475183188915253,
0.02168012224137783,
-0.030548956245183945,
0.048409514129161835,
0.03159118816256523,
-0.021667908877134323,
-0.03188309445977211,
-0.10806526988744736,
-0.015775999054312706,
-0.0831286683678627,
-0.144673153758049,
-0.05260518565773964,
-0.02346433699131012,
0.17403367161750793,
-0.2117682248353958,
-0.0019182054093107581,
-0.014060312882065773,
0.1323511004447937,
-0.009976967237889767,
-0.07594597339630127,
0.006388735491782427,
0.0002807899727486074,
-0.006484955549240112,
-0.10323584079742432,
0.030443057417869568,
0.006126102525740862,
-0.10970821231603622,
-0.0035589176695793867,
-0.11978710442781448,
0.02270374819636345,
0.06861201673746109,
0.0802997425198555,
-0.08112132549285889,
-0.037932801991701126,
-0.06784286350011826,
-0.032565414905548096,
-0.08114002645015717,
-0.0066873012110590935,
0.15680472552776337,
0.02601836621761322,
0.09518498182296753,
-0.05241486802697182,
-0.05913037061691284,
0.03645794466137886,
0.022204823791980743,
-0.044966742396354675,
0.11268773674964905,
0.09797472506761551,
-0.10283014178276062,
0.06811130791902542,
0.06292075663805008,
0.013081834651529789,
0.08214163035154343,
-0.03968735784292221,
-0.09128101915121078,
-0.011432892642915249,
0.016741221770644188,
0.003654476022347808,
0.0834883525967598,
-0.07943639904260635,
0.021893823519349098,
0.052893415093421936,
0.020447921007871628,
0.020395908504724503,
-0.12432634085416794,
-0.0004433313733898103,
0.027232812717556953,
-0.029048746451735497,
-0.028669582679867744,
-0.005002055782824755,
-0.004242647904902697,
0.08545766025781631,
0.05311549827456474,
-0.0038578046951442957,
-0.004594665020704269,
-0.01804407685995102,
-0.09487918764352798,
0.18132200837135315,
-0.081990085542202,
-0.1858091801404953,
-0.11181855946779251,
0.05375201255083084,
-0.042897943407297134,
-0.02547197975218296,
0.025801382958889008,
-0.10152287036180496,
-0.06129462644457817,
-0.08004514873027802,
-0.010233805514872074,
-0.09550613164901733,
0.0038734525442123413,
0.07197465002536774,
0.016452988609671593,
0.07857146859169006,
-0.10205643624067307,
-0.0024673885200172663,
-0.007402177434414625,
-0.05144155025482178,
-0.015274127945303917,
0.004238586872816086,
0.11323992908000946,
0.08917367458343506,
0.018753858283162117,
0.016063008457422256,
-0.023090187460184097,
0.23298294842243195,
-0.09339212626218796,
-0.003867233404889703,
0.12678363919258118,
0.06187523156404495,
0.07068859040737152,
0.10433810204267502,
0.02455117180943489,
-0.0937325581908226,
0.04896903783082962,
0.05410531163215637,
-0.02378149889409542,
-0.2593870162963867,
-0.044078897684812546,
-0.046246860176324844,
-0.07761785387992859,
0.1605428159236908,
0.07126632332801819,
-0.03121182695031166,
0.07113257050514221,
-0.05933818593621254,
0.042062416672706604,
0.004965918604284525,
0.10070163011550903,
0.09964505583047867,
0.05344998463988304,
0.08598900586366653,
-0.02579009346663952,
-0.021691974252462387,
0.06819182634353638,
-0.02317873015999794,
0.2534157335758209,
0.023947544395923615,
0.1683148294687271,
0.013920523226261139,
0.12560626864433289,
-0.017115142196416855,
0.03269415721297264,
0.07654467970132828,
0.0215626060962677,
-0.0003031189553439617,
-0.04740912839770317,
-0.045415475964546204,
0.03354736417531967,
0.051836319267749786,
-0.0009413117077201605,
-0.09177550673484802,
0.04497712478041649,
-0.006524181459099054,
0.26728343963623047,
0.040597837418317795,
-0.2903256118297577,
-0.08370787650346756,
0.007932771928608418,
-0.029078582301735878,
-0.09871121495962143,
0.006866933312267065,
0.02301199920475483,
-0.1616356372833252,
0.07310473918914795,
-0.04239794984459877,
0.10804790258407593,
-0.059648193418979645,
-0.007027159444987774,
0.04332268610596657,
0.10132177174091339,
0.002988688414916396,
0.0963299423456192,
-0.1947944015264511,
0.21488235890865326,
-0.016892319545149803,
0.05670297145843506,
-0.056679315865039825,
0.046703558415174484,
0.0031617353670299053,
0.018963543698191643,
0.14261998236179352,
0.020448220893740654,
-0.11069678515195847,
-0.14788700640201569,
-0.11614829301834106,
0.04461245238780975,
0.1276361346244812,
-0.11146044731140137,
0.0792802944779396,
-0.06189382076263428,
-0.013038797304034233,
0.021756183356046677,
-0.05664198845624924,
-0.20160672068595886,
-0.17630846798419952,
0.046187348663806915,
-0.011881032958626747,
0.018476758152246475,
-0.07911952584981918,
-0.09545108675956726,
-0.0021515581756830215,
0.20218504965305328,
-0.009683587588369846,
-0.045804914087057114,
-0.171514093875885,
0.09621049463748932,
0.14729173481464386,
-0.09184261411428452,
0.03814567252993584,
-0.01887165755033493,
0.17238245904445648,
0.03140823170542717,
-0.05510544776916504,
0.07428758591413498,
-0.08820875734090805,
-0.15735884010791779,
-0.035540685057640076,
0.14724300801753998,
0.03254488483071327,
0.05551433190703392,
0.0210171677172184,
0.03079976700246334,
-0.017212020233273506,
-0.11610887944698334,
0.0488349124789238,
0.06534275412559509,
6.522088646931934e-9,
0.052911557257175446,
-0.057647328823804855,
0.04961545392870903,
-0.06682046502828598,
-0.027809856459498405,
0.12468256056308746,
0.2514638602733612,
-0.08353409916162491,
0.062453825026750565,
0.03672316297888756,
-0.07162129878997803,
-0.16046024858951569,
-0.012246381491422653,
0.1309340000152588,
-0.0011006381828337908,
0.052958693355321884,
-0.23239685595035553,
0.08645755052566528,
0.08718705922365189,
-0.02853470668196678,
0.006523594725877047,
-0.26327478885650635,
-0.11387206614017487,
0.07822086662054062,
0.078896664083004,
0.042899344116449356,
-0.12918464839458466,
-0.07090425491333008,
-0.06241108104586601,
-0.12547942996025085,
0.10759033262729645,
0.03686431422829628,
0.09588895738124847,
-0.018870504572987556,
0.039650823920965195,
0.03414340317249298,
-0.022203519940376282,
0.16138246655464172,
0.0348982959985733,
0.02484756149351597,
-0.04515307396650314,
0.07719312608242035,
0.01789456605911255,
-0.06521959602832794,
0.07597009837627411,
-0.026033224537968636,
0.07021492719650269,
-0.16287648677825928,
-0.03420432657003403,
-0.04558980464935303,
0.06800077110528946,
-0.05691387876868248,
-0.025795279070734978,
-0.04214141145348549,
0.04676683619618416,
0.09196247905492783,
-0.0033708640839904547,
0.09969109296798706,
0.03310587257146835,
0.0502806194126606,
0.021143563091754913,
0.10153385996818542,
0.05146731063723564,
-0.1473487913608551,
-0.03009968250989914,
-0.020920217037200928,
0.04639348387718201,
-0.10913968086242676,
0.02451997809112072,
0.11089593917131424,
0.040393076837062836,
0.10488193482160568,
0.025555841624736786,
-0.08203443884849548,
-0.003495604731142521,
0.03863197937607765,
-0.05311789736151695,
-0.23199379444122314,
-0.03145705908536911,
0.045069415122270584,
-0.1701909601688385,
-0.02952171117067337,
0.09511781483888626,
-0.05473627522587776,
-0.039614204317331314,
-0.019314700737595558,
0.03352697193622589,
0.026340030133724213,
0.14562608301639557,
0.025914838537573814,
0.08146493881940842,
-0.07727158069610596,
0.10016915947198868,
0.08539620786905289,
-0.06839562952518463,
0.05993230268359184,
0.07223238050937653,
-0.07777994871139526,
-0.02543451637029648,
0.057762738317251205,
0.03194037452340126,
-0.05081774294376373,
-0.027539687231183052,
-0.05767097696661949,
-0.0931975394487381,
0.06571336090564728,
-0.03690623492002487,
0.03133491054177284,
-0.019451262429356575,
0.015960317105054855,
0.010180500335991383,
-0.12537816166877747,
0.06750179082155228,
0.05382228270173073,
0.06412242352962494,
-0.10936159640550613,
-0.03869330883026123,
0.03459965065121651,
0.053257789462804794,
-0.005016192328184843,
-0.0013733069645240903,
-0.0959276407957077,
-0.03264207765460014,
-0.07082606852054596,
-0.005269638262689114,
-0.03722281754016876,
0.005743466783314943,
-0.03463003784418106,
-0.05601843446493149,
-0.0412864163517952,
0.05314195156097412,
-0.06545884162187576,
-0.1170564517378807,
-0.013013269752264023,
0.09770873188972473,
-0.1219550147652626,
0.016520921140909195,
0.0692593976855278,
-0.12066785246133804,
0.09328123182058334,
0.042361706495285034,
0.043934017419815063,
0.014597869478166103,
-0.09617675840854645,
-0.004002733156085014,
0.0018443905282765627,
0.029634365811944008,
0.031626708805561066,
-0.14043280482292175,
0.00794821884483099,
-0.02055373042821884,
0.018528228625655174,
-0.007246905937790871,
-0.013010947033762932,
-0.1340266317129135,
-0.05659782141447067,
-0.08069642633199692,
-0.05405653268098831,
-0.048372216522693634,
0.044696517288684845,
0.05294647440314293,
0.03933393582701683,
0.14078772068023682,
-0.05205412954092026,
0.05153762549161911,
-0.22068318724632263,
-0.017588015645742416,
-0.019843393936753273,
-0.006844555959105492,
-0.03548138588666916,
-0.040552347898483276,
0.09020571410655975,
-0.009851701557636261,
0.1179070770740509,
-0.02348918467760086,
0.10208716243505478,
0.040912218391895294,
0.00895332358777523,
-0.005009409040212631,
0.0024090022780001163,
0.16951093077659607,
0.0991494208574295,
0.004535652231425047,
0.1300927847623825,
-0.027290640398859978,
0.055773403495550156,
0.07704474031925201,
0.14456860721111298,
0.16478212177753448,
0.015321559272706509,
0.057122036814689636,
0.04366924986243248,
-0.10313226282596588,
-0.12928012013435364,
0.12327684462070465,
-0.016353651881217957,
0.11267154663801193,
-0.03239276632666588,
0.1410772204399109,
0.09049055725336075,
-0.14774659276008606,
0.046244509518146515,
-0.04942768067121506,
-0.08306889235973358,
-0.1203402578830719,
-0.08367772400379181,
-0.10050580650568008,
-0.1325255036354065,
0.023800212889909744,
-0.10391215980052948,
0.011335758492350578,
0.0777292549610138,
0.006855243816971779,
0.014587821438908577,
0.10262856632471085,
0.009394630789756775,
-0.025256521999835968,
0.07465685904026031,
-0.009841273538768291,
-0.023965055122971535,
-0.025671225041151047,
-0.06730112433433533,
0.05075104534626007,
0.015141687355935574,
0.1024019792675972,
-0.025005323812365532,
-0.03206063061952591,
0.06505525857210159,
-0.018001023679971695,
-0.0977434441447258,
0.006994376424700022,
0.009137674234807491,
-0.00031328259501606226,
0.04775101691484451,
0.0432068333029747,
-0.0014474914642050862,
-0.04044374078512192,
0.2672644555568695,
-0.016569821164011955,
-0.009285122156143188,
-0.11917944252490997,
0.1399850994348526,
0.0474388524889946,
-0.019414057955145836,
0.047034334391355515,
-0.08872642368078232,
-0.011816979385912418,
0.12270976603031158,
0.09222831577062607,
-0.013333906419575214,
-0.02584725245833397,
0.0054686241783201694,
-0.01273468229919672,
-0.013191853649914265,
0.10741272568702698,
0.1059613823890686,
-0.013413471169769764,
-0.05219249054789543,
0.011723936535418034,
-0.013528239913284779,
-0.05727643892168999,
-0.08569338172674179,
0.0655069574713707,
0.03591129183769226,
0.01605336368083954,
-0.04034632071852684,
0.11614527553319931,
-0.021279258653521538,
-0.14496859908103943,
0.006475942209362984,
-0.12318249046802521,
-0.17585128545761108,
-0.02636176161468029,
0.01748981885612011,
0.028466882184147835,
0.06940999627113342,
0.008627143688499928,
-0.02202669158577919,
0.153597891330719,
-0.0021690509747713804,
-0.07433715462684631,
-0.09229691326618195,
0.06407138705253601,
-0.024284085258841515,
0.2563626766204834,
0.006370215211063623,
0.03932449594140053,
0.10778467357158661,
0.011475600302219391,
-0.1710769236087799,
0.006417240481823683,
0.0766848549246788,
-0.02212078683078289,
0.08216454833745956,
0.15966282784938812,
-0.04003841429948807,
0.04975106567144394,
0.053481701761484146,
-0.09682659804821014,
-0.02750554494559765,
-0.08359368145465851,
0.009644619189202785,
-0.08585380762815475,
0.04395904392004013,
-0.07450791448354721,
0.1480002999305725,
0.18773634731769562,
-0.0725022628903389,
-0.04687810316681862,
-0.06871397793292999,
0.030415529385209084,
0.024139102548360825,
0.12962117791175842,
-0.015576542355120182,
-0.19096331298351288,
0.00952574796974659,
0.013881026767194271,
0.04514272138476372,
-0.20386171340942383,
-0.08888708800077438,
0.015272240154445171,
-0.044852402061223984,
-0.0331578254699707,
0.11867692321538925,
0.030819596722722054,
0.00007364380144281313,
-0.04566477984189987,
-0.15891514718532562,
-0.039177194237709045,
0.14973431825637817,
-0.13923296332359314,
-0.04196726530790329
] |
null | null | transformers | <!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-et-lm-1B
This model was fine-tuned on the Estonian (et) subset of mozilla-foundation/common_voice_8_0 using the train+other+validation splits.
It achieves the following results on the test set:
(Loss reported at the last evaluation step, step 2000 of 2040, during training)
- Loss: 0.2150
- Wer: 0.2012
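For quick inference, the model can be loaded with the ASR pipeline. This is a minimal usage sketch: the repository id is taken from this model repo, while the audio path is a placeholder for a 16 kHz mono recording.

```python
from transformers import pipeline

# Load the fine-tuned Estonian model from this repository
asr = pipeline("automatic-speech-recognition", model="RASMUS/wav2vec2-xlsr-1b-et")

# "sample_et.wav" is a placeholder path; Common Voice clips should be resampled to 16 kHz
result = asr("sample_et.wav")
print(result["text"])
```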
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.00005
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 1
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
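A minimal sketch of how these hyperparameters might map onto `TrainingArguments` is shown below; the numeric values are copied from the list, while `output_dir` and `group_by_length` are illustrative assumptions rather than the exact script used.

```python
from transformers import TrainingArguments

# Sketch of the configuration listed above; output_dir and group_by_length are assumptions.
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-et-lm-1B",
    learning_rate=0.00005,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=1,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,             # Native AMP mixed-precision training
    group_by_length=True,  # assumption: commonly used when fine-tuning wav2vec2 models
)
```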
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0
| {"language": "et", "tags": ["generated_from_trainer", "mozilla-foundation/common_voice_8_0", "audio", "automatic-speech-recognition", "speech", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer", "cer"], "model-index": [{"name": "XLS-R 1B Wav2Vec2 Estonian by Rasmus Toivanen", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "et"}, "metrics": [{"type": "wer", "value": 20.12, "name": "Test WER"}, {"type": "cer", "value": 3.82, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "et"}, "metrics": [{"type": "wer", "value": 40.77, "name": "Test WER"}, {"type": "cer", "value": 12.32, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "et"}, "metrics": [{"type": "wer", "value": 41.97, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RASMUS/wav2vec2-xlsr-1b-et | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"mozilla-foundation/common_voice_8_0",
"audio",
"speech",
"robust-speech-event",
"hf-asr-leaderboard",
"et",
"dataset:mozilla-foundation/common_voice_8_0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"et"
] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_8_0 #audio #speech #robust-speech-event #hf-asr-leaderboard #et #dataset-mozilla-foundation/common_voice_8_0 #model-index #endpoints_compatible #region-us
|
# wav2vec2-xlsr-et-lm-1B
This model was fine-tuned on the Estonian (et) subset of mozilla-foundation/common_voice_8_0 using the train+other+validation splits.
It achieves the following results on the test set:
(Loss reported at the last evaluation step, step 2000 of 2040, during training)
- Loss: 0.2150
- Wer: 0.2012
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.00005
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 1
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0
| [
"# wav2vec2-xlsr-et-lm-1B\n\nThis model was finetuned with mozilla_foundation/common_voice_8_0 et with train+other+validation splits.\nIt achieves the following results on the test set:\n(Loss reported with last eval step at step 2000/2040 during training)\n- Loss: 0.2150 \n- Wer: 0.2012",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.00005\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 1\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_8_0 #audio #speech #robust-speech-event #hf-asr-leaderboard #et #dataset-mozilla-foundation/common_voice_8_0 #model-index #endpoints_compatible #region-us \n",
"# wav2vec2-xlsr-et-lm-1B\n\nThis model was finetuned with mozilla_foundation/common_voice_8_0 et with train+other+validation splits.\nIt achieves the following results on the test set:\n(Loss reported with last eval step at step 2000/2040 during training)\n- Loss: 0.2150 \n- Wer: 0.2012",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.00005\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 1\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
113,
89,
6,
12,
8,
3,
141,
4,
38
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_8_0 #audio #speech #robust-speech-event #hf-asr-leaderboard #et #dataset-mozilla-foundation/common_voice_8_0 #model-index #endpoints_compatible #region-us \n# wav2vec2-xlsr-et-lm-1B\n\nThis model was finetuned with mozilla_foundation/common_voice_8_0 et with train+other+validation splits.\nIt achieves the following results on the test set:\n(Loss reported with last eval step at step 2000/2040 during training)\n- Loss: 0.2150 \n- Wer: 0.2012## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.00005\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 1\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 10\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
-0.07720434665679932,
0.17329658567905426,
-0.005314873531460762,
0.0411718487739563,
0.12210500240325928,
-0.004158522933721542,
0.05548492819070816,
0.15938515961170197,
-0.07927711308002472,
0.12474995851516724,
0.02764916978776455,
0.05132296308875084,
0.0930299386382103,
0.09320192039012909,
0.004652605392038822,
-0.20952710509300232,
-0.006364268250763416,
-0.057580817490816116,
-0.009772495366632938,
0.09349697083234787,
0.11649157851934433,
-0.08802102506160736,
0.0194853525608778,
0.0007695130188949406,
-0.0620160810649395,
0.027692558243870735,
-0.06822601705789566,
-0.056495703756809235,
0.0712117850780487,
0.028189411386847496,
0.022498831152915955,
0.02371220290660858,
0.08296813815832138,
-0.31141650676727295,
-0.0029869566205888987,
0.0894460454583168,
0.043669793754816055,
0.062091998755931854,
0.10207924246788025,
-0.0396713949739933,
0.03792760521173477,
-0.13912105560302734,
0.08132055401802063,
0.05830850452184677,
-0.07882063090801239,
-0.16982409358024597,
-0.09511513262987137,
0.050203852355480194,
0.09230223298072815,
0.1081734150648117,
-0.036036815494298935,
0.11596531420946121,
-0.07121312618255615,
0.05851835757493973,
0.22899483144283295,
-0.2297111451625824,
-0.038557715713977814,
-0.0028765855822712183,
0.06522221118211746,
0.021237868815660477,
-0.12777459621429443,
0.011465626768767834,
0.0379771925508976,
0.0035672287922352552,
0.06548520177602768,
0.01159578561782837,
-0.010085408575832844,
0.00651584891602397,
-0.10641045868396759,
-0.03440529480576515,
0.12102635949850082,
0.06906642019748688,
-0.03495286777615547,
-0.1728426069021225,
0.0027597222942858934,
-0.12472863495349884,
-0.02034289762377739,
-0.020823905244469643,
0.02018730901181698,
-0.03899195045232773,
-0.05775146558880806,
0.0016904263757169247,
-0.0646689310669899,
-0.032435573637485504,
0.06294117867946625,
0.08776963502168655,
0.031045427545905113,
-0.04249053820967674,
0.02447998709976673,
0.09275176376104355,
0.028263388201594353,
-0.1520967036485672,
-0.045729830861091614,
0.009089688770473003,
-0.15387947857379913,
-0.05060885101556778,
-0.022885866463184357,
-0.015096332877874374,
0.023970048874616623,
0.16555087268352509,
0.0023371344432234764,
0.0921173244714737,
0.004918813705444336,
0.0004347972571849823,
0.022164082154631615,
0.1338910460472107,
-0.04901636391878128,
-0.11150450259447098,
-0.058602262288331985,
0.1050366684794426,
-0.018183747306466103,
-0.026681335642933846,
-0.04394739866256714,
0.023804673925042152,
0.09729503095149994,
0.09172233194112778,
0.017973517999053,
-0.005356470122933388,
-0.08692620694637299,
-0.023236911743879318,
-0.013299912214279175,
-0.16368643939495087,
0.0598691962659359,
0.02004542015492916,
-0.06549084931612015,
-0.014376264065504074,
-0.005735569167882204,
0.02296300232410431,
-0.05310339108109474,
0.0846085250377655,
-0.03853919357061386,
-0.002446916652843356,
-0.048751622438430786,
-0.0504055880010128,
0.03918150067329407,
-0.0559530183672905,
-0.01367747038602829,
-0.05025417357683182,
-0.10587465763092041,
-0.07711835950613022,
0.04055298492312431,
-0.09795674681663513,
-0.040539879351854324,
-0.041583091020584106,
-0.029694750905036926,
0.02529301866889,
-0.027482176199555397,
0.13190831243991852,
-0.03741927072405815,
0.05894042178988457,
-0.02246212027966976,
0.013573236763477325,
0.15087343752384186,
0.06682019680738449,
-0.05320568010210991,
0.04938031733036041,
-0.10816583782434464,
0.13087962567806244,
-0.11700141429901123,
0.018958237022161484,
-0.1741216778755188,
-0.0694495290517807,
-0.013825989328324795,
-0.022526374086737633,
0.09052460640668869,
0.11525449901819229,
-0.183492511510849,
-0.04710681363940239,
0.09575722366571426,
-0.03403868526220322,
-0.051074277609586716,
0.10400143265724182,
-0.02936519868671894,
0.02912575751543045,
0.044925425201654434,
0.16257229447364807,
0.0909193828701973,
-0.13386377692222595,
-0.043418511748313904,
-0.03502478078007698,
0.0752478837966919,
0.13359715044498444,
0.0628013089299202,
-0.07776948809623718,
0.07564099133014679,
0.012984400615096092,
-0.011435038410127163,
-0.012017691507935524,
-0.05579577758908272,
-0.08095378428697586,
0.0048414492048323154,
-0.0638645589351654,
0.029983477666974068,
0.021118048578500748,
-0.006492553278803825,
-0.06555045396089554,
-0.14629340171813965,
0.047223471105098724,
0.1133069396018982,
-0.05119355767965317,
0.02544211968779564,
-0.09551090002059937,
0.01625807024538517,
-0.015453541651368141,
0.002634746953845024,
-0.18905587494373322,
-0.026132924482226372,
0.055311497300863266,
-0.08883033692836761,
0.03694995865225792,
-0.00536706019192934,
0.05882790684700012,
0.022493362426757812,
-0.025388585403561592,
-0.01715759001672268,
-0.057248715311288834,
-0.0031545739620923996,
-0.05713629350066185,
-0.19933505356311798,
-0.05924788489937782,
-0.02366838976740837,
0.2187374234199524,
-0.18228214979171753,
-0.010825787670910358,
0.04627322033047676,
0.1442977637052536,
0.00796402059495449,
-0.08553856611251831,
0.023649971932172775,
0.028877418488264084,
0.0023294922430068254,
-0.09308788925409317,
0.015000201761722565,
0.002896464429795742,
-0.09846530854701996,
0.0028181003872305155,
-0.16675983369350433,
-0.03629690781235695,
0.06959052383899689,
0.08827202022075653,
-0.11133267730474472,
-0.06310372054576874,
-0.05505947396159172,
-0.04913085326552391,
-0.07590688019990921,
-0.02688552811741829,
0.2221929281949997,
0.04789811372756958,
0.08550132811069489,
-0.06093154847621918,
-0.09460887312889099,
0.0024207141250371933,
0.03751120716333389,
-0.024440523236989975,
0.11422896385192871,
0.04113064706325531,
-0.09888723492622375,
0.05300913006067276,
0.050673700869083405,
0.04311643913388252,
0.12216264754533768,
-0.04207407683134079,
-0.0967116504907608,
-0.04065760225057602,
0.02681725099682808,
0.01876400038599968,
0.10149727761745453,
-0.11860138922929764,
0.007684576325118542,
0.04984234645962715,
0.007144864182919264,
0.018147829920053482,
-0.11647689342498779,
0.015287180431187153,
0.059283144772052765,
-0.03359311819076538,
0.006318750791251659,
-0.03661477193236351,
0.014571599662303925,
0.05889732763171196,
0.02539706788957119,
-0.0017755978042259812,
-0.022797873243689537,
-0.03754386678338051,
-0.08904397487640381,
0.12397820502519608,
-0.11075390130281448,
-0.19169430434703827,
-0.10144025832414627,
-0.00826319307088852,
-0.031918127089738846,
-0.02756357006728649,
0.028286945074796677,
-0.09151468425989151,
-0.07203737646341324,
-0.09173083305358887,
-0.007598640862852335,
-0.06209488958120346,
-0.02156086079776287,
0.07448447495698929,
0.038978494703769684,
0.09479974210262299,
-0.12699522078037262,
0.03178626671433449,
0.0061796060763299465,
-0.0346861258149147,
-0.024988101795315742,
0.05848018452525139,
0.09143086522817612,
0.12432608008384705,
0.0402972549200058,
0.019170287996530533,
-0.042318668216466904,
0.18497484922409058,
-0.1376318335533142,
0.008872468955814838,
0.10169483721256256,
-0.007380297873169184,
0.060445379465818405,
0.13174493610858917,
0.01208590343594551,
-0.08011887222528458,
0.0293282400816679,
0.07183485478162766,
-0.019742297008633614,
-0.2587311863899231,
-0.029313910752534866,
-0.038227569311857224,
-0.0724877268075943,
0.13259758055210114,
0.05759385600686073,
0.037715643644332886,
0.018498463556170464,
-0.05178961530327797,
0.016023611649870872,
0.03144944831728935,
0.08185926824808121,
0.027091722935438156,
0.0340087004005909,
0.09762274473905563,
-0.01120895054191351,
-0.011877702549099922,
0.039891261607408524,
0.008729882538318634,
0.23090510070323944,
0.004478076007217169,
0.17763729393482208,
0.03576737642288208,
0.11845985800027847,
-0.043377477675676346,
0.02736850082874298,
0.020174268633127213,
0.005959988571703434,
0.021251410245895386,
-0.08727331459522247,
-0.021670334041118622,
0.0456516295671463,
0.09540079534053802,
-0.007490196730941534,
-0.05386552959680557,
0.005348606035113335,
0.04160406067967415,
0.2691289782524109,
0.06334429234266281,
-0.20421503484249115,
-0.04060128331184387,
0.020528560504317284,
-0.04726653918623924,
-0.05846258997917175,
-0.01823599450290203,
0.08462244272232056,
-0.13746780157089233,
0.08824652433395386,
-0.018168983981013298,
0.10397215932607651,
-0.08089188486337662,
-0.010995998978614807,
0.028946559876203537,
0.09097827225923538,
0.0060642375610768795,
0.09196104109287262,
-0.18300950527191162,
0.16922752559185028,
0.021610014140605927,
0.10074163973331451,
-0.07366163283586502,
0.06769528985023499,
-0.007107768673449755,
-0.03752830997109413,
0.12618698179721832,
-0.0068131485022604465,
-0.05962550267577171,
-0.14502356946468353,
-0.10173725336790085,
-0.010510031133890152,
0.14517514407634735,
-0.09993100166320801,
0.08920037001371384,
-0.0385529063642025,
-0.028021911159157753,
0.013468533754348755,
-0.069646917283535,
-0.18103443086147308,
-0.1778849959373474,
0.062157370150089264,
-0.009256413206458092,
0.034369539469480515,
-0.07696886360645294,
-0.07917118072509766,
-0.11634338647127151,
0.2330321967601776,
-0.055790528655052185,
-0.027798471972346306,
-0.14442265033721924,
0.05775761231780052,
0.1702621579170227,
-0.04418668523430824,
0.013446670956909657,
0.03462524712085724,
0.16816307604312897,
0.005790890660136938,
-0.02501978911459446,
0.04915184527635574,
-0.05571037530899048,
-0.16347427666187286,
-0.06744043529033661,
0.175132155418396,
0.04427800327539444,
0.06413814425468445,
0.006873720325529575,
0.011573449708521366,
0.02195921167731285,
-0.07356436550617218,
0.04620872065424919,
0.07742850482463837,
0.022212570533156395,
0.044588565826416016,
-0.03126018866896629,
0.0006866019684821367,
-0.10642082244157791,
-0.04956735670566559,
0.11356762796640396,
0.2330866903066635,
-0.07585600018501282,
0.12326391786336899,
0.06218045949935913,
-0.08452966064214706,
-0.1482538878917694,
0.02019045129418373,
0.1239163726568222,
0.019092798233032227,
0.05514610931277275,
-0.17866747081279755,
0.03818954527378082,
0.09344475716352463,
-0.013769985176622868,
0.042137984186410904,
-0.28650885820388794,
-0.1383485645055771,
0.02376614511013031,
0.01695452630519867,
-0.09166102856397629,
-0.13114184141159058,
-0.07614369690418243,
-0.06105981022119522,
-0.1216762438416481,
0.01050394494086504,
-0.025202147662639618,
0.10418224334716797,
0.030013738200068474,
0.0016131934244185686,
0.04250185564160347,
-0.038146670907735825,
0.14688564836978912,
0.05220642313361168,
0.027656685560941696,
-0.04810419678688049,
0.05400531738996506,
0.0717197060585022,
-0.06576816737651825,
0.04616784304380417,
-0.04807443544268608,
0.021121688187122345,
-0.14875079691410065,
-0.03380706533789635,
-0.04333370551466942,
0.03517640754580498,
-0.057490747421979904,
-0.03888308256864548,
-0.03814368695020676,
0.060654520988464355,
0.08845741301774979,
-0.005864112637937069,
0.03179929777979851,
-0.040268559008836746,
0.07719241082668304,
0.13678833842277527,
0.08662046492099762,
0.012943173758685589,
-0.16329598426818848,
0.003917249385267496,
-0.0066232536919415,
0.02222229540348053,
-0.10698891431093216,
0.04380466043949127,
0.11278029531240463,
0.053200773894786835,
0.14079587161540985,
0.0032348507083952427,
-0.12268374115228653,
0.0034655602648854256,
0.03484824299812317,
-0.03440067544579506,
-0.16347636282444,
-0.007878681644797325,
0.04268192499876022,
-0.13351920247077942,
-0.029541581869125366,
0.10851623117923737,
0.0002134625829057768,
-0.01898183487355709,
-0.008065477944910526,
0.03296014666557312,
-0.0011105756275355816,
0.18570682406425476,
-0.009311079978942871,
0.10055907815694809,
-0.08499421179294586,
0.1159110963344574,
0.11541622132062912,
-0.06836003065109253,
0.06435566395521164,
0.04549267143011093,
-0.04223021864891052,
-0.02195640467107296,
0.028660917654633522,
0.07731229066848755,
0.07124535739421844,
-0.022897936403751373,
-0.049163006246089935,
-0.08556505292654037,
0.06734931468963623,
0.005094493273645639,
0.007970103994011879,
-0.011606899090111256,
0.003990152385085821,
0.004820862784981728,
-0.12445969879627228,
0.07614796608686447,
0.07360350340604782,
0.045102011412382126,
-0.09912775456905365,
0.058962929993867874,
0.024262838065624237,
0.008905000053346157,
0.002857161918655038,
-0.03047761879861355,
-0.05780565366148949,
0.0118469949811697,
-0.08292114734649658,
-0.020426081493496895,
-0.05483173951506615,
-0.006691851187497377,
-0.0058026304468512535,
-0.02247493527829647,
-0.028772350400686264,
0.05292113125324249,
-0.07744019478559494,
-0.1072712168097496,
-0.014394917525351048,
0.07664117962121964,
-0.13104365766048431,
-0.005826533772051334,
0.04470859840512276,
-0.1360618770122528,
0.08732346445322037,
0.03548968955874443,
0.021513426676392555,
-0.011966858990490437,
-0.07343057543039322,
-0.03591572865843773,
0.010407261550426483,
0.025086967274546623,
0.04437483474612236,
-0.1580541729927063,
-0.005556958727538586,
-0.0434405617415905,
0.002940800739452243,
0.006259220186620951,
-0.024313082918524742,
-0.1404387503862381,
-0.02815542183816433,
-0.05556773021817207,
-0.050802554935216904,
-0.030234530568122864,
0.051838070154190063,
0.07942627370357513,
0.01794104278087616,
0.1423685997724533,
-0.03972920402884483,
0.08538416028022766,
-0.21226242184638977,
-0.018978700041770935,
-0.010786623694002628,
-0.0020614590030163527,
-0.0006189474370330572,
-0.019769905135035515,
0.08847767114639282,
-0.03515585884451866,
0.10159548372030258,
-0.02835896983742714,
0.12048836797475815,
0.03894752264022827,
-0.07002705335617065,
-0.0082551846280694,
0.02734173834323883,
0.1299222856760025,
0.07736857235431671,
-0.012191233225166798,
0.06271246075630188,
-0.05094224587082863,
0.06322422623634338,
0.030815202742815018,
0.0610710084438324,
0.18301530182361603,
0.07429029047489166,
0.03513303026556969,
0.07211501151323318,
-0.145600825548172,
-0.14100229740142822,
0.16589178144931793,
-0.05865534022450447,
0.10243357717990875,
-0.045777447521686554,
0.1118222177028656,
0.10312344878911972,
-0.16554497182369232,
0.08510933816432953,
-0.06404303759336472,
-0.08531446009874344,
-0.07711262255907059,
-0.08113572746515274,
-0.08380197733640671,
-0.09863852709531784,
0.04606092348694801,
-0.07007253170013428,
0.05312640219926834,
0.065119668841362,
0.025151992216706276,
0.03422556817531586,
0.0969272032380104,
-0.03175000846385956,
-0.029515733942389488,
0.09756907820701599,
-0.001714712125249207,
-0.007586381398141384,
-0.05141599103808403,
-0.030192149803042412,
0.0791768729686737,
0.019758768379688263,
0.11024782806634903,
-0.00865053292363882,
-0.006449038628488779,
0.03843884542584419,
0.009108416736125946,
-0.09506378322839737,
0.02963605523109436,
-0.018001286312937737,
0.01791185885667801,
0.09158530831336975,
0.055180873721838,
0.023633303120732307,
-0.05036873370409012,
0.23923347890377045,
-0.051349837332963943,
-0.07069767266511917,
-0.14862391352653503,
0.12467797100543976,
0.05541457608342171,
0.02835206314921379,
0.03530532866716385,
-0.1232515200972557,
0.005076051224023104,
0.1365838199853897,
0.12368360161781311,
-0.013127067126333714,
-0.009149420075118542,
0.006507053505629301,
-0.007852486334741116,
-0.04485349357128143,
0.049460891634225845,
0.09017179906368256,
-0.007107762154191732,
-0.03421834111213684,
0.05037718266248703,
0.007788574323058128,
-0.07152818888425827,
-0.0664944127202034,
0.10081609338521957,
-0.006044833455234766,
0.025144003331661224,
-0.023015206679701805,
0.1044275313615799,
-0.005159389227628708,
-0.24765276908874512,
0.040883880108594894,
-0.14432749152183533,
-0.1933489292860031,
-0.010580457746982574,
0.10480847954750061,
0.004257091786712408,
0.058212265372276306,
0.024693716317415237,
-0.017573993653059006,
0.13694080710411072,
0.014248931780457497,
-0.020137717947363853,
-0.0834927037358284,
0.06538106501102448,
-0.06778831779956818,
0.20440958440303802,
-0.013259127736091614,
0.043380655348300934,
0.10322275012731552,
0.030619937926530838,
-0.14045798778533936,
0.025406692177057266,
0.0906294584274292,
-0.08747301250696182,
0.07895474880933762,
0.21920929849147797,
-0.0335836336016655,
0.13463708758354187,
0.07235729694366455,
-0.0736243724822998,
0.010627178475260735,
-0.049058254808187485,
0.02608882449567318,
-0.08610007166862488,
0.028337255120277405,
-0.03847154602408409,
0.13624349236488342,
0.14801950752735138,
-0.052930980920791626,
-0.005329796578735113,
-0.06829200685024261,
-0.0027033304795622826,
0.01755586266517639,
0.12478350102901459,
-0.0344688706099987,
-0.19694513082504272,
0.030873382464051247,
-0.013746255077421665,
0.06246255710721016,
-0.21721558272838593,
-0.10089153796434402,
0.09187403321266174,
-0.07309021055698395,
-0.043728701770305634,
0.11220218986272812,
0.053893789649009705,
0.009522590786218643,
-0.04581836611032486,
-0.19545292854309082,
-0.00733723770827055,
0.13469921052455902,
-0.1534881591796875,
-0.015590159222483635
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-1b-ru
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1352
- Wer: 0.0971
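The WER reported above (and the CER listed in the model metadata) can be computed with the standard metrics from the `datasets` library. This is a minimal sketch with placeholder strings standing in for the model transcriptions and reference texts.

```python
from datasets import load_metric

# Placeholder predictions/references; in practice these come from decoding the test split
predictions = ["пример распознанного текста"]
references = ["пример эталонного текста"]

wer_metric = load_metric("wer")  # both metrics require the jiwer package
cer_metric = load_metric("cer")

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```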
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.5462 | 0.35 | 500 | 0.4027 | 0.3575 |
| 0.498 | 0.69 | 1000 | 0.2588 | 0.2513 |
| 0.4279 | 1.04 | 1500 | 0.2265 | 0.2204 |
| 0.4099 | 1.38 | 2000 | 0.2189 | 0.1979 |
| 0.4688 | 1.73 | 2500 | 0.2100 | 0.1920 |
| 0.2241 | 2.07 | 3000 | 0.1980 | 0.1767 |
| 0.2056 | 2.42 | 3500 | 0.2020 | 0.1683 |
| 0.3423 | 2.76 | 4000 | 0.1862 | 0.1606 |
| 0.2478 | 3.11 | 4500 | 0.1787 | 0.1563 |
| 0.3079 | 3.45 | 5000 | 0.1759 | 0.1555 |
| 0.2477 | 3.8 | 5500 | 0.1713 | 0.1423 |
| 0.1718 | 4.14 | 6000 | 0.1695 | 0.1391 |
| 0.1675 | 4.49 | 6500 | 0.1677 | 0.1372 |
| 0.1631 | 4.83 | 7000 | 0.1652 | 0.1333 |
| 0.1429 | 5.18 | 7500 | 0.1605 | 0.1308 |
| 0.1505 | 5.52 | 8000 | 0.1612 | 0.1245 |
| 0.1385 | 5.87 | 8500 | 0.1487 | 0.1225 |
| 0.1285 | 6.22 | 9000 | 0.1526 | 0.1201 |
| 0.1153 | 6.56 | 9500 | 0.1464 | 0.1172 |
| 0.1159 | 6.91 | 10000 | 0.1505 | 0.1143 |
| 0.1061 | 7.25 | 10500 | 0.1444 | 0.1106 |
| 0.1016 | 7.6 | 11000 | 0.1427 | 0.1075 |
| 0.1125 | 7.94 | 11500 | 0.1386 | 0.1045 |
| 0.0937 | 8.29 | 12000 | 0.1403 | 0.1022 |
| 0.1059 | 8.63 | 12500 | 0.1406 | 0.1022 |
| 0.0857 | 8.98 | 13000 | 0.1372 | 0.0992 |
| 0.0901 | 9.32 | 13500 | 0.1380 | 0.0977 |
| 0.0913 | 9.67 | 14000 | 0.1352 | 0.0971 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0
| {"language": "ru", "tags": ["audio", "automatic-speech-recognition", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "robust-speech-event", "speech"], "datasets": ["mozilla-foundation/common_voice_8_0"], "metrics": ["wer", "cer"], "model-index": [{"name": "XLS-R 1B Wav2Vec2 Russian by Rasmus Toivanen", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "ru"}, "metrics": [{"type": "wer", "value": 10.83, "name": "Test WER"}, {"type": "cer", "value": 2.41, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "ru"}, "metrics": [{"type": "wer", "value": 37.71, "name": "Test WER"}, {"type": "cer", "value": 12.98, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ru"}, "metrics": [{"type": "wer", "value": 31.89, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RASMUS/wav2vec2-xlsr-1b-ru | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"generated_from_trainer",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_8_0",
"robust-speech-event",
"speech",
"ru",
"dataset:mozilla-foundation/common_voice_8_0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"ru"
] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #audio #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #speech #ru #dataset-mozilla-foundation/common_voice_8_0 #model-index #endpoints_compatible #region-us
| wav2vec2-xlsr-1b-ru
===================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1352
* Wer: 0.0971
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 10
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.3
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #audio #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #speech #ru #dataset-mozilla-foundation/common_voice_8_0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
113,
131,
4,
38
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #audio #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #robust-speech-event #speech #ru #dataset-mozilla-foundation/common_voice_8_0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
-0.1272619217634201,
0.14350906014442444,
-0.0040164305828511715,
0.035193685442209244,
0.10958191007375717,
0.0014769304543733597,
0.12147525697946548,
0.12549778819084167,
-0.06388498842716217,
0.12450619786977768,
0.07334374636411667,
0.09095851331949234,
0.08585664629936218,
0.13789819180965424,
-0.012014959007501602,
-0.2867072820663452,
0.0348285436630249,
-0.042261168360710144,
-0.07656056433916092,
0.09852208197116852,
0.07496048510074615,
-0.09728872776031494,
0.021309684962034225,
0.0007952714804559946,
-0.10430064797401428,
0.006181663367897272,
-0.03661500662565231,
-0.0729278102517128,
0.07389285415410995,
0.02809903398156166,
0.018213586881756783,
0.040731169283390045,
0.09188562631607056,
-0.2740655243396759,
0.016109196469187737,
0.047799188643693924,
0.028660155832767487,
0.048428189009428024,
0.10593070834875107,
-0.013688424602150917,
0.0720752403140068,
-0.047896869480609894,
0.0350678414106369,
0.05654614418745041,
-0.08413432538509369,
-0.21365994215011597,
-0.07330960780382156,
0.034154124557971954,
0.11325028538703918,
0.10055388510227203,
-0.05424800515174866,
0.026054872199892998,
-0.08406148105859756,
0.08527081459760666,
0.21821098029613495,
-0.2125912457704544,
-0.06514163315296173,
-0.056039731949567795,
0.04609271138906479,
0.06713160872459412,
-0.1093965470790863,
0.00078367511741817,
0.019800113514065742,
0.017137285321950912,
0.06482116132974625,
0.0031576938927173615,
0.03104662522673607,
-0.024052293971180916,
-0.1435297280550003,
-0.03566987067461014,
0.14508342742919922,
0.0866735428571701,
-0.009417485445737839,
-0.11450975388288498,
-0.021367648616433144,
-0.16766752302646637,
-0.058184657245874405,
0.013841735199093819,
0.0190933495759964,
-0.030160430818796158,
-0.08117898553609848,
0.032875172793865204,
-0.03866790235042572,
-0.07134967297315598,
0.06541340053081512,
0.12040378153324127,
0.04213815554976463,
-0.0463537722826004,
0.0012129726819694042,
0.06458607316017151,
0.07726560533046722,
-0.168564110994339,
-0.010896720923483372,
0.035989388823509216,
-0.10495232045650482,
0.0015472035156562924,
-0.0030427295714616776,
0.032955266535282135,
0.05505313724279404,
0.13002079725265503,
0.0010642576962709427,
0.09472665190696716,
0.008733869530260563,
0.01196242030709982,
-0.05994316563010216,
0.13061076402664185,
-0.07417698949575424,
-0.0966717079281807,
-0.04581458494067192,
0.1195421889424324,
-0.000625532993581146,
-0.0240600798279047,
-0.07320574671030045,
0.058976881206035614,
0.08803750574588776,
0.04957566782832146,
-0.004032311961054802,
-0.0023244465701282024,
-0.06724108755588531,
-0.031708184629678726,
-0.033576127141714096,
-0.12378166615962982,
0.042187973856925964,
0.06811722368001938,
-0.05357058718800545,
0.029945366084575653,
-0.04173087328672409,
0.03235460817813873,
-0.050097137689590454,
0.08251968026161194,
-0.05094220116734505,
-0.002900330349802971,
-0.05264898017048836,
-0.09025144577026367,
0.05827005207538605,
-0.08570650219917297,
0.0008198513532988727,
-0.06469574570655823,
-0.0496792197227478,
-0.07198859751224518,
0.040341295301914215,
-0.055586379021406174,
-0.07571037858724594,
-0.11343447118997574,
-0.087375707924366,
0.05065785348415375,
-0.02550588920712471,
0.12089665979146957,
-0.05921004340052605,
0.0847598984837532,
0.012131284922361374,
0.06245991960167885,
0.0825035572052002,
0.09416091442108154,
-0.014737299643456936,
0.04193414747714996,
-0.08097514510154724,
0.12227347493171692,
-0.12168994545936584,
0.033110931515693665,
-0.129452645778656,
-0.09459594637155533,
-0.03129212185740471,
0.01696418598294258,
0.09301165491342545,
0.13253694772720337,
-0.16822975873947144,
-0.12699481844902039,
0.1798098385334015,
-0.04214172810316086,
-0.06302250176668167,
0.1395639330148697,
0.0031950839329510927,
-0.056359753012657166,
0.03746546804904938,
0.22230736911296844,
0.11526478826999664,
-0.09771405905485153,
-0.025551967322826385,
-0.0719306468963623,
0.12158538401126862,
0.030125239863991737,
0.09784036874771118,
-0.05038032308220863,
0.0515027716755867,
-0.002536213956773281,
0.001810855115763843,
0.0618022084236145,
-0.07544170320034027,
-0.08778470754623413,
0.004798531997948885,
-0.08485230058431625,
0.002754573244601488,
0.050800010561943054,
0.007622183300554752,
-0.06654252111911774,
-0.11801457405090332, 0.008251946419477463, 0.09016412496566772, …
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-fi-lm-1B
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the Finnish Common Voice train/dev/other splits.
It achieves the following results on the evaluation set without a language model:
- Loss: 0.1853
- Wer: 0.2205

With language-model decoding:
- Wer: 0.1026
## Model description
More information needed
## Intended uses & limitations
More information needed
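In the absence of documented usage, a minimal inference sketch with the Transformers automatic-speech-recognition pipeline is shown below. The model id comes from this repository; `audio.wav` is a placeholder for a 16 kHz mono Finnish recording, and plain CTC decoding is assumed unless the repository also ships a language-model decoder.

```python
from transformers import pipeline

# Minimal inference sketch; the file name is a placeholder and decoding the
# audio file requires ffmpeg to be available on the system.
asr = pipeline("automatic-speech-recognition", model="RASMUS/wav2vec2-xlsr-fi-lm-1B")
result = asr("audio.wav")
print(result["text"])
```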
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch that mirrors them follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
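As an illustration only, these values map onto `transformers.TrainingArguments` roughly as follows; the output directory is hypothetical, the original training script is not part of this card, and anything not listed above is left at its default (the stated Adam betas and epsilon are the library defaults).

```python
from transformers import TrainingArguments

# Rough reconstruction of the configuration above; a sketch, not the source.
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-fi-lm-1B",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,        # effective train batch size 32
    num_train_epochs=10,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                            # Native AMP mixed precision
)
```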
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.8158 | 0.67 | 400 | 0.4835 | 0.6310 |
| 0.5679 | 1.33 | 800 | 0.4806 | 0.5538 |
| 0.6055 | 2.0 | 1200 | 0.3888 | 0.5083 |
| 0.5353 | 2.67 | 1600 | 0.3258 | 0.4365 |
| 0.4883 | 3.33 | 2000 | 0.3313 | 0.4204 |
| 0.4513 | 4.0 | 2400 | 0.2924 | 0.3904 |
| 0.3753 | 4.67 | 2800 | 0.2593 | 0.3608 |
| 0.3478 | 5.33 | 3200 | 0.2832 | 0.3551 |
| 0.3796 | 6.0 | 3600 | 0.2495 | 0.3402 |
| 0.2556 | 6.67 | 4000 | 0.2342 | 0.3106 |
| 0.229 | 7.33 | 4400 | 0.2181 | 0.2812 |
| 0.205 | 8.0 | 4800 | 0.2041 | 0.2523 |
| 0.1654 | 8.67 | 5200 | 0.2015 | 0.2416 |
| 0.152 | 9.33 | 5600 | 0.1942 | 0.2294 |
| 0.1569 | 10.0 | 6000 | 0.1853 | 0.2205 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
| {"language": ["fi"], "license": "apache-2.0", "tags": ["generated_from_trainer", "automatic-speech-recognition", "robust-speech-event", "hf-asr-leaderboard"], "model-index": [{"name": "wav2vec2-xlsr-fi-lm-1B", "results": []}]} | automatic-speech-recognition | RASMUS/wav2vec2-xlsr-fi-lm-1B | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"robust-speech-event",
"hf-asr-leaderboard",
"fi",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fi"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fi #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-xlsr-fi-lm-1B
======================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the common voice train/dev/other datasets.
It achieves the following results on the evaluation set without language model:
* Loss: 0.1853
* Wer: 0.2205
With language model:
* Wer: 0.1026
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 10
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fi #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
72,
158,
4,
41
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #robust-speech-event #hf-asr-leaderboard #fi #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.12302989512681961, 0.08383636176586151, -0.0038985866121947765, …
] |
null | null | transformers | <!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-fi-train-aug-lm-1B
This model is a wav2vec2 XLS-R (1B) model fine-tuned on the Finnish Common Voice 7.0 dataset with augmented training data.
It achieves the following results on the evaluation set:
- Loss: 0.1499
- Wer: 0.1955
## Model description
More information needed
## Intended uses & limitations
More information needed
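Usage is not documented here, but a hedged sketch of language-model-boosted decoding with `Wav2Vec2ProcessorWithLM` follows. It assumes the repository (`RASMUS/wav2vec2-xlsr-fi-train-aug-bigLM-1B`) actually bundles a pyctcdecode decoder and n-gram language model, that `pyctcdecode` and `kenlm` are installed, and that `audio.wav` is a placeholder for a mono recording.

```python
import torch
import torchaudio
from transformers import AutoModelForCTC, Wav2Vec2ProcessorWithLM

model_id = "RASMUS/wav2vec2-xlsr-fi-train-aug-bigLM-1B"
processor = Wav2Vec2ProcessorWithLM.from_pretrained(model_id)  # assumes an LM decoder is in the repo
model = AutoModelForCTC.from_pretrained(model_id)

# Load a recording and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("audio.wav")  # placeholder path, mono assumed
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000).squeeze().numpy()

inputs = processor(waveform, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# batch_decode runs beam search with the bundled n-gram language model.
print(processor.batch_decode(logits.numpy()).text[0])
```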
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6473 | 0.29 | 400 | 0.2857 | 0.3825 |
| 0.6039 | 0.58 | 800 | 0.2459 | 0.3476 |
| 0.4757 | 0.87 | 1200 | 0.2338 | 0.3274 |
| 0.4473 | 1.15 | 1600 | 0.2246 | 0.3128 |
| 0.4322 | 1.44 | 2000 | 0.1962 | 0.2805 |
| 0.3961 | 1.73 | 2400 | 0.2070 | 0.2797 |
| 0.3642 | 2.02 | 2800 | 0.1790 | 0.2473 |
| 0.3561 | 2.31 | 3200 | 0.1769 | 0.2375 |
| 0.282 | 2.6 | 3600 | 0.1672 | 0.2263 |
| 0.2978 | 2.89 | 4000 | 0.1636 | 0.2192 |
| 0.2722 | 3.17 | 4400 | 0.1637 | 0.2102 |
| 0.2924 | 3.46 | 4800 | 0.1506 | 0.2021 |
| 0.2631 | 3.75 | 5200 | 0.1499 | 0.1955 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
| {"language": "fi", "tags": ["generated_from_trainer", "mozilla-foundation/common_voice_7_0", "audio", "automatic-speech-recognition", "speech"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer", "cer"]} | automatic-speech-recognition | RASMUS/wav2vec2-xlsr-fi-train-aug-bigLM-1B | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"mozilla-foundation/common_voice_7_0",
"audio",
"speech",
"fi",
"dataset:mozilla-foundation/common_voice_7_0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fi"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_7_0 #audio #speech #fi #dataset-mozilla-foundation/common_voice_7_0 #endpoints_compatible #region-us
| wav2vec2-xlsr-fi-train-aug-lm-1B
================================
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1499
* Wer: 0.1955
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 16
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 100
* num\_epochs: 4
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_7_0 #audio #speech #fi #dataset-mozilla-foundation/common_voice_7_0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
87,
158,
4,
41
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_7_0 #audio #speech #fi #dataset-mozilla-foundation/common_voice_7_0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.1366824358701706, 0.14317189157009125, -0.004246931057423353, …
] |
null | null | transformers | <!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xlsr-fi-train-aug-lm-1B
This model is a wav2vec2 XLS-R (1B) model fine-tuned on the Finnish Common Voice 7.0 dataset with augmented training data.
It achieves the following results on the evaluation set:
- Loss: 0.1499
- Wer: 0.1955
## Model description
More information needed
## Intended uses & limitations
More information needed
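The model index for this card reports a test WER of 10.96 and CER of 2.81 on Common Voice 7.0 Finnish. A minimal sketch of how such figures could be approximated with the `datasets` and `evaluate` libraries is shown below; the dataset is gated (authentication assumed), only a small slice is scored for illustration, the `jiwer` backend is required, and the simple lower-casing normalisation is an assumption rather than the author's exact protocol.

```python
import evaluate
from datasets import Audio, load_dataset
from transformers import pipeline

# Assumes access to the gated Common Voice 7.0 dataset (huggingface-cli login).
asr = pipeline("automatic-speech-recognition",
               model="RASMUS/wav2vec2-xlsr-fi-train-aug-lm-1B")
wer_metric = evaluate.load("wer")

test = load_dataset("mozilla-foundation/common_voice_7_0", "fi", split="test[:100]")
test = test.cast_column("audio", Audio(sampling_rate=16_000))

predictions, references = [], []
for sample in test:
    predictions.append(asr(sample["audio"]["array"])["text"].lower())
    references.append(sample["sentence"].lower())

print("WER:", wer_metric.compute(predictions=predictions, references=references))
```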
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6473 | 0.29 | 400 | 0.2857 | 0.3825 |
| 0.6039 | 0.58 | 800 | 0.2459 | 0.3476 |
| 0.4757 | 0.87 | 1200 | 0.2338 | 0.3274 |
| 0.4473 | 1.15 | 1600 | 0.2246 | 0.3128 |
| 0.4322 | 1.44 | 2000 | 0.1962 | 0.2805 |
| 0.3961 | 1.73 | 2400 | 0.2070 | 0.2797 |
| 0.3642 | 2.02 | 2800 | 0.1790 | 0.2473 |
| 0.3561 | 2.31 | 3200 | 0.1769 | 0.2375 |
| 0.282 | 2.6 | 3600 | 0.1672 | 0.2263 |
| 0.2978 | 2.89 | 4000 | 0.1636 | 0.2192 |
| 0.2722 | 3.17 | 4400 | 0.1637 | 0.2102 |
| 0.2924 | 3.46 | 4800 | 0.1506 | 0.2021 |
| 0.2631 | 3.75 | 5200 | 0.1499 | 0.1955 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
| {"language": "fi", "tags": ["generated_from_trainer", "mozilla-foundation/common_voice_7_0", "audio", "automatic-speech-recognition", "speech", "robust-speech-event", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_7_0"], "metrics": ["wer", "cer"], "model-index": [{"name": "XLS-R 1B Wav2Vec2 Finnish by Rasmus Toivanen", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 7", "type": "mozilla-foundation/common_voice_7_0", "args": "fi"}, "metrics": [{"type": "wer", "value": 10.96, "name": "Test WER"}, {"type": "cer", "value": 2.81, "name": "Test CER"}]}]}]} | automatic-speech-recognition | RASMUS/wav2vec2-xlsr-fi-train-aug-lm-1B | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"mozilla-foundation/common_voice_7_0",
"audio",
"speech",
"robust-speech-event",
"hf-asr-leaderboard",
"fi",
"dataset:mozilla-foundation/common_voice_7_0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fi"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_7_0 #audio #speech #robust-speech-event #hf-asr-leaderboard #fi #dataset-mozilla-foundation/common_voice_7_0 #model-index #endpoints_compatible #region-us
| wav2vec2-xlsr-fi-train-aug-lm-1B
================================
This model is a Finnish automatic speech recognition model fine-tuned on the mozilla-foundation/common_voice_7_0 dataset (Finnish).
It achieves the following results on the evaluation set:
* Loss: 0.1499
* Wer: 0.1955
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 16
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 100
* num\_epochs: 4
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_7_0 #audio #speech #robust-speech-event #hf-asr-leaderboard #fi #dataset-mozilla-foundation/common_voice_7_0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
109,
158,
4,
41
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #mozilla-foundation/common_voice_7_0 #audio #speech #robust-speech-event #hf-asr-leaderboard #fi #dataset-mozilla-foundation/common_voice_7_0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.15423421561717987,
0.11838502436876297,
-0.004353526514023542,
0.049508631229400635,
0.10097900032997131,
-0.013895426876842976,
0.0971129983663559,
0.14518649876117706,
-0.08695215731859207,
0.1155422106385231,
0.08048609644174576,
0.07441218942403793,
0.07515570521354675,
0.1064450666308403,
-0.018290530890226364,
-0.30195558071136475,
0.015348821878433228,
-0.034449782222509384,
-0.12691448628902435,
0.09366918355226517,
0.10002218931913376,
-0.09243805706501007,
0.028296908363699913,
0.029981091618537903,
-0.09498731791973114,
0.003331332001835108,
-0.035863976925611496,
-0.043425679206848145,
0.07715334743261337,
0.0545886754989624,
0.03682796284556389,
0.029488371685147285,
0.09551086276769638,
-0.2517731487751007,
0.009994086809456348,
0.05719585344195366,
0.042936746031045914,
0.05874769762158394,
0.120939239859581,
-0.028813781216740608,
0.10001479834318161,
-0.03642464429140091,
0.03462626039981842,
0.0722956508398056,
-0.10300241410732269,
-0.2542262673377991,
-0.07388061285018921,
0.046757593750953674,
0.13060452044010162,
0.0919952467083931,
-0.04989616200327873,
0.005812372080981731,
-0.06398836523294449,
0.10325739532709122,
0.19610688090324402,
-0.20238497853279114,
-0.07448738813400269,
-0.02892243303358555,
0.04643019288778305,
0.03885944187641144,
-0.11342698335647583,
-0.014142055064439774,
0.03319880738854408,
0.01553519070148468,
0.06171995401382446,
0.007075987756252289,
0.014093868434429169,
0.0014817657647654414,
-0.13691478967666626,
-0.045066043734550476,
0.142130509018898,
0.09018296748399734,
-0.02481257915496826,
-0.10635243356227875,
-0.0034257015213370323,
-0.20276539027690887,
-0.0509716160595417,
0.04206187278032303,
0.019840331748127937,
-0.032950181514024734,
-0.0789935514330864,
0.03575531020760536,
-0.055892378091812134,
-0.08665014058351517,
0.06934951990842819,
0.12122456729412079,
0.02989235706627369,
-0.025747444480657578,
0.002822666894644499,
0.09564689546823502,
0.04871087893843651,
-0.17795492708683014,
-0.026056241244077682,
0.046890854835510254,
-0.11609892547130585,
-0.009424868039786816,
-0.018258776515722275,
0.02711339294910431,
0.03901377692818642,
0.11837802827358246,
-0.02488042414188385,
0.09440352767705917,
0.021885188296437263,
0.012468605302274227,
-0.06783637404441833,
0.17234477400779724,
-0.06811327487230301,
-0.08921875804662704,
-0.052984923124313354,
0.133891761302948,
-0.02117246203124523,
-0.007583177648484707,
-0.06652398407459259,
0.04696284234523773,
0.08399218320846558,
0.04230014979839325,
-0.012590489350259304,
0.009014997631311417,
-0.0547526478767395,
-0.025951510295271873,
0.014066529460251331,
-0.11791341006755829,
0.03047448955476284,
0.07047110050916672,
-0.08725413680076599,
-0.000643407111056149,
-0.020771993324160576,
0.02036231942474842,
-0.02758614718914032,
0.08618432283401489,
-0.05316891148686409,
-0.006501432973891497,
-0.06251122802495956,
-0.09004709869623184,
0.03483665734529495,
-0.03029351681470871,
-0.002079515717923641,
-0.04352777078747749,
-0.09773974865674973,
-0.08431820571422577,
0.050240032374858856,
-0.05569954961538315,
-0.07686686515808105,
-0.09973631799221039,
-0.09548363089561462,
0.05773242935538292,
-0.016563655808568,
0.15022428333759308,
-0.05663200095295906,
0.08810772001743317,
0.04053372144699097,
0.04483911022543907,
0.10192593932151794,
0.0746014192700386,
-0.014548014849424362,
0.05592077597975731,
-0.12821123003959656,
0.12298853695392609,
-0.11381056904792786,
0.06879076361656189,
-0.12688671052455902,
-0.10620652884244919,
-0.02404150553047657,
0.01720917411148548,
0.09136039763689041,
0.13243532180786133,
-0.17475482821464539,
-0.11229456961154938,
0.16570599377155304,
-0.04560350999236107,
-0.08661788702011108,
0.1366155743598938,
-0.010027455165982246,
-0.035197481513023376,
0.04073003679513931,
0.196559339761734,
0.15326151251792908,
-0.0935152992606163,
-0.0069925193674862385,
-0.053610771894454956,
0.13681578636169434,
0.06131034344434738,
0.08871158957481384,
-0.059964582324028015,
0.040464892983436584,
-0.0031447075307369232,
-0.010748273693025112,
0.0589134506881237,
-0.08384086191654205,
-0.08441099524497986,
-0.00245187827385962,
-0.09907101839780807,
-0.009841619990766048,
0.056959401816129684,
0.026689548045396805,
-0.06856147199869156,
-0.12651439011096954,
-0.004698373842984438,
0.11088632792234421,
-0.11106371134519577,
0.0075457398779690266,
-0.08267343044281006,
0.06751064211130142,
-0.024658728390932083,
-0.00829546432942152,
-0.15520420670509338,
0.014218530617654324,
0.03409341722726822,
-0.06401298940181732,
0.029002202674746513,
-0.035722095519304276,
0.07784081995487213,
0.04192134737968445,
-0.0345129631459713,
-0.0727754607796669,
-0.018036842346191406,
-0.020172785967588425,
-0.04505161568522453,
-0.23264354467391968,
-0.06917991489171982,
-0.02056088112294674,
0.19589608907699585,
-0.22002284228801727,
0.00043446282506920397,
0.050459351390600204,
0.11553812772035599,
0.03655903786420822,
-0.06918662786483765,
0.024963634088635445,
0.05696559697389603,
-0.010927722789347172,
-0.06836967915296555,
0.014182322658598423,
0.011702556163072586,
-0.100519098341465,
0.014089194126427174,
-0.14409606158733368,
0.07890493422746658,
0.07672781497240067,
0.022110244259238243,
-0.08398710936307907,
-0.038008783012628555,
-0.05947849154472351,
-0.05369866266846657,
-0.014475928619503975,
-0.02047303132712841,
0.17510153353214264,
0.015683120116591454,
0.1098594069480896,
-0.07807502895593643,
-0.05827462300658226,
0.04497261717915535,
0.01667945645749569,
0.005987206008285284,
0.1578531563282013,
0.0300806425511837,
-0.027695370838046074,
0.08336101472377777,
-0.0064510684460401535,
-0.06499233096837997,
0.18529757857322693,
-0.0904570147395134,
-0.10232938826084137,
-0.03318479284644127,
0.03030412457883358,
0.030253905802965164,
0.1181020587682724,
-0.17469926178455353,
-0.031472109258174896,
0.016812708228826523,
0.024207282811403275,
0.020611751824617386,
-0.1847398430109024,
-0.00012965424684807658,
0.039277635514736176,
-0.09504126012325287,
-0.04399615526199341,
0.013730844482779503,
-0.025837035849690437,
0.06972579658031464,
-0.002599637024104595,
-0.08428774774074554,
-0.04601394385099411,
-0.04694925993680954,
-0.09843940287828445,
0.1637352854013443,
-0.07879544794559479,
-0.11780582368373871,
-0.14263123273849487,
-0.02633844129741192,
-0.027853401377797127,
-0.0035483790561556816,
0.04137082397937775,
-0.10825681686401367,
-0.028605148196220398,
-0.06902902573347092,
0.03815782070159912,
-0.04820888862013817,
0.014902625232934952,
0.027122970670461655,
0.0055671799927949905,
0.07618251442909241,
-0.1025487557053566,
0.023386536166071892,
-0.010225878097116947,
-0.011463194154202938,
-0.016519954428076744,
0.013272718526422977,
0.09362354129552841,
0.1843704730272293,
0.0867684856057167,
0.05462715029716492,
-0.02271907404065132,
0.21627750992774963,
-0.15472301840782166,
0.005480614025145769,
0.10328757017850876,
-0.0070044067688286304,
0.036417048424482346,
0.1538519710302353,
0.05384748429059982,
-0.07457287609577179,
0.014110933989286423,
0.048940256237983704,
-0.01898025907576084,
-0.22425714135169983,
-0.018733875826001167,
-0.08416659384965897,
-0.013276522047817707,
0.09426980465650558,
0.028643252328038216,
0.03350081667304039,
0.018005935475230217,
-0.01755823940038681,
-0.01632046513259411,
0.05197518318891525,
0.03891229256987572,
0.06727422773838043,
0.04332536831498146,
0.11536629498004913,
-0.015775159001350403,
-0.042384419590234756,
0.016519678756594658,
-0.011648242361843586,
0.2351626455783844,
0.01438202802091837,
0.18748272955417633,
0.050230346620082855,
0.12028882652521133,
-0.0008398149511776865,
0.052353620529174805,
0.004235304892063141,
-0.014093957841396332,
0.04199603199958801,
-0.06232183799147606,
0.0011478991946205497,
0.030239272862672806,
0.13582904636859894,
0.02098747342824936,
-0.10816366225481033,
0.0010182270780205727,
0.010370677337050438,
0.3581218719482422,
0.11041679233312607,
-0.29211390018463135,
-0.09328042715787888,
0.015395527705550194,
-0.07688344269990921,
-0.03631635382771492,
0.04225839674472809,
0.11355385184288025,
-0.07093510031700134,
0.08093041181564331,
-0.03968512639403343,
0.1019759476184845,
-0.06644363701343536,
0.0016569411382079124,
0.09340845048427582,
0.07923823595046997,
-0.0029713488183915615,
0.04839783161878586,
-0.2142103910446167,
0.2709577977657318,
-0.004127017222344875,
0.08648824691772461,
-0.02636469341814518,
0.049949634820222855,
0.04351430758833885,
-0.027607396245002747,
0.08385705947875977,
-0.0056144059635698795,
-0.11236246675252914,
-0.1783493012189865,
-0.09801648557186127,
0.00819273479282856,
0.13241145014762878,
-0.07073413580656052,
0.11897054314613342,
-0.028161106631159782,
-0.05385846644639969,
0.03724772110581398,
-0.0891318991780281,
-0.12046439945697784,
-0.10539816319942474,
0.03693034499883652,
0.03544473275542259,
0.09743480384349823,
-0.07467855513095856,
-0.087705098092556,
-0.06779813021421432,
0.12258188426494598,
-0.11394508928060532,
-0.020152991637587547,
-0.1218729317188263,
0.038188524544239044,
0.17764896154403687,
-0.05957962945103645,
0.03841986879706383,
0.02241472341120243,
0.13215291500091553,
0.03929290175437927,
-0.0049873171374201775,
0.09429793804883957,
-0.08260975033044815,
-0.19729791581630707,
-0.05026635527610779,
0.20016391575336456,
0.04149676114320755,
0.0706440880894661,
-0.0289300624281168,
0.030609916895627975,
-0.010653474368155003,
-0.07666719704866409,
0.08144386112689972,
0.016770724207162857,
-0.025172997266054153,
0.036588866263628006,
-0.0131458081305027,
0.01653447188436985,
-0.09036999940872192,
-0.0529911182820797,
0.09093260020017624,
0.2584240138530731,
-0.0738464891910553,
0.03070780634880066,
0.04071047902107239,
-0.0497874841094017,
-0.13751530647277832,
0.00879894383251667,
0.16456057131290436,
0.05334720388054848,
-0.05211375653743744,
-0.21477681398391724,
0.025276241824030876,
0.04685629531741142,
-0.02590261399745941,
0.12134313583374023,
-0.3227905333042145,
-0.14189687371253967,
0.10404937714338303,
0.025701450183987617,
-0.059441063553094864,
-0.1607755720615387,
-0.07582367211580276,
-0.02030441351234913,
-0.09167458862066269,
0.029524745419621468,
0.004269572906196117,
0.1224253922700882,
0.023562071844935417,
0.05042170733213425,
0.02287573181092739,
-0.03875519335269928,
0.14192673563957214,
0.0317358635365963,
0.03503759950399399,
0.0038633844815194607,
0.019895533099770546,
-0.01601279154419899,
-0.06988631188869476,
0.0227789506316185,
-0.07213107496500015,
0.021351665258407593,
-0.14746612310409546,
-0.03357568383216858,
-0.09476658701896667,
0.004500334616750479,
-0.03911777585744858,
-0.011080287396907806,
-0.014505015686154366,
0.03663060441613197,
0.07651180773973465,
0.023680057376623154,
0.10814589262008667,
-0.08808231353759766,
0.1282537430524826,
0.1174132227897644,
0.10348831117153168,
0.01691008172929287,
-0.11329784244298935,
-0.001092847203835845,
-0.003293941030278802,
0.0366930328309536,
-0.12736253440380096,
0.04022001847624779,
0.14452989399433136,
0.05766603723168373,
0.1584312468767166,
0.05312671884894371,
-0.08307327330112457,
0.004880716558545828,
0.06442438811063766,
-0.05991161987185478,
-0.12709106504917145,
-0.04108224809169769,
-0.0026983122806996107,
-0.13521254062652588,
-0.01034958753734827,
0.10758823901414871,
-0.03880801424384117,
0.008643390610814095,
0.010933040641248226,
0.03517390042543411,
-0.0584559366106987,
0.24678128957748413,
0.04107481613755226,
0.09804563224315643,
-0.10246410220861435,
0.06212889775633812,
0.04330020770430565,
-0.10601427406072617,
0.023903202265501022,
0.08873770385980606,
-0.026018623262643814,
-0.01197728980332613,
-0.012468330562114716,
0.08878317475318909,
0.03599008172750473,
-0.054267968982458115,
-0.11942930519580841,
-0.15244978666305542,
0.09693251550197601,
0.09376432001590729,
0.01603558473289013,
0.03388819098472595,
-0.03899306058883667,
0.049138471484184265,
-0.09413136541843414,
0.10813454538583755,
0.12001049518585205,
0.06201136112213135,
-0.1379866898059845,
0.11905655264854431,
-0.009004413150250912,
-0.0038039616774767637,
0.008289951831102371,
-0.022176459431648254,
-0.10462277382612228,
0.03547291085124016,
-0.11908914893865585,
-0.014382198452949524,
-0.04801824316382408,
0.0009725784766487777,
0.015624870546162128,
-0.04925621673464775,
-0.06918096542358398,
0.02365569956600666,
-0.12542398273944855,
-0.04090121015906334,
-0.025185834616422653,
0.06425777077674866,
-0.10106943547725677,
-0.006394617725163698,
0.03547937050461769,
-0.13335318863391876,
0.08761955052614212,
0.05899066850543022,
-0.000024988014047266915,
0.014107910916209221,
-0.08122065663337708,
-0.007996167987585068,
0.03291809931397438,
-0.002150769578292966,
0.02738582342863083,
-0.175092875957489,
-0.00300598400644958,
-0.03405052423477173,
0.026396283879876137,
-0.019291603937745094,
-0.016049887984991074,
-0.10258325934410095,
-0.0077114226296544075,
-0.03290791064500809,
-0.03790391981601715,
-0.050243884325027466,
0.06828197836875916,
0.06952854245901108,
0.029591642320156097,
0.14590555429458618,
-0.07199942320585251,
0.05708577111363411,
-0.23373883962631226,
0.007173742633312941,
0.00042292536818422377,
-0.07257197052240372,
-0.04738133028149605,
-0.023301582783460617,
0.1083865836262703,
-0.06428880989551544,
0.06568550318479538,
-0.03581434488296509,
0.050173692405223846,
0.020036034286022186,
-0.10150352865457535,
0.03654967620968819,
0.061762258410453796,
0.16284294426441193,
0.05365874618291855,
-0.02513575367629528,
0.06902813911437988,
-0.019546229392290115,
0.05932782590389252,
0.14043955504894257,
0.12904229760169983,
0.1243051290512085,
0.035831041634082794,
0.08273417502641678,
0.10184210538864136,
-0.1334804743528366,
-0.11140578240156174,
0.1325090527534485,
-0.07423605024814606,
0.14854289591312408,
-0.026500683277845383,
0.19878625869750977,
0.09378990530967712,
-0.1935817152261734,
0.076238252222538,
-0.045661572366952896,
-0.09664353728294373,
-0.11009374260902405,
-0.08242286741733551,
-0.07776278257369995,
-0.1737392693758011,
0.025983814150094986,
-0.11069540679454803,
0.0890587642788887,
0.024027975276112556,
0.05254290625452995,
0.03352193906903267,
0.12304472178220749,
0.0387241505086422,
-0.0026566486340016127,
0.13971953094005585,
-0.005025104153901339,
-0.024280166253447533,
-0.04326346144080162,
-0.08224081248044968,
0.07493621110916138,
-0.047638457268476486,
0.06554161012172699,
-0.04635469615459442,
-0.12601414322853088,
0.06523558497428894,
0.01635088212788105,
-0.10222968459129333,
0.03045041114091873,
-0.019629573449492455,
0.06923618912696838,
0.0867406576871872,
0.041639089584350586,
-0.0023589704651385546,
-0.0036722468212246895,
0.21454943716526031,
-0.10677073895931244,
-0.05065456032752991,
-0.1467333287000656,
0.18233539164066315,
-0.006688390392810106,
0.004830644465982914,
0.010180147364735603,
-0.0790180191397667,
-0.019943643361330032,
0.1837441325187683,
0.1471826285123825,
-0.022601451724767685,
-0.030635153874754906,
0.013817023485898972,
-0.006735003553330898,
-0.033674951642751694,
0.06108756735920906,
0.10141759365797043,
0.09955190122127533,
-0.028343768790364265,
-0.020359691232442856,
-0.019039956852793694,
-0.07564054429531097,
-0.007503022905439138,
0.10271920263767242,
0.01530266273766756,
-0.013175662606954575,
-0.022678637877106667,
0.13036850094795227,
-0.11826545745134354,
-0.1411771923303604,
0.00894596055150032,
-0.17087584733963013,
-0.18403221666812897,
-0.04768470674753189,
0.04145508259534836,
0.05979132652282715,
0.05326671898365021,
-0.012114406563341618,
-0.04570760205388069,
0.1140112429857254,
0.0028611798770725727,
-0.03230349346995354,
-0.1190616562962532,
0.06220335140824318,
-0.13581305742263794,
0.16518224775791168,
-0.045410651713609695,
0.033239953219890594,
0.10975699126720428,
0.05391077324748039,
-0.06593327224254608,
0.02520265430212021,
0.08426779508590698,
-0.16213291883468628,
0.03344764932990074,
0.23643143475055695,
-0.043183326721191406,
0.14555345475673676,
0.03076556697487831,
-0.08079873770475388,
0.02503550797700882,
-0.06470023840665817,
-0.03485506400465965,
-0.0818183571100235,
-0.017251692712306976,
-0.042153697460889816,
0.12669344246387482,
0.205583393573761,
-0.07677537202835083,
-0.014009617269039154,
-0.037975385785102844,
0.01622592657804489,
0.03696874901652336,
0.12325051426887512,
-0.04989023506641388,
-0.28438466787338257,
0.023108361288905144,
-0.02225598506629467,
0.007609337568283081,
-0.2203056663274765,
-0.07543694972991943,
0.030627213418483734,
-0.05777620151638985,
-0.04004507511854172,
0.11979512870311737,
0.0695643499493599,
0.04910673201084137,
-0.05149086192250252,
-0.08155261725187302,
-0.01986626908183098,
0.1877140998840332,
-0.17529049515724182,
-0.06652211397886276
] |
null | null | transformers |
# Harry Potter DialoGPT Model | {"tags": ["conversational"]} | text-generation | RAhul03/DialoGPT-small-harrypotter | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Harry Potter DialoGPT Model | [
"# Harry Potter DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Harry Potter DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model"
] | [
-0.0009023238671943545,
0.07815738022327423,
-0.006546166725456715,
0.07792752981185913,
0.10655936598777771,
0.048972971737384796,
0.17639793455600739,
0.12185695022344589,
0.016568755730986595,
-0.04774167761206627,
0.11647630482912064,
0.2130284160375595,
-0.002118367003276944,
0.024608047679066658,
-0.05022026598453522,
-0.3065771162509918,
0.0474756620824337,
0.014356585219502449,
-0.07174845039844513,
0.11724270135164261,
0.09064973145723343,
-0.046179238706827164,
0.08330509811639786,
-0.009135239757597446,
-0.13198648393154144,
-0.039482954889535904,
0.019292812794446945,
-0.11745545268058777,
0.1662212759256363,
0.05298272892832756,
0.02469746209681034,
-0.008447164669632912,
-0.06598151475191116,
-0.15036040544509888,
0.037190426141023636,
-0.027472136542201042,
-0.01080626156181097,
0.05462246760725975,
0.023526115342974663,
-0.07521048933267593,
0.170567125082016,
0.17678891122341156,
0.0833497866988182,
0.0349111407995224,
-0.14917024970054626,
-0.045548245310783386,
0.008950977586209774,
0.05421316996216774,
-0.017893504351377487,
0.09349167346954346,
-0.019903047010302544,
0.11801653355360031,
-0.04491448402404785,
0.09210366010665894,
0.15255063772201538,
-0.4016275703907013,
-0.027563704177737236,
0.08920855820178986,
0.05989706888794899,
0.12076901644468307,
-0.10560955852270126,
0.03972794860601425,
-0.0039703017100691795,
0.01236654631793499,
-0.014540530741214752,
-0.08304883539676666,
-0.07308239489793777,
0.032504837960004807,
-0.1272556483745575,
0.008525865152478218,
0.23756256699562073,
-0.10643257945775986,
0.037069112062454224,
-0.09791990369558334,
-0.07414398342370987,
0.048336777836084366,
-0.053761593997478485,
-0.081727035343647,
-0.054839808493852615,
0.06347949057817459,
0.004366500303149223,
-0.06301609426736832,
-0.08326146006584167,
-0.0006536149303428829,
-0.12781435251235962,
0.17595994472503662,
0.061243366450071335,
0.041611745953559875,
-0.21322020888328552,
0.08940251916646957,
0.04477722570300102,
-0.04711297154426575,
0.007116159424185753,
-0.11796226352453232,
0.04023287072777748,
0.005483259446918964,
-0.03256071358919144,
-0.021854614838957787,
0.0393419973552227,
0.13909944891929626,
-0.01777748204767704,
0.03252175822854042,
0.006831915583461523,
0.05811219662427902,
0.08162496984004974,
0.02222144603729248,
0.019291909411549568,
-0.0818009302020073,
0.019385190680623055,
-0.08128736168146133,
-0.0030400939285755157,
-0.048940129578113556,
-0.17071883380413055,
-0.07477642595767975,
0.052610911428928375,
0.020047198981046677,
0.03746970370411873,
0.08054786175489426,
-0.0017944995779544115,
-0.05560554191470146,
0.03284840285778046,
0.01671096310019493,
-0.020622212439775467,
-0.010361049324274063,
-0.02412462793290615,
0.19123271107673645,
0.019619356840848923,
0.014111656695604324,
-0.12379156798124313,
0.10023640841245651,
-0.08179095387458801,
0.0037731381598860025,
0.02743307314813137,
-0.04204464703798294,
-0.004716555587947369,
0.02917117439210415,
0.023101668804883957,
-0.1252521574497223,
-0.1099385917186737,
-0.0030569476075470448,
-0.012054097838699818,
-0.036421261727809906,
-0.10490952432155609,
-0.08483029156923294,
-0.012153145857155323,
0.0449371263384819,
-0.013397793285548687,
0.007936403155326843,
-0.05143149942159653,
0.0985720232129097,
-0.0514979362487793,
0.09873400628566742,
-0.08342572301626205,
0.06359215080738068,
-0.09124887734651566,
-0.061886150389909744,
-0.11452563107013702,
0.05216052383184433,
0.012905281968414783,
0.066250741481781,
0.016998225823044777,
-0.044836658984422684,
-0.014836243353784084,
0.05253177136182785,
-0.07656687498092651,
0.1940697431564331,
-0.041674621403217316,
-0.12459053844213486,
0.24146439135074615,
-0.09138800948858261,
-0.1802034229040146,
0.12973085045814514,
-0.022254703566432,
0.08523941785097122,
0.12802475690841675,
0.20380465686321259,
-0.00019822151807602495,
-0.01302915159612894,
0.07281201332807541,
0.07031642645597458,
-0.09803894907236099,
0.06239739805459976,
0.029653839766979218,
-0.008071083575487137,
-0.08906278014183044,
0.05762826278805733,
0.046033453196287155,
-0.010650773532688618,
-0.035073768347501755,
-0.001896020956337452,
-0.012895751744508743,
-0.022185025736689568,
0.14126582443714142,
-0.02006692811846733,
0.1300428807735443,
-0.06926563382148743,
-0.03515486419200897,
-0.009500149637460709,
0.03533667325973511,
-0.04091939330101013,
0.08151165395975113,
-0.0436173714697361,
0.10586477071046829,
0.09034156054258347,
0.053724925965070724,
-0.13120363652706146,
0.00466286763548851,
-0.015246815048158169,
0.17014820873737335,
0.08964069187641144,
0.05222717300057411,
0.06265474855899811,
-0.0020888058934360743,
-0.06708643585443497,
0.045407816767692566,
0.13778303563594818,
-0.037020038813352585,
-0.12218865007162094,
-0.1755627691745758,
0.051157694309949875,
-0.045444171875715256,
0.10855234414339066,
-0.10010123997926712,
0.022670533508062363,
-0.055906031280756,
0.07772238552570343,
-0.024998966604471207,
0.020512236282229424,
-0.0013405600329861045,
-0.021700702607631683,
-0.08356887847185135,
-0.002377772703766823,
0.08597290515899658,
-0.02048647589981556,
-0.06707409024238586,
0.16556480526924133,
-0.16400809586048126,
0.1631954461336136,
0.2116095870733261,
-0.28542569279670715,
-0.005696662236005068,
-0.15163889527320862,
-0.0208092350512743,
0.019645055755972862,
0.07834604382514954,
0.026225795969367027,
0.2044338881969452,
-0.012928472831845284,
0.16565458476543427,
-0.05699567869305611,
-0.07730039209127426,
-0.06881127506494522,
-0.048101142048835754,
0.013522743247449398,
0.09095205366611481,
0.04542696103453636,
-0.11962861567735672,
0.13119758665561676,
0.1054433062672615,
0.06484298408031464,
0.12711186707019806,
0.1030748188495636,
-0.008113685995340347,
0.07252490520477295,
-0.03624548763036728,
-0.03462279960513115,
-0.09254947304725647,
-0.30446043610572815,
-0.04840317741036415,
0.0939924493432045,
0.007963384501636028,
0.09285714477300644,
-0.0919896736741066,
-0.03311870992183685,
0.006042704917490482,
0.009473444893956184,
0.028337622061371803,
0.09653715789318085,
0.013490920886397362,
0.15320514142513275,
-0.008011690340936184,
-0.03430786728858948,
0.05891305208206177,
0.017982570454478264,
-0.09147711098194122,
0.17280617356300354,
-0.17050009965896606,
-0.27190929651260376,
-0.06990014761686325,
-0.21745692193508148,
-0.013139115646481514,
0.05258983001112938,
0.0786920040845871,
-0.11818131804466248,
-0.018352627754211426,
-0.006239492911845446,
0.05685517191886902,
-0.2425733357667923,
0.0004911290016025305,
-0.1354890614748001,
0.0501418262720108,
-0.1974833607673645,
-0.09718500077724457,
-0.02271542325615883,
-0.013450481928884983,
-0.0464281290769577,
0.13365240395069122,
-0.1448695808649063,
-0.011572926305234432,
0.2329535037279129,
0.032479673624038696,
0.027794739231467247,
-0.05020907148718834,
0.19788463413715363,
-0.0958966314792633,
-0.023973820731043816,
0.11024576425552368,
-0.05038975924253464,
0.04834126681089401,
0.06649978458881378,
-0.012981836684048176,
-0.08557141572237015,
0.023789849132299423,
-0.068336620926857,
-0.03150583803653717,
-0.27926525473594666,
-0.0930178239941597,
-0.09319330751895905,
0.11305391043424606,
0.04079577326774597,
0.06421639025211334,
0.16545771062374115,
0.05191578343510628,
-0.024325082078576088,
-0.03006586618721485,
0.11609793454408646,
0.12905290722846985,
0.2277202159166336,
-0.06067761778831482,
0.10221996158361435,
0.009445492178201675,
-0.08203992247581482,
0.06062209978699684,
0.056782789528369904,
0.06324724853038788,
0.02584579586982727,
0.03694582358002663,
-0.030939655378460884,
0.1121687963604927,
0.12571842968463898,
0.05258069559931755,
0.0481170229613781,
0.0002127334737451747,
-0.0561506561934948,
-0.008168719708919525,
-0.05726633965969086,
0.06774696707725525,
0.061340972781181335,
-0.12918008863925934,
-0.08061543852090836,
0.0011613310780376196,
0.06660808622837067,
-0.016230419278144836,
0.06823775917291641,
-0.13560809195041656,
-0.03582429885864258,
0.0790911465883255,
-0.07693151384592056,
-0.14156894385814667,
0.11972879618406296,
-0.026570770889520645,
-0.19904157519340515,
0.05265914276242256,
0.007704653777182102,
0.0908159390091896,
-0.06360849738121033,
0.05343840271234512,
-0.13023801147937775,
-0.12935101985931396,
-0.018437571823596954,
0.07945099472999573,
-0.3450873792171478,
0.13536721467971802,
-0.013286802917718887,
-0.02876877970993519,
-0.06474969536066055,
-0.02640824392437935,
0.013905409723520279,
0.12719078361988068,
0.08667250722646713,
0.0008821099763736129,
0.0991629809141159,
0.03823768347501755,
0.04188435152173042,
-0.002011700300499797,
0.10950417071580887,
0.0050011589191854,
0.004797275178134441,
-0.04982118681073189,
0.007274609990417957,
-0.05164213851094246,
-0.07472953200340271,
0.08393982797861099,
-0.20678792893886566,
0.09087453782558441,
-0.03378438204526901,
0.08427679538726807,
0.04304937273263931,
-0.018965769559144974,
-0.1001204177737236,
0.19745583832263947,
-0.012206900864839554,
-0.11405988782644272,
-0.07517550885677338,
-0.02810264565050602,
0.09103139489889145,
-0.013817726634442806,
0.012886416167020798,
-0.045470476150512695,
0.032183047384023666,
-0.1263762265443802,
-0.1597503274679184,
0.08734500408172607,
-0.04441224783658981,
-0.10894393920898438,
-0.025462759658694267,
0.20382575690746307,
-0.007266622502356768,
0.08242089301347733,
0.01605331338942051,
0.010653935372829437,
-0.18066231906414032,
-0.04018142446875572,
0.02645772136747837,
-0.0016437612939625978,
0.005979063920676708,
0.047698814421892166,
0.019091911613941193,
0.06207629665732384,
-0.1069745197892189,
-0.013920160941779613,
0.3158324360847473,
0.15978319942951202,
-0.00912671908736229,
0.14943915605545044,
0.1093616932630539,
-0.08669080585241318,
-0.17238758504390717,
-0.1171615794301033,
-0.1210922971367836,
-0.08425768464803696,
-0.10681738704442978,
-0.1525043100118637,
0.09535340964794159,
-0.03392014652490616,
0.03498011827468872,
0.14615866541862488,
-0.280263751745224,
-0.10949636250734329,
0.13820378482341766,
0.010744688101112843,
0.3510635495185852,
-0.12303631007671356,
-0.044944874942302704,
-0.06214528530836105,
-0.16933435201644897,
0.08021392673254013,
-0.031203703954815865,
0.11581093072891235,
-0.0744495838880539,
0.19395925104618073,
0.01719796098768711,
0.014287159778177738,
0.0916559100151062,
0.05038322135806084,
-0.05808406323194504,
-0.07368700206279755,
-0.10248131304979324,
0.010812131687998772,
0.03546109423041344,
0.010252019390463829,
-0.008802837692201138,
0.0211968794465065,
-0.11341743916273117,
-0.050869911909103394,
-0.06302189081907272,
0.0072614275850355625,
-0.01001308299601078,
-0.042155615985393524,
-0.05533592775464058,
-0.022557416930794716,
-0.020093943923711777,
0.02266426384449005,
0.14185629785060883,
-0.07527699321508408,
0.18586260080337524,
0.02357078716158867,
0.1586609035730362,
-0.11956068128347397,
-0.06724818795919418,
-0.029193658381700516,
-0.05280323326587677,
0.06468886137008667,
-0.08884575963020325,
-0.027708567678928375,
0.1332162618637085,
-0.01903904788196087,
0.04655366763472557,
0.12936700880527496,
0.02046884410083294,
0.015383756719529629,
0.034968774765729904,
-0.2578005790710449,
-0.07463036477565765,
-0.03505445644259453,
-0.012416874058544636,
0.05272092670202255,
0.05525677278637886,
0.19735674560070038,
-0.03551921248435974,
-0.08521962910890579,
0.020131373777985573,
0.02735883742570877,
-0.02776256389915943,
0.10749414563179016,
0.019579345360398293,
-0.004837906453758478,
-0.16151933372020721,
0.08257976174354553,
-0.005964108742773533,
-0.08297000825405121,
0.028665626421570778,
0.2024049311876297,
-0.12141239643096924,
-0.10309756547212601,
-0.06804922968149185,
0.07315051555633545,
-0.09220825880765915,
0.016043387353420258,
-0.005091092549264431,
-0.1521538347005844,
0.06916408240795135,
0.07598215341567993,
0.04075418785214424,
0.06513199955224991,
-0.11743064224720001,
-0.015730571001768112,
-0.04170290008187294,
-0.002195435343310237,
0.03521120920777321,
0.01863143965601921,
-0.057492829859256744,
0.15846455097198486,
-0.0676199421286583,
0.08538917452096939,
-0.0744810476899147,
-0.1058846190571785,
-0.1395980566740036,
0.04660497233271599,
-0.08038312196731567,
-0.07247276604175568,
-0.12832807004451752,
-0.052204377949237823,
-0.0067099276930093765,
-0.03388519585132599,
0.006552806124091148,
-0.06627799570560455,
-0.10922821611166,
0.01822470687329769,
-0.00743203004822135,
-0.009385870769619942,
-0.06096754968166351,
0.026706209406256676,
0.06246216222643852,
-0.039788868278265,
0.15730851888656616,
0.22509248554706573,
-0.13591648638248444,
0.11564400047063828,
-0.09797432273626328,
-0.105463907122612,
0.046008042991161346,
0.009427277371287346,
0.03594303876161575,
0.0503489226102829,
-0.03594081476330757,
0.0044484552927315235,
0.03905477747321129,
0.08074651658535004,
0.08456914126873016,
-0.06776505708694458,
0.020801106467843056,
-0.05122765153646469,
-0.14904099702835083,
-0.016655439510941505,
-0.0464773029088974,
0.06876829266548157,
-0.006725262850522995,
0.11020535975694656,
-0.0515950471162796,
0.07739507406949997,
-0.07558431476354599,
0.050614211708307266,
0.021146971732378006,
-0.14688286185264587,
-0.006612539757043123,
-0.07093682140111923,
0.042144812643527985,
-0.008834975771605968,
0.20241086184978485,
-0.03228091076016426,
0.010342049412429333,
0.033811055123806,
0.06203942745923996,
-0.01957780309021473,
0.009357001632452011,
0.2014283686876297,
0.12640917301177979,
-0.08496357500553131,
-0.02679651789367199,
0.06793134659528732,
0.07248228788375854,
0.07093550264835358,
0.10807815194129944,
-0.015352966263890266,
0.028434239327907562,
0.07829629629850388,
-0.060215238481760025,
0.07576877623796463,
-0.08603982627391815,
-0.11668483167886734,
0.05793621391057968,
0.012955795042216778,
-0.055695828050374985,
0.20305177569389343,
0.19142870604991913,
-0.026278704404830933,
0.018410727381706238,
-0.0029499190859496593,
-0.10117456316947937,
-0.15619947016239166,
-0.05423750728368759,
-0.07170962542295456,
-0.1319410353899002,
-0.004549739416688681,
-0.16646917164325714,
0.022016216069459915,
-0.01132756657898426,
0.09506805986166,
-0.06855440139770508,
-0.01345991250127554,
0.1364889293909073,
-0.1055467277765274,
0.0847758799791336,
-0.024517204612493515,
0.07877567410469055,
-0.03746940940618515,
-0.018209461122751236,
-0.10342709720134735,
0.007514837197959423,
0.01131442841142416,
0.06840907037258148,
-0.10897937417030334,
0.02432350255548954,
-0.12208317965269089,
-0.08617185056209564,
-0.026142612099647522,
0.09279687702655792,
-0.0403008833527565,
0.15116846561431885,
0.02645145356655121,
-0.06710928678512573,
-0.004313822835683823,
0.2646709978580475,
-0.08046227693557739,
-0.08319197595119476,
-0.030799202620983124,
0.2152107208967209,
0.04053696244955063,
0.06396269053220749,
0.019140036776661873,
0.038027774542570114,
-0.07184682041406631,
0.2957373559474945,
0.34401440620422363,
-0.1318037211894989,
-0.007773484103381634,
0.04225075617432594,
0.04406323283910751,
0.14687567949295044,
0.07998795062303543,
0.11360671371221542,
0.2849363386631012,
-0.09197647124528885,
0.016657205298542976,
-0.04230864346027374,
-0.01424806285649538,
-0.06908884644508362,
0.045314885675907135,
0.08216670155525208,
-0.09241747111082077,
-0.022950593382120132,
0.08125471323728561,
-0.29741767048835754,
0.10791494697332382,
-0.15600289404392242,
-0.14948409795761108,
-0.05027429759502411,
-0.008771711029112339,
0.014683255925774574,
0.019041186198592186,
0.09663030505180359,
0.025651484727859497,
-0.07275258749723434,
0.07816889137029648,
0.024486342445015907,
-0.23020237684249878,
-0.01345184724777937,
0.1456068754196167,
-0.06789913028478622,
-0.025938833132386208,
-0.021313713863492012,
0.051610056310892105,
0.05763651058077812,
0.09027529507875443,
-0.03809558227658272,
-0.0746568813920021,
-0.007141788024455309,
-0.022818787023425102,
0.01914946548640728,
0.0597183033823967,
0.06841408461332321,
-0.0920223817229271,
0.1167774423956871,
-0.07350476831197739,
0.0650370642542839,
0.037623800337314606,
-0.022277191281318665,
0.0018526542698964477,
0.013183658011257648,
-0.06512464582920074,
0.05533479526638985,
0.1295643299818039,
-0.025459708645939827,
-0.002524374984204769,
-0.028180841356515884,
-0.0767761766910553,
-0.024015206843614578,
-0.04643676429986954,
-0.09101243317127228,
-0.18130090832710266,
-0.12738600373268127,
0.041754670441150665,
-0.03240608796477318,
-0.2046082615852356,
0.0060346988029778,
-0.1128578633069992,
0.03700976446270943,
-0.14154092967510223,
0.10004086047410965,
0.07216610759496689,
0.004716616589576006,
0.006774604320526123,
0.0675399899482727,
0.045677728950977325,
0.14796748757362366,
-0.16543124616146088,
-0.04919974133372307
] |
null | null | transformers |
# chatbot | {"tags": ["conversational"]} | text-generation | REAP3R/Chat-bot | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# chatbot | [
"# chatbot"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# chatbot"
] | [
51,
3
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# chatbot"
] | [
0.010968235321342945,
0.007727771066129208,
-0.006352289114147425,
-0.021234644576907158,
0.15643933415412903,
-0.021001052111387253,
0.14073291420936584,
0.12981557846069336,
0.037508778274059296,
-0.03964969888329506,
0.10194994509220123,
0.20841825008392334,
-0.00542474165558815,
0.13561874628067017,
-0.05821920931339264,
-0.24166133999824524,
0.10112466663122177,
0.01476452499628067,
0.03378241881728172,
0.12260545790195465,
0.08316849917173386,
-0.041477590799331665,
0.08418841660022736,
0.0011691325344145298,
-0.15303249657154083,
0.0064209140837192535,
0.05452295020222664,
-0.12329812347888947,
0.10303720086812973,
0.05289599671959877,
0.06106419116258621,
0.012471608817577362,
-0.09770537167787552,
-0.11654352396726608,
0.03456741198897362,
0.01890350505709648,
-0.04074369743466377,
0.03658412769436836,
-0.010813523083925247,
-0.08214707672595978,
0.14671893417835236,
0.11386162042617798,
0.013279279693961143,
0.0903177261352539,
-0.1386452615261078,
0.005235176999121904,
-0.02555673196911812,
0.00337896472774446,
0.0936213880777359,
0.13422952592372894,
-0.046854130923748016,
0.19132405519485474,
-0.08063740283250809,
0.12248838692903519,
0.12049216777086258,
-0.3401472866535187,
-0.025978313758969307,
0.10109777003526688,
0.06011759489774704,
0.10950048267841339,
-0.051378924399614334,
0.049484364688396454,
-0.00712321512401104,
0.018992112949490547,
-0.04772717133164406,
-0.06284298002719879,
-0.16672436892986298,
-0.009949848987162113,
-0.07080666720867157,
-0.05244527757167816,
0.24543210864067078,
-0.02490624412894249,
0.06179231032729149,
-0.07152427732944489,
-0.0980309545993805,
-0.037926021963357925,
-0.06099139153957367,
0.00011904512211913243,
-0.0791013166308403,
0.06272955238819122,
-0.039094701409339905,
-0.10873697698116302,
-0.11974725872278214,
-0.027021821588277817,
-0.1557847410440445,
0.12387338280677795,
0.025450333952903748,
0.06254498660564423,
-0.2513158619403839,
0.06142409145832062,
0.08122029155492783,
-0.074964240193367,
0.01842261478304863,
-0.0921509712934494,
0.0008264317875728011,
0.019875068217515945,
-0.027515538036823273,
-0.09474046528339386,
0.10851836204528809,
0.1110510528087616,
-0.0646900162100792,
0.06138327717781067,
-0.06855621933937073,
0.0693703293800354,
0.042970336973667145,
0.07236487418413162,
0.04730800911784172,
-0.006932645570486784,
0.04044298827648163,
-0.12386598438024521,
0.013073403388261795,
-0.07275812327861786,
-0.16028159856796265,
0.015031903050839901,
0.028119387105107307,
0.0732974112033844,
0.025695225223898888,
0.12698188424110413,
-0.01048135757446289,
-0.03597958758473396,
0.09424072504043579,
-0.02232491783797741,
-0.0019619897939264774,
0.03640150651335716,
-0.014254728332161903,
0.07531033456325531,
0.012703531421720982,
0.02996113710105419,
-0.12737888097763062,
-0.04830339178442955,
-0.05908579379320145,
0.0001308889768552035,
-0.04733804240822792,
-0.02590845339000225,
0.005711355246603489,
0.00738812331110239,
-0.029661906883120537,
-0.18274231255054474,
-0.09702613204717636,
0.002538988832384348,
-0.011665266007184982,
-0.03832115978002548,
-0.08872666209936142,
-0.09335967898368835,
-0.008397960104048252,
0.03065664693713188,
-0.0666441023349762,
-0.03411272168159485,
-0.055469363927841187,
0.09266588091850281,
-0.07358493655920029,
0.1118328794836998,
-0.12777391076087952,
0.06632407754659653,
-0.10776160657405853,
-0.042608607560396194,
-0.14090469479560852,
0.11548750847578049,
-0.03850465267896652,
0.15190273523330688,
-0.010529258288443089,
0.01767810434103012,
-0.07639782875776291,
0.04077168181538582,
-0.05328209325671196,
0.22302140295505524,
-0.05048906430602074,
-0.11502215266227722,
0.34725409746170044,
-0.06485728919506073,
-0.14210256934165955,
0.11303535103797913,
0.003760914783924818,
0.05006154626607895,
0.13632337749004364,
0.2123635709285736,
-0.0261703934520483,
0.003961663693189621,
0.050356198102235794,
0.0831587091088295,
-0.14045821130275726,
-0.016624638810753822,
-0.018806766718626022,
-0.032394640147686005,
-0.07642623782157898,
0.035080648958683014,
0.1725746989250183,
0.10324514657258987,
-0.0428631454706192,
-0.028632357716560364,
-0.0011939750984311104,
-0.011377060785889626,
0.0751439779996872,
-0.013688726350665092,
0.1099335327744484,
-0.07302366197109222,
-0.04514864459633827,
-0.056627798825502396,
0.019711773842573166,
-0.01580791361629963,
0.02164168283343315,
-0.10822531580924988,
0.047292910516262054,
0.06264777481555939,
0.0838978961110115,
-0.12733836472034454,
-0.1415710747241974,
0.00012855489330831915,
0.16368989646434784,
0.06267979741096497,
0.08042116463184357,
0.07889045029878616,
-0.06326374411582947,
0.03520650789141655,
0.010223211720585823,
0.1620071679353714,
-0.025502944365143776,
-0.07119473814964294,
-0.07314620167016983,
0.0813789889216423,
-0.0539436973631382,
0.13123835623264313,
-0.0234846044331789,
0.021171795204281807,
0.05064643546938896,
0.14443543553352356,
0.012337811291217804,
0.008496670052409172,
0.0199549850076437,
-0.03353352099657059,
-0.04138624295592308,
-0.010461508296430111,
0.08826810121536255,
0.012893259525299072,
-0.07700743526220322,
0.24783207476139069,
-0.14848968386650085,
0.09149928390979767,
0.18843573331832886,
-0.21857765316963196,
0.010638960637152195,
-0.053649917244911194,
-0.06008971855044365,
0.014886829070746899,
0.06390618532896042,
-0.056745871901512146,
0.22547407448291779,
-0.0017129938350990415,
0.1830248385667801,
-0.02677134796977043,
-0.03347187489271164,
-0.012601913884282112,
-0.052054524421691895,
-0.004963926505297422,
0.07046899199485779,
0.07929938286542892,
-0.15696802735328674,
0.1630246937274933,
0.08007003366947174,
0.06945012509822845,
0.2289620041847229,
0.029656967148184776,
0.03260974958539009,
0.0826200470328331,
0.024902932345867157,
-0.058335576206445694,
-0.06770749390125275,
-0.3195973336696625,
-0.030061006546020508,
0.06347274780273438,
0.03887705132365227,
0.11986680328845978,
-0.0740366280078888,
-0.018041882663965225,
-0.04303312674164772,
-0.02773725986480713,
0.08105406165122986,
0.10314317047595978,
0.07875426113605499,
0.1463840901851654,
-0.004006334114819765,
-0.05795108899474144,
0.08215587586164474,
0.011105954647064209,
-0.09246756881475449,
0.15028172731399536,
-0.15361902117729187,
-0.41497141122817993,
-0.09441942721605301,
-0.13845877349376678,
-0.07577524334192276,
0.04783830791711807,
0.12056230753660202,
-0.16298602521419525,
-0.005637128371745348,
0.008888390846550465,
0.0722578838467598,
-0.01950574666261673,
0.0044798655435442924,
-0.020905667915940285,
0.012374204583466053,
-0.07598809897899628,
-0.11197791993618011,
-0.05649970471858978,
-0.037194665521383286,
-0.106648288667202,
0.14664947986602783,
-0.11991269886493683,
0.023200567811727524,
0.20641808211803436,
0.031089279800653458,
0.0795053094625473,
-0.02582305483520031,
0.23265144228935242,
-0.11483077704906464,
0.007941543124616146,
0.16255860030651093,
0.0073867072351276875,
0.04933791607618332,
0.13886050879955292,
-0.019989047199487686,
-0.10307926684617996,
0.046441830694675446,
-0.0163872130215168,
-0.07975195348262787,
-0.21294564008712769,
-0.16701461374759674,
-0.10543602705001831,
0.10835474729537964,
0.006600744556635618,
0.08079647272825241,
0.16895653307437897,
0.036466874182224274,
-0.0481322705745697,
-0.013781754299998283,
0.06882532685995102,
0.07971814274787903,
0.16089901328086853,
-0.09387132525444031,
0.14651620388031006,
-0.029950179159641266,
-0.13938821852207184,
0.07165490835905075,
0.02806977927684784,
0.08653702586889267,
0.062354449182748795,
0.09939214587211609,
0.009161441586911678,
0.07194389402866364,
0.13189098238945007,
0.02129383198916912,
0.031854674220085144,
-0.04957938194274902,
-0.01644894666969776,
-0.02976180799305439,
-0.10522158443927765,
0.0260230153799057,
0.07269643992185593,
-0.1633579283952713,
-0.023752160370349884,
-0.06686579436063766,
0.11339012533426285,
0.09322414547204971,
0.06018132343888283,
-0.168349951505661,
-0.03982655331492424,
0.07550626993179321,
-0.06044725701212883,
-0.1277373880147934,
0.09332257509231567,
0.06403905898332596,
-0.16756905615329742,
0.024505676701664925,
-0.02139412611722946,
0.11550404876470566,
-0.08999937772750854,
0.08817742764949799,
-0.11706983298063278,
-0.058962903916835785,
0.015341798774898052,
0.105455182492733,
-0.2705933451652527,
0.1452779620885849,
-0.029991887509822845,
-0.0541866198182106,
-0.12695784866809845,
-0.014623773284256458,
0.003649629419669509,
0.06584227085113525,
0.06553937494754791,
0.018085302785038948,
-0.024704966694116592,
-0.017190923914313316,
-0.05527152866125107,
0.03158920630812645,
0.07341287285089493,
-0.012052681297063828,
-0.04040921851992607,
-0.03577953204512596,
-0.04021904617547989,
-0.04864274710416794,
-0.10462372750043869,
0.01803085394203663,
-0.17189502716064453,
0.06697724014520645,
0.14285263419151306,
0.12307984381914139,
0.024086667224764824,
0.02395295538008213,
-0.010308430530130863,
0.2797662019729614,
0.04446364939212799,
-0.10608122497797012,
-0.08168934285640717,
-0.04774974286556244,
0.010542072355747223,
-0.05477364361286163,
0.011213324964046478,
-0.0734231248497963,
0.07149861007928848,
-0.06219610571861267,
-0.18187426030635834,
0.11037934571504593,
-0.10816529393196106,
-0.021906547248363495,
-0.02129918336868286,
0.22219528257846832,
0.0006881207227706909,
0.00536661921069026,
0.0382552407681942,
-0.02350775897502899,
-0.09951229393482208,
-0.08767884224653244,
0.020559556782245636,
0.046638090163469315,
-0.016793793067336082,
0.058419786393642426,
-0.028869345784187317,
-0.11537910252809525,
-0.08667223900556564,
-0.004584715235978365,
0.342002809047699,
0.10664524137973785,
-0.04054054245352745,
0.20385801792144775,
0.11959315836429596,
-0.03308909013867378,
-0.26087725162506104,
-0.10823334008455276,
-0.0878646969795227,
-0.07608511298894882,
-0.089250847697258,
-0.17844969034194946,
0.04727178066968918,
-0.0320587232708931,
-0.020154796540737152,
0.13518373668193817,
-0.28961649537086487,
-0.06996733695268631,
0.15610413253307343,
-0.019609389826655388,
0.3764498233795166,
-0.09389678388834,
-0.10073988884687424,
-0.030432209372520447,
-0.13337567448616028,
0.18723532557487488,
-0.03852540999650955,
0.09648825973272324,
0.019809750840067863,
0.20044921338558197,
0.0581018291413784,
-0.007363624405115843,
0.04672066122293472,
0.0041247205808758736,
-0.059888388961553574,
-0.11298070102930069,
-0.04421853646636009,
-0.0007281508878804743,
0.024450309574604034,
0.04857132211327553,
-0.05521932989358902,
0.009354500100016594,
-0.10515283048152924,
-0.017762495204806328,
-0.13058432936668396,
0.042659398168325424,
0.03756467252969742,
-0.04862840473651886,
-0.01378182414919138,
-0.06273111701011658,
-0.013175230473279953,
0.05894321948289871,
0.19168667495250702,
-0.07272739708423615,
0.21076641976833344,
0.11558205634355545,
0.08411826938390732,
-0.22147415578365326,
-0.02523861825466156,
-0.057225290685892105,
-0.04152408614754677,
0.08137895911931992,
-0.04853685572743416,
0.05035674199461937,
0.08528727293014526,
-0.060956716537475586,
0.11212325096130371,
0.06785566359758377,
-0.02554694004356861,
0.018215762451291084,
0.10199369490146637,
-0.2747262120246887,
-0.07341162115335464,
-0.054314352571964264,
0.14979170262813568,
0.10969287157058716,
0.10020744055509567,
0.22787094116210938,
0.01726788841187954,
-0.05493323877453804,
-0.009304939769208431,
0.03333360329270363,
-0.034331969916820526,
0.03036540001630783,
-0.05691874399781227,
0.023586733266711235,
-0.17194314301013947,
0.036453936249017715,
0.03661350533366203,
-0.15939517319202423,
0.05747129023075104,
0.21142849326133728,
-0.13509301841259003,
-0.1372973471879959,
-0.10331124067306519,
0.08924120664596558,
-0.05636236444115639,
0.0005496678641065955,
-0.04617026820778847,
-0.11833962798118591,
0.06463127583265305,
0.07095065712928772,
0.021985189989209175,
0.08208142966032028,
-0.03020087629556656,
-0.017765633761882782,
-0.018312979489564896,
-0.03290516883134842,
-0.03723575174808502,
-0.03289555013179779,
-0.051995716989040375,
0.059567052870988846,
0.0005490952171385288,
0.12000774592161179,
-0.0840354636311531,
-0.14145635068416595,
-0.17658033967018127,
0.034249063581228256,
-0.0688028484582901,
-0.09679076075553894,
-0.11906569451093674,
-0.050392962992191315,
0.0026361907366663218,
-0.02761034481227398,
-0.037074096500873566,
-0.053501419723033905,
-0.12077222764492035,
0.013399235904216766,
-0.03256428614258766,
0.017926257103681564,
-0.09752757102251053,
0.037193115800619125,
0.06958825141191483,
-0.028703154996037483,
0.1767270416021347,
0.1911511868238449,
-0.10594157874584198,
0.08928286284208298,
-0.11155028641223907,
-0.09694984555244446,
0.1144760400056839,
0.009480922482907772,
0.08490347117185593,
0.09539744257926941,
-0.001813035341911018,
0.06989146769046783,
0.06293424218893051,
0.05136820673942566,
0.05649825558066368,
-0.12218942493200302,
0.05862679332494736,
0.0073097781278193,
-0.13656875491142273,
-0.04719697684049606,
-0.05046750605106354,
0.024087902158498764,
0.010694477707147598,
0.10499544441699982,
-0.07243356853723526,
0.08560462296009064,
-0.043090492486953735,
0.023455526679754257,
0.027065368369221687,
-0.15994195640087128,
0.0122337955981493,
-0.06750793009996414,
0.03353302553296089,
-0.0075696553103625774,
0.1402260661125183,
0.046394024044275284,
0.013495087623596191,
0.0524686723947525,
0.04742848500609398,
0.002679830649867654,
0.012045069597661495,
0.06517841666936874,
0.09384872019290924,
-0.06348707526922226,
-0.05434736981987953,
0.046185366809368134,
0.046089570969343185,
-0.0026142976712435484,
0.1433531641960144,
-0.019801612943410873,
0.03141777217388153,
0.07546926289796829,
0.010503745637834072,
0.010958097875118256,
-0.14913257956504822,
-0.13310807943344116,
-0.09403715282678604,
0.06431301683187485,
-0.07352129369974136,
0.12460990250110626,
0.1318122297525406,
-0.007569256704300642,
0.01886865682899952,
-0.024605730548501015,
-0.04436752200126648,
-0.1257784515619278,
-0.10587292909622192,
-0.06667178869247437,
-0.1875438541173935,
0.010156827047467232,
-0.09631170332431793,
0.059540148824453354,
0.003711156779900193,
0.06180589646100998,
-0.06110977381467819,
0.1306600421667099,
0.021868662908673286,
-0.09969628602266312,
0.04721321910619736,
-0.027622448280453682,
0.06326542049646378,
-0.03536062315106392,
-0.009182078763842583,
-0.08070515096187592,
0.04022664204239845,
0.006954141892492771,
0.08317069709300995,
-0.040859419852495193,
0.01854594424366951,
-0.1144041195511818,
-0.08724456280469894,
-0.046521034091711044,
0.051244210451841354,
-0.055659763514995575,
0.11636100709438324,
0.05031655728816986,
-0.01122594065964222,
0.017421316355466843,
0.24626222252845764,
-0.07361259311437607,
-0.05732099711894989,
-0.08648043870925903,
0.17137260735034943,
0.0029120277613401413,
0.07970262318849564,
-0.046578772366046906,
0.00009305967978434637,
-0.15492291748523712,
0.34188634157180786,
0.30378296971321106,
-0.08325394988059998,
0.016406185925006866,
-0.04247250407934189,
0.04008674621582031,
0.07640576362609863,
0.1321054995059967,
0.12382277846336365,
0.3342989683151245,
-0.04842456430196762,
-0.015471839345991611,
0.00039677578024566174,
-0.039695288985967636,
-0.1082836240530014,
0.0237716156989336,
0.0115501182153821,
-0.03137712553143501,
-0.018448546528816223,
0.07904249429702759,
-0.27964651584625244,
0.06845441460609436,
-0.15490150451660156,
-0.2179986536502838,
-0.06582965701818466,
0.0009568033274263144,
0.14152410626411438,
0.02705411985516548,
0.13051259517669678,
0.01944400928914547,
-0.09969880431890488,
0.09459847956895828,
0.0038436707109212875,
-0.2086271047592163,
-0.07184503972530365,
0.089839406311512,
-0.156601220369339,
0.040388744324445724,
-0.04467551410198212,
0.05246507748961449,
0.06981616467237473,
0.037365756928920746,
-0.027069754898548126,
0.030589725822210312,
-0.011969237588346004,
-0.07046740502119064,
0.0015125696081668139,
0.08624745905399323,
0.011207466945052147,
0.0008624065085314214,
0.06532314419746399,
-0.20856845378875732,
0.01669275388121605,
-0.02748848684132099,
0.03132975846529007,
-0.0005896484944969416,
0.048278164118528366,
-0.053153518587350845,
0.05129438266158104,
0.09185507148504257,
-0.01909736730158329,
0.012324882671236992,
-0.01746486872434616,
-0.03575682267546654,
-0.04160476475954056,
-0.11001599580049515,
-0.1364278942346573,
-0.21645081043243408,
-0.12886761128902435,
0.07466620951890945,
-0.00014694548735860735,
-0.19641409814357758,
0.014561931602656841,
-0.1055859625339508,
0.07478046417236328,
-0.13186629116535187,
0.09759792685508728,
0.07289654016494751,
0.017645543441176414,
0.004783345852047205,
0.069838747382164,
0.055998072028160095,
0.10635696351528168,
-0.09407062828540802,
-0.07457634806632996
] |
null | null | transformers |
# Saitama DialoGPT Model | {"tags": ["conversational"]} | text-generation | REZERO/DialoGPT-medium-saitama | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Saitama DialoGPT Model | [
"# Saitama DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Saitama DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Saitama DialoGPT Model"
] | [
-0.023153873160481453,
0.07107754051685333,
-0.005953408777713776,
0.023764409124851227,
0.18122904002666473,
-0.003066613106057048,
0.14991125464439392,
0.12210971862077713,
-0.01997346803545952,
-0.03993896394968033,
0.09243354946374893,
0.12895074486732483,
0.029486792162060738,
0.09168091416358948,
-0.06925652176141739,
-0.33698907494544983,
0.06313901394605637,
0.04780515655875206,
-0.016657548025250435,
0.12589992582798004,
0.10782457143068314,
-0.0418403297662735,
0.08070060610771179,
0.012784432619810104,
-0.1336587816476822,
0.021662723273038864,
0.0200305487960577,
-0.13588377833366394,
0.10678943991661072,
0.05228463560342789,
0.05161559209227562,
0.013158738613128662,
-0.03510143980383873,
-0.12578295171260834,
0.03313784301280975,
-0.03041624091565609,
-0.007903971709311008,
0.03764877840876579,
0.021633531898260117,
-0.06411447376012802,
0.07929845154285431,
0.13999362289905548,
-0.002196913119405508,
0.0633217990398407,
-0.15874509513378143,
0.0047588227316737175,
0.013217811472713947,
-0.0022693059872835875,
0.08830883353948593,
0.1177537590265274,
-0.048731230199337006,
0.06372243911027908,
-0.08294130861759186,
0.09291849285364151,
0.08647377043962479,
-0.30685967206954956,
-0.020164579153060913,
0.08150279521942139,
0.010888618417084217,
0.058792393654584885,
-0.032043587416410446,
0.07207341492176056,
0.02664591185748577,
0.007302716840058565,
-0.04255557432770729,
-0.0773535966873169,
-0.09540990740060806,
0.018656769767403603,
-0.08264278620481491,
-0.0002221566828666255,
0.2537755072116852,
-0.04793602228164673,
0.04883475974202156,
-0.07056108862161636,
-0.08412706106901169,
-0.027590690180659294,
-0.05497371405363083,
-0.03008977510035038,
-0.07607454061508179,
0.07752204686403275,
0.031297557055950165,
-0.06897952407598495,
-0.13192591071128845,
-0.040120065212249756,
-0.14571870863437653,
0.13212600350379944,
0.044438764452934265,
0.03492776304483414,
-0.2240055948495865,
0.07939042150974274,
-0.012206538580358028,
-0.10849474370479584,
0.00349975167773664,
-0.09325100481510162,
-0.005062025506049395,
0.033776070922613144,
-0.03417940065264702,
-0.09708154201507568,
0.06927431374788284,
0.09627387672662735,
-0.029227519407868385,
0.02535044029355049,
-0.03315635398030281,
0.03830578923225403,
0.04986214265227318,
0.09865190088748932,
-0.021642692387104034,
-0.06104541942477226,
0.012105388566851616,
-0.09050039201974869,
-0.0061041866429150105,
-0.06645556539297104,
-0.18373246490955353,
-0.04245065525174141,
0.027418116107583046,
0.058663707226514816,
0.001655497238971293,
0.13771261274814606,
-0.009850055910646915,
-0.04939841479063034,
0.022961348295211792,
-0.054335735738277435,
-0.03182964026927948,
0.0008837311179377139,
-0.010580248199403286,
0.14728085696697235,
-0.005507790483534336,
0.040718238800764084,
-0.1226821169257164,
0.008437149226665497,
-0.04516521096229553,
0.0058802468702197075,
-0.0456409677863121,
-0.03278413414955139,
-0.015939662232995033,
-0.03676491603255272,
0.0037332952488213778,
-0.16208599507808685,
-0.20224711298942566,
0.0018342725234106183,
-0.018027665093541145,
-0.06309427320957184,
-0.09628146141767502,
-0.09080356359481812,
-0.018551379442214966,
0.04169393703341484,
-0.07061877846717834,
-0.03445938602089882,
-0.05309922248125076,
0.08171187341213226,
0.015232755802571774,
0.0892328992486,
-0.08321741223335266,
0.07723028212785721,
-0.10930545628070831,
-0.022408444434404373,
-0.10197178274393082,
0.11207335442304611,
0.003387970384210348,
0.08777951449155807,
-0.02797604724764824,
-0.0071008228696882725,
-0.07763709127902985,
0.069694384932518,
-0.018263934180140495,
0.26517626643180847,
-0.11143600940704346,
-0.08927091956138611,
0.2734147310256958,
-0.0852208063006401,
-0.11792491376399994,
0.13587065041065216,
0.01212368719279766,
0.09459176659584045,
0.12593191862106323,
0.20912334322929382,
-0.03145218640565872,
0.035359449684619904,
0.0790538638830185,
0.06988079100847244,
-0.07404913753271103,
-0.00041010644054040313,
0.03520293906331062,
-0.0018393328646197915,
-0.0823335349559784,
0.055717289447784424,
0.08641592413187027,
0.07533805072307587,
-0.04344573989510536,
-0.03692280501127243,
0.0013311099028214812,
-0.023200806230306625,
0.0755055844783783,
-0.027250828221440315,
0.13358457386493683,
-0.015252094715833664,
-0.04780300706624985,
0.019107138738036156,
0.03345359116792679,
-0.05709904059767723,
0.041406020522117615,
-0.10449671000242233,
0.0984659492969513,
-0.031053265556693077,
0.05814595893025398,
-0.12834681570529938,
-0.024612437933683395,
-0.023050609976053238,
0.15472012758255005,
0.049624133855104446,
0.06519187986850739,
0.05920263007283211,
-0.04104992374777794,
-0.023142134770751,
0.029402336105704308,
0.15167206525802612,
-0.014978256076574326,
-0.05957932770252228,
-0.08089854568243027,
0.09546913951635361,
-0.05092855542898178,
0.11407840996980667,
-0.0529044084250927,
0.025085926055908203,
0.030578767880797386,
0.09108980745077133,
-0.019673604518175125,
0.03142782300710678,
0.04116592928767204,
-0.009246019646525383,
-0.06219213828444481,
0.020978208631277084,
0.08430809527635574,
-0.00014869561709929258,
-0.10161663591861725,
0.25255286693573,
-0.1711563616991043,
0.1306840181350708,
0.1688169687986374,
-0.21212927997112274,
-0.0064376587979495525,
-0.10607641190290451,
-0.018839413300156593,
0.0032152074854820967,
0.058369919657707214,
-0.031386569142341614,
0.19481989741325378,
-0.02492247335612774,
0.19604501128196716,
-0.04308542236685753,
-0.0022581536322832108,
-0.019386082887649536,
-0.06285826861858368,
0.017328407615423203,
0.08030141144990921,
0.12539297342300415,
-0.15044358372688293,
0.18490110337734222,
0.08133388310670853,
0.03639434650540352,
0.21436667442321777,
0.027987025678157806,
-0.017664970830082893,
0.0586819164454937,
-0.003555404720827937,
-0.03973759338259697,
-0.05717827007174492,
-0.27951136231422424,
-0.021849922835826874,
0.073415607213974,
0.04882766678929329,
0.10328470170497894,
-0.06982522457838058,
-0.036874424666166306,
0.001927564968355,
-0.027803828939795494,
0.04468926042318344,
0.09939703345298767,
0.019926078617572784,
0.11959970742464066,
-0.018495460972189903,
-0.03589120879769325,
0.0636826679110527,
0.02053212746977806,
-0.07025505602359772,
0.18613648414611816,
-0.12652981281280518,
-0.38938868045806885,
-0.11425112932920456,
-0.18264634907245636,
-0.03636631369590759,
0.05922219529747963,
0.09690770506858826,
-0.15224327147006989,
-0.033409483730793,
-0.0025357655249536037,
0.07109341770410538,
-0.0731114000082016,
0.011434447020292282,
-0.04786713793873787,
0.004583416972309351,
-0.10872979462146759,
-0.075033999979496,
-0.05095168203115463,
-0.02620338648557663,
-0.07560182362794876,
0.11837413907051086,
-0.14048264920711517,
0.04562119022011757,
0.23569980263710022,
0.054129697382450104,
0.05613969266414642,
-0.03570348396897316,
0.17727094888687134,
-0.12667353451251984,
0.004186038393527269,
0.23438239097595215,
-0.044986527413129807,
0.046907491981983185,
0.1654556542634964,
-0.012644470669329166,
-0.07570429891347885,
0.041773635894060135,
-0.03295281529426575,
-0.08152813464403152,
-0.2296205759048462,
-0.10326096415519714,
-0.12429995834827423,
0.12830451130867004,
-0.004179736599326134,
0.04940716549754143,
0.15001963078975677,
0.08510575443506241,
-0.030326198786497116,
-0.020631104707717896,
0.0455017127096653,
0.08960295468568802,
0.22447095811367035,
-0.062223587185144424,
0.14636385440826416,
-0.009954736568033695,
-0.15253235399723053,
0.07702276110649109,
0.060026705265045166,
0.07088149338960648,
0.045990318059921265,
0.08448856323957443,
0.010346155613660812,
0.04411664977669716,
0.16061913967132568,
0.03405516967177391,
0.014379248023033142,
-0.03898271173238754,
-0.049606990069150925,
-0.0375806987285614,
-0.0369265191257,
0.06182417273521423,
0.030650559812784195,
-0.13203203678131104,
-0.03279425948858261,
-0.029809273779392242,
0.06739751994609833,
0.061009135097265244,
0.09315480291843414,
-0.18303541839122772,
-0.029937835410237312,
0.07764364778995514,
-0.038285769522190094,
-0.12269356846809387,
0.09172773361206055,
0.01204919908195734,
-0.15046219527721405,
0.05406271666288376,
-0.023471370339393616,
0.11482150852680206,
-0.05963313207030296,
0.0894307792186737,
-0.08279474079608917,
-0.08937511593103409,
-0.004309476353228092,
0.10255865007638931,
-0.3096167743206024,
0.18557390570640564,
-0.011821872554719448,
-0.05261820927262306,
-0.08406276255846024,
0.010075812228024006,
0.023823939263820648,
0.12506906688213348,
0.10820508748292923,
-0.021978596225380898,
0.040954917669296265,
-0.007984665222465992,
-0.04669150710105896,
0.03009173274040222,
0.083735890686512,
-0.022622985765337944,
-0.00916165579110384,
-0.04468459263443947,
0.0008458362426608801,
-0.022719731554389,
-0.07816524058580399,
-0.005295256618410349,
-0.19299955666065216,
0.08629575371742249,
0.0800509974360466,
0.10498785227537155,
0.03337421640753746,
-0.025750182569026947,
-0.07522318512201309,
0.27066996693611145,
-0.003359121037647128,
-0.10694821923971176,
-0.10539275407791138,
-0.010889677330851555,
0.05805909261107445,
-0.060761090368032455,
0.037928272038698196,
-0.06142323091626167,
0.03136397525668144,
-0.027295896783471107,
-0.15223780274391174,
0.10222595185041428,
-0.07997751981019974,
-0.04112233221530914,
-0.026605870574712753,
0.18183059990406036,
-0.03686272352933884,
0.010124551132321358,
0.038385454565286636,
-0.0007232706411741674,
-0.08383350074291229,
-0.07796131074428558,
-0.023676959797739983,
0.052216436713933945,
0.02796427346765995,
0.04313625022768974,
-0.023985236883163452,
-0.07890535145998001,
-0.07274448126554489,
-0.05418599024415016,
0.29991814494132996,
0.11615362763404846,
-0.033594097942113876,
0.1867629438638687,
0.13315485417842865,
-0.06256721168756485,
-0.2642981708049774,
-0.11484654992818832,
-0.06022338941693306,
-0.011365039274096489,
-0.05262000113725662,
-0.17810724675655365,
0.07785513997077942,
-0.04474571347236633,
-0.02859884686768055,
0.04735434427857399,
-0.27356597781181335,
-0.10526372492313385,
0.1477373242378235,
-0.012173733673989773,
0.3757399320602417,
-0.1223902627825737,
-0.08358810842037201,
-0.0620410293340683,
-0.1663483828306198,
0.1296921670436859,
0.02494349144399166,
0.12018869817256927,
-0.012135929428040981,
0.18576431274414062,
0.05734938010573387,
-0.007734944578260183,
0.09506871551275253,
-0.007873110473155975,
-0.06734847277402878,
-0.10892699658870697,
-0.06828224658966064,
-0.004140655044466257,
0.03208066523075104,
0.047342702746391296,
-0.048786286264657974,
0.01960371620953083,
-0.12198910117149353,
-0.07616821676492691,
-0.05538928881287575,
0.029951922595500946,
0.04118780791759491,
-0.0872274786233902,
0.005227487068623304,
-0.02940376289188862,
-0.012113523669540882,
0.017588894814252853,
0.13769450783729553,
-0.12167663872241974,
0.12565451860427856,
0.09332355856895447,
0.1365748792886734,
-0.0976482406258583,
0.017082471400499344,
-0.06887205690145493,
-0.05699707940220833,
0.07193776965141296,
-0.0977422297000885,
0.029956772923469543,
0.08542360365390778,
-0.04046245664358139,
0.08777686953544617,
0.09037777781486511,
-0.0008694328716956079,
0.039515167474746704,
0.08993053436279297,
-0.2210894674062729,
-0.07377807796001434,
-0.07771110534667969,
0.019632240757346153,
0.0808904767036438,
0.1069648489356041,
0.20890158414840698,
-0.02723170444369316,
-0.03398147225379944,
0.014984142035245895,
0.0304188784211874,
-0.036092035472393036,
0.08026710152626038,
-0.015455967746675014,
0.011959919705986977,
-0.1466035097837448,
0.053071070462465286,
0.02269093506038189,
-0.09298564493656158,
0.046704184263944626,
0.1622888594865799,
-0.11260247975587845,
-0.11633102595806122,
-0.09244685620069504,
0.09073828905820847,
-0.11279729753732681,
-0.033469103276729584,
-0.06779620796442032,
-0.1362210065126419,
0.0630846843123436,
0.08703886717557907,
0.05488205701112747,
0.06352157145738602,
-0.09594085812568665,
-0.013219335116446018,
-0.028229501098394394,
0.0060243019834160805,
0.05017993599176407,
-0.00832225102931261,
-0.08356975018978119,
0.05480857193470001,
-0.021395161747932434,
0.1327543705701828,
-0.09248124808073044,
-0.1100289449095726,
-0.1465788334608078,
0.04633527994155884,
-0.11663398891687393,
-0.0683368667960167,
-0.08819134533405304,
-0.062266238033771515,
-0.014930016361176968,
-0.04340945929288864,
-0.042175356298685074,
-0.03623103350400925,
-0.1044551357626915,
0.04596075043082237,
-0.04949600622057915,
0.013452988117933273,
-0.057802051305770874,
0.02553483471274376,
0.061141300946474075,
-0.023649850860238075,
0.15666192770004272,
0.142090305685997,
-0.1213076189160347,
0.08939436078071594,
-0.13496404886245728,
-0.05848916620016098,
0.09415590763092041,
0.017514759674668312,
0.046090226620435715,
0.06931755691766739,
0.007369108498096466,
0.05920279398560524,
0.04540710896253586,
0.034180622547864914,
0.06795939803123474,
-0.09269699454307556,
0.03581032529473305,
-0.034019872546195984,
-0.1205059140920639,
-0.05555528774857521,
-0.022202609106898308,
0.009410049766302109,
0.05158711224794388,
0.0770585834980011,
-0.057187698781490326,
0.09339338541030884,
-0.04960932582616806,
0.035750050097703934,
0.005233672447502613,
-0.14933736622333527,
-0.04844100400805473,
-0.08671934902667999,
0.043258484452962875,
0.002246923977509141,
0.1819533109664917,
0.03964317962527275,
-0.004827064927667379,
0.02564043179154396,
0.0312841460108757,
0.09516461193561554,
0.008763880468904972,
0.187296524643898,
0.11837659776210785,
-0.04847294092178345,
-0.06537475436925888,
0.07803808152675629,
0.01231584046036005,
0.060851581394672394,
0.07628834247589111,
-0.014270386658608913,
-0.04902493581175804,
0.08926861733198166,
-0.0316789373755455,
0.05664591118693352,
-0.11402633041143417,
-0.1347879320383072,
-0.07616058737039566,
0.04327382519841194,
-0.055061209946870804,
0.19453541934490204,
0.17913834750652313,
-0.006995166186243296,
0.0378907285630703,
-0.028746500611305237,
-0.07849523425102234,
-0.1877220869064331,
-0.20785531401634216,
-0.08516757935285568,
-0.13845297694206238,
0.010510895401239395,
-0.131507009267807,
0.055931948125362396,
0.07768642902374268,
0.08856619894504547,
-0.06815069168806076,
0.10249146819114685,
0.07697755843400955,
-0.09573107212781906,
0.07565842568874359,
-0.03148474916815758,
0.08978269249200821,
-0.011195252649486065,
-0.0065995254553854465,
-0.07734863460063934,
0.06544262915849686,
0.0022064978256821632,
0.043904103338718414,
-0.060252897441387177,
-0.0019806751515716314,
-0.09392470866441727,
-0.07757556438446045,
-0.056926947087049484,
0.05712747201323509,
0.006881718523800373,
0.13035044074058533,
0.01386997476220131,
-0.03215445205569267,
0.024541664868593216,
0.2773635983467102,
-0.07444801926612854,
-0.10093505680561066,
-0.08887405693531036,
0.20841583609580994,
0.022845063358545303,
0.09302378445863724,
-0.028164992108941078,
-0.007602682337164879,
-0.08401257544755936,
0.3546716570854187,
0.28286561369895935,
-0.1176091656088829,
0.0008052923367358744,
0.00275834696367383,
0.042940400540828705,
0.0921991616487503,
0.1262633204460144,
0.09348876774311066,
0.29635313153266907,
-0.05624886602163315,
-0.011196007952094078,
-0.01837354153394699,
-0.04983067512512207,
-0.09507995843887329,
0.08521376550197601,
0.049043599516153336,
-0.05484886094927788,
-0.02933478355407715,
0.11572904884815216,
-0.24729600548744202,
0.09715190529823303,
-0.18829986453056335,
-0.1750880628824234,
-0.09569097310304642,
-0.007530409377068281,
0.09744883328676224,
0.03669511526823044,
0.07189614325761795,
-0.015195947140455246,
-0.051706790924072266,
0.06127836927771568,
0.04057583212852478,
-0.16575482487678528,
0.0061542014591395855,
0.09231461584568024,
-0.091042160987854,
-0.03320780396461487,
-0.024324992671608925,
0.039666540920734406,
0.06684267520904541,
0.055795345455408096,
-0.006289859768003225,
0.03582693636417389,
0.00027493256493471563,
-0.05071062594652176,
0.04414432495832443,
0.07709326595067978,
0.017651308327913284,
-0.08534421771764755,
0.05568121373653412,
-0.15720441937446594,
0.0505107156932354,
0.008089756593108177,
-0.031631775200366974,
-0.020419619977474213,
0.04748040810227394,
-0.07022839784622192,
0.07640682905912399,
0.09610798209905624,
-0.02169966697692871,
-0.020046338438987732,
-0.042089302092790604,
0.0022599853109568357,
-0.03986174613237381,
-0.10905148088932037,
-0.082790806889534,
-0.17905497550964355,
-0.12600409984588623,
0.08628351986408234,
0.006096112076193094,
-0.18306514620780945,
0.0157907847315073,
-0.13024748861789703,
0.07039858400821686,
-0.1412716805934906,
0.10184203088283539,
0.08223457634449005,
0.012235118076205254,
0.007937838323414326,
-0.007087912876158953,
0.05295511707663536,
0.0949840098619461,
-0.13329282402992249,
-0.09243553131818771
] |
null | null | null | RICH双子 | {} | null | RICH/rui-test | [
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#region-us
| RICH双子 | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] | [
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null | null | this is a test by rui | {} | null | RICH/test | [
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#region-us
| this is a test by rui | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] | [
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null | transformers | Try the test sentence:
<i>The woman said "my name is Sarah [and] I live in London."</i>
The model should tag the tokens in the sentence with information about whether or not they are contained within a compound clause. If you find the model useful, please cite my thesis, which presents the dataset used for fine-tuning:
Evans, R. (2020) Sentence Simplification for Text Processing. Doctoral thesis. University of Wolverhampton. Wolverhampton, UK. (http://rgcl.wlv.ac.uk/~richard/Evans2020_SentenceSimplificationForTextProcessing.pdf)
There you will find more information about the tagging scheme.
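A minimal usage sketch, assuming the standard Hugging Face `transformers` token-classification pipeline (the loading pattern and the printed fields below are generic pipeline behaviour, not something specified in this card; the tag labels themselves follow the scheme described in the thesis above):

```python
from transformers import pipeline

# Load the compound-clause span tagger by its repository id.
tagger = pipeline("token-classification", model="RJ3vans/CCVspanTagger")

# The test sentence suggested above (brackets removed for plain input).
sentence = 'The woman said "my name is Sarah and I live in London."'

# Each prediction is a dict holding the sub-word token, its predicted tag,
# and a confidence score; "start"/"end" offsets index into the sentence.
for prediction in tagger(sentence):
    print(prediction["word"], prediction["entity"], round(prediction["score"], 3))
```

The exact label inventory (for example, which tags mark tokens inside a compound clause) is defined by the tagging scheme in the thesis, so inspect the model's label map in its configuration before relying on specific tag names.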
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | {} | token-classification | RJ3vans/CCVspanTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| Try the test sentence:
<i>The woman said "my name is Sarah [and] I live in London."</i>
The model should tag the tokens in the sentence with information about whether or not they are contained within a compound clause. If you find the model useful, please cite my thesis which presents the dataset used for finetuning:
Evans, R. (2020) Sentence Simplification for Text Processing. Doctoral thesis. University of Wolverhampton. Wolverhampton, UK. (URL
There you will find more information about the tagging scheme.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | [] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
null | null | transformers | This model identifies compound nouns in input sentences.
Try the test sentence:
I love apples [and] potatoes.
Accuracy is best when you place square brackets around the coordinating conjunction.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. A hedged usage sketch follows this record. | {} | token-classification | RJ3vans/CLNspanTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| This model identifies compound nouns in input sentences.
Try the test sentence:
I love apples [and] potatoes.
Accuracy is best when you place square brackets around the coordinating conjunction.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | [] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
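The card text in the record above names the checkpoint (RJ3vans/CLNspanTagger), tags it as a transformers token-classification model, and gives a bracketed test sentence, but no loading snippet. A minimal usage sketch, assuming the standard transformers pipeline API, might look like the following; the aggregation strategy, the printed fields, and the rounding are illustrative assumptions rather than part of the original card.

```python
from transformers import pipeline

# Hedged sketch: load the compound-noun span tagger as a token-classification
# pipeline. "simple" aggregation merges sub-word pieces into whole-word spans;
# the card does not specify an aggregation strategy, so this is an assumption.
tagger = pipeline(
    "token-classification",
    model="RJ3vans/CLNspanTagger",
    aggregation_strategy="simple",
)

# Square brackets around the coordinating conjunction, as the card advises.
sentence = "I love apples [and] potatoes."

for span in tagger(sentence):
    # Keys follow the transformers aggregated-output format; the label set
    # emitted by this particular model is not documented in the card.
    print(span["entity_group"], span["word"], round(float(span["score"]), 3))
```

If whole-word grouping is not wanted, dropping `aggregation_strategy` returns one prediction per sub-word token instead.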
null | null | transformers | This model identifies compound noun phrases in an input sentence.
Try the test sentence:
The inquiry, which continues, will recall John Smith [and] Peter Montgomery next month for further questioning.
Note that you need square brackets around the conjunction coordinating the NPs.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. A hedged usage sketch follows this record. | {} | token-classification | RJ3vans/CMN1spanTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| This model identifies compound noun phrases in an input sentence.
Try the test sentence:
The inquiry, which continues, will recall John Smith [and] Peter Montgomery next month for further questioning.
Note that you need square brackets around the conjunction coordinating the NPs.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | [] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
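The same pattern presumably applies to the compound-noun-phrase tagger described in the record above (RJ3vans/CMN1spanTagger); the sketch below loads the tokenizer and model explicitly and prints raw per-token predictions. The printed keys and the handling of the test sentence are assumptions, since the card only supplies the sentence itself.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Hedged sketch for the CMN1 span tagger: explicit tokenizer/model loading,
# then raw (un-aggregated) per-token output. Label names are whatever the
# checkpoint's config defines; the card does not list them.
model_id = "RJ3vans/CMN1spanTagger"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)
tagger = pipeline("token-classification", model=model, tokenizer=tokenizer)

# The card's test sentence, with square brackets around the conjunction
# that coordinates the two noun phrases.
sentence = (
    "The inquiry, which continues, will recall John Smith [and] "
    "Peter Montgomery next month for further questioning."
)

for token in tagger(sentence):
    print(token["word"], token["entity"], round(float(token["score"]), 3))
```

The compound-verb-phrase tagger in the next record (RJ3vans/CMV1spanTagger) carries the same transformers/token-classification tags, so it can presumably be loaded the same way, swapping in its bracketed test sentence.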
null | null | transformers | This model identifies compound verb phrases (including conjoins and coordinators) in an input sentence.
Try the test sentence:
John kicked the ball [and] chased after it.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | {} | token-classification | RJ3vans/CMV1spanTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| This model identifies compound verb phrases (including conjoins and coordinators) in an input sentence.
Try the test sentence:
John kicked the ball [and] chased after it.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | [] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
null | null | transformers | Try the test sentences:
<i>My name is Sarah and I live in London[, which] is the largest city in the UK.</i>
<i>John thought that that was a strange idea.</i>
<i>It was on Tuesdays when Peter took Tess for a walk.</i>
<i>John was so large that he had to crouch to fit through the front door.</i>
The model should tag the tokens in the sentence with information about whether or not they are contained within particular types of syntactic constituents.
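A minimal usage sketch (this assumes the standard transformers token-classification pipeline; the exact tag names come from the model's id2label config and are not listed here):
```python
from transformers import pipeline

# Load the tagger as a token-classification pipeline (assumed standard usage).
tagger = pipeline("token-classification", model="RJ3vans/13.05.2022.SSCCVspanTagger")

sentence = "My name is Sarah and I live in London[, which] is the largest city in the UK."
# Each prediction pairs a wordpiece token with the constituent tag the model assigns it.
for prediction in tagger(sentence):
    print(prediction["word"], prediction["entity"])
```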
If you find the model useful, please cite my thesis which presents the dataset used for finetuning:
Evans, R. (2020) Sentence Simplification for Text Processing. Doctoral thesis. University of Wolverhampton. Wolverhampton, UK. (http://rgcl.wlv.ac.uk/~richard/Evans2020_SentenceSimplificationForTextProcessing.pdf)
There you will find more information about the tagging scheme. | {} | token-classification | RJ3vans/13.05.2022.SSCCVspanTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| Try the test sentences:
<i>My name is Sarah and I live in London[, which] is the largest city in the UK.</i>
<i>John thought that that was a strange idea.</i>
<i>It was on Tuesdays when Peter took Tess for a walk.</i>
<i>John was so large that he had to crouch to fit through the front door.</i>
The model should tag the tokens in the sentence with information about whether or not they are contained within particular types of syntactic constituents.
If you find the model useful, please cite my thesis which presents the dataset used for finetuning:
Evans, R. (2020) Sentence Simplification for Text Processing. Doctoral thesis. University of Wolverhampton. Wolverhampton, UK. (URL
There you will find more information about the tagging scheme. | [] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
null | null | transformers | This model identifies complex NPs modified by non-finite nominal clauses ("appositives") in the input sentence.
Try the test sentence:
My name is Sarah and I live in London[,] the capital of England.
Note that accuracy is greatly improved if you place square brackets around the left boundary of the non-finite nominal clause.
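A minimal usage sketch (assumed standard transformers pipeline usage; the tag names depend on the model's config):
```python
from transformers import pipeline

# Load the tagger as a token-classification pipeline (assumed standard usage).
tagger = pipeline("token-classification", model="RJ3vans/SSMNspanTagger")

# The square bracket marks the left boundary of the non-finite nominal clause, as noted above.
sentence = "My name is Sarah and I live in London[,] the capital of England."
for prediction in tagger(sentence):
    print(prediction["word"], prediction["entity"])
```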
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | {} | token-classification | RJ3vans/SSMNspanTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| This model identifies complex NPs modified by non-finite nominal clauses ("appositives") in the input sentence.
Try the test sentence:
My name is Sarah and I live in London[,] the capital of England.
Note that accuracy is greatly improved if you place square brackets around the left boundary of the non-finite nominal clause.
The model was derived using code adapted from an original program written by Dr. Le An Ha at the University of Wolverhampton. | [] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
null | null | transformers | This model is used to tag the tokens in an input sequence with information about the different signs of syntactic complexity that they contain. For more details, please see Chapters 2 and 3 of my thesis (http://rgcl.wlv.ac.uk/~richard/Evans2020_SentenceSimplificationForTextProcessing.pdf).
It was derived using code written by Dr. Le An Ha at the University of Wolverhampton.
To use this model, the following code snippet may help:
======================================================================
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
# Load the fine-tuned sign tagger and its tokenizer from the Hugging Face Hub
SignTaggingModel = AutoModelForTokenClassification.from_pretrained('RJ3vans/SignTagger')
SignTaggingTokenizer = AutoTokenizer.from_pretrained('RJ3vans/SignTagger')
label_list = ["M:N_CCV", "M:N_CIN", "M:N_CLA", "M:N_CLAdv", "M:N_CLN", "M:N_CLP", # This could be obtained from the config file
"M:N_CLQ", "M:N_CLV", "M:N_CMA1", "M:N_CMAdv", "M:N_CMN1",
"M:N_CMN2", "M:N_CMN3", "M:N_CMN4", "M:N_CMP", "M:N_CMP2",
"M:N_CMV1", "M:N_CMV2", "M:N_CMV3", "M:N_COMBINATORY", "M:N_CPA",
"M:N_ESAdvP", "M:N_ESCCV", "M:N_ESCM", "M:N_ESMA", "M:N_ESMAdvP",
"M:N_ESMI", "M:N_ESMN", "M:N_ESMP", "M:N_ESMV", "M:N_HELP",
"M:N_SPECIAL", "M:N_SSCCV", "M:N_SSCM", "M:N_SSMA", "M:N_SSMAdvP",
"M:N_SSMI", "M:N_SSMN", "M:N_SSMP", "M:N_SSMV", "M:N_STQ",
"M:N_V", "M:N_nan", "M:Y_CCV", "M:Y_CIN", "M:Y_CLA", "M:Y_CLAdv",
"M:Y_CLN", "M:Y_CLP", "M:Y_CLQ", "M:Y_CLV", "M:Y_CMA1",
"M:Y_CMAdv", "M:Y_CMN1", "M:Y_CMN2", "M:Y_CMN4", "M:Y_CMP",
"M:Y_CMP2", "M:Y_CMV1", "M:Y_CMV2", "M:Y_CMV3",
"M:Y_COMBINATORY", "M:Y_CPA", "M:Y_ESAdvP", "M:Y_ESCCV",
"M:Y_ESCM", "M:Y_ESMA", "M:Y_ESMAdvP", "M:Y_ESMI", "M:Y_ESMN",
"M:Y_ESMP", "M:Y_ESMV", "M:Y_HELP", "M:Y_SPECIAL", "M:Y_SSCCV",
"M:Y_SSCM", "M:Y_SSMA", "M:Y_SSMAdvP", "M:Y_SSMI", "M:Y_SSMN",
"M:Y_SSMP", "M:Y_SSMV", "M:Y_STQ"]
sentence = 'The County Court in Nottingham heard that Roger Gedge, 30, had his leg amputated following the incident outside a rock festival in Wollaton Park, Nottingham, five years ago.'
# Recover the wordpiece tokens (including special tokens) so they align one-to-one with the predictions
tokens = SignTaggingTokenizer.tokenize(SignTaggingTokenizer.decode(SignTaggingTokenizer.encode(sentence)))
inputs = SignTaggingTokenizer.encode(sentence, return_tensors="pt")
# Per-token logits -> highest-scoring label index for each token
outputs = SignTaggingModel(inputs)[0]
predictions = torch.argmax(outputs, dim=2)
# Pair each token with its predicted sign-of-complexity label
print([(token, label_list[prediction]) for token, prediction in zip(tokens, predictions[0].tolist())])
======================================================================
| {} | token-classification | RJ3vans/SignTagger | [
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us
| This model is used to tag the tokens in an input sequence with information about the different signs of syntactic complexity that they contain. For more details, please see Chapters 2 and 3 of my thesis (URL
It was derived using code written by Dr. Le An Ha at the University of Wolverhampton.
To use this model, the following code snippet may help:
======================================================================
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
SignTaggingModel = AutoModelForTokenClassification.from_pretrained('RJ3vans/SignTagger')
SignTaggingTokenizer = AutoTokenizer.from_pretrained('RJ3vans/SignTagger')
label_list = ["M:N_CCV", "M:N_CIN", "M:N_CLA", "M:N_CLAdv", "M:N_CLN", "M:N_CLP", # This could be obtained from the config file
"M:N_CLQ", "M:N_CLV", "M:N_CMA1", "M:N_CMAdv", "M:N_CMN1",
"M:N_CMN2", "M:N_CMN3", "M:N_CMN4", "M:N_CMP", "M:N_CMP2",
"M:N_CMV1", "M:N_CMV2", "M:N_CMV3", "M:N_COMBINATORY", "M:N_CPA",
"M:N_ESAdvP", "M:N_ESCCV", "M:N_ESCM", "M:N_ESMA", "M:N_ESMAdvP",
"M:N_ESMI", "M:N_ESMN", "M:N_ESMP", "M:N_ESMV", "M:N_HELP",
"M:N_SPECIAL", "M:N_SSCCV", "M:N_SSCM", "M:N_SSMA", "M:N_SSMAdvP",
"M:N_SSMI", "M:N_SSMN", "M:N_SSMP", "M:N_SSMV", "M:N_STQ",
"M:N_V", "M:N_nan", "M:Y_CCV", "M:Y_CIN", "M:Y_CLA", "M:Y_CLAdv",
"M:Y_CLN", "M:Y_CLP", "M:Y_CLQ", "M:Y_CLV", "M:Y_CMA1",
"M:Y_CMAdv", "M:Y_CMN1", "M:Y_CMN2", "M:Y_CMN4", "M:Y_CMP",
"M:Y_CMP2", "M:Y_CMV1", "M:Y_CMV2", "M:Y_CMV3",
"M:Y_COMBINATORY", "M:Y_CPA", "M:Y_ESAdvP", "M:Y_ESCCV",
"M:Y_ESCM", "M:Y_ESMA", "M:Y_ESMAdvP", "M:Y_ESMI", "M:Y_ESMN",
"M:Y_ESMP", "M:Y_ESMV", "M:Y_HELP", "M:Y_SPECIAL", "M:Y_SSCCV",
"M:Y_SSCM", "M:Y_SSMA", "M:Y_SSMAdvP", "M:Y_SSMI", "M:Y_SSMN",
"M:Y_SSMP", "M:Y_SSMV", "M:Y_STQ"]
sentence = 'The County Court in Nottingham heard that Roger Gedge, 30, had his leg amputated following the incident outside a rock festival in Wollaton Park, Nottingham, five years ago.'
tokens = SignTaggingTokenizer.tokenize(URL(URL(sentence)))
inputs = URL(sentence, return_tensors="pt")
outputs = SignTaggingModel(inputs)[0]
predictions = URL(outputs, dim=2)
print([(token, label_list[prediction]) for token, prediction in zip(tokens, predictions[0].tolist())])
======================================================================
| [
"# This could be obtained from the config file\n \"M:N_CLQ\", \"M:N_CLV\", \"M:N_CMA1\", \"M:N_CMAdv\", \"M:N_CMN1\", \n \"M:N_CMN2\", \"M:N_CMN3\", \"M:N_CMN4\", \"M:N_CMP\", \"M:N_CMP2\", \n \"M:N_CMV1\", \"M:N_CMV2\", \"M:N_CMV3\", \"M:N_COMBINATORY\", \"M:N_CPA\", \n \"M:N_ESAdvP\", \"M:N_ESCCV\", \"M:N_ESCM\", \"M:N_ESMA\", \"M:N_ESMAdvP\", \n \"M:N_ESMI\", \"M:N_ESMN\", \"M:N_ESMP\", \"M:N_ESMV\", \"M:N_HELP\", \n \"M:N_SPECIAL\", \"M:N_SSCCV\", \"M:N_SSCM\", \"M:N_SSMA\", \"M:N_SSMAdvP\",\n \"M:N_SSMI\", \"M:N_SSMN\", \"M:N_SSMP\", \"M:N_SSMV\", \"M:N_STQ\", \n \"M:N_V\", \"M:N_nan\", \"M:Y_CCV\", \"M:Y_CIN\", \"M:Y_CLA\", \"M:Y_CLAdv\", \n \"M:Y_CLN\", \"M:Y_CLP\", \"M:Y_CLQ\", \"M:Y_CLV\", \"M:Y_CMA1\", \n \"M:Y_CMAdv\", \"M:Y_CMN1\", \"M:Y_CMN2\", \"M:Y_CMN4\", \"M:Y_CMP\", \n \"M:Y_CMP2\", \"M:Y_CMV1\", \"M:Y_CMV2\", \"M:Y_CMV3\", \n \"M:Y_COMBINATORY\", \"M:Y_CPA\", \"M:Y_ESAdvP\", \"M:Y_ESCCV\", \n \"M:Y_ESCM\", \"M:Y_ESMA\", \"M:Y_ESMAdvP\", \"M:Y_ESMI\", \"M:Y_ESMN\", \n \"M:Y_ESMP\", \"M:Y_ESMV\", \"M:Y_HELP\", \"M:Y_SPECIAL\", \"M:Y_SSCCV\", \n \"M:Y_SSCM\", \"M:Y_SSMA\", \"M:Y_SSMAdvP\", \"M:Y_SSMI\", \"M:Y_SSMN\", \n \"M:Y_SSMP\", \"M:Y_SSMV\", \"M:Y_STQ\"]\n \nsentence = 'The County Court in Nottingham heard that Roger Gedge, 30, had his leg amputated following the incident outside a rock festival in Wollaton Park, Nottingham, five years ago.'\n\ntokens = SignTaggingTokenizer.tokenize(URL(URL(sentence)))\ninputs = URL(sentence, return_tensors=\"pt\")\n\noutputs = SignTaggingModel(inputs)[0]\npredictions = URL(outputs, dim=2)\n\nprint([(token, label_list[prediction]) for token, prediction in zip(tokens, predictions[0].tolist())]) \n\n \n======================================================================"
] | [
"TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n",
"# This could be obtained from the config file\n \"M:N_CLQ\", \"M:N_CLV\", \"M:N_CMA1\", \"M:N_CMAdv\", \"M:N_CMN1\", \n \"M:N_CMN2\", \"M:N_CMN3\", \"M:N_CMN4\", \"M:N_CMP\", \"M:N_CMP2\", \n \"M:N_CMV1\", \"M:N_CMV2\", \"M:N_CMV3\", \"M:N_COMBINATORY\", \"M:N_CPA\", \n \"M:N_ESAdvP\", \"M:N_ESCCV\", \"M:N_ESCM\", \"M:N_ESMA\", \"M:N_ESMAdvP\", \n \"M:N_ESMI\", \"M:N_ESMN\", \"M:N_ESMP\", \"M:N_ESMV\", \"M:N_HELP\", \n \"M:N_SPECIAL\", \"M:N_SSCCV\", \"M:N_SSCM\", \"M:N_SSMA\", \"M:N_SSMAdvP\",\n \"M:N_SSMI\", \"M:N_SSMN\", \"M:N_SSMP\", \"M:N_SSMV\", \"M:N_STQ\", \n \"M:N_V\", \"M:N_nan\", \"M:Y_CCV\", \"M:Y_CIN\", \"M:Y_CLA\", \"M:Y_CLAdv\", \n \"M:Y_CLN\", \"M:Y_CLP\", \"M:Y_CLQ\", \"M:Y_CLV\", \"M:Y_CMA1\", \n \"M:Y_CMAdv\", \"M:Y_CMN1\", \"M:Y_CMN2\", \"M:Y_CMN4\", \"M:Y_CMP\", \n \"M:Y_CMP2\", \"M:Y_CMV1\", \"M:Y_CMV2\", \"M:Y_CMV3\", \n \"M:Y_COMBINATORY\", \"M:Y_CPA\", \"M:Y_ESAdvP\", \"M:Y_ESCCV\", \n \"M:Y_ESCM\", \"M:Y_ESMA\", \"M:Y_ESMAdvP\", \"M:Y_ESMI\", \"M:Y_ESMN\", \n \"M:Y_ESMP\", \"M:Y_ESMV\", \"M:Y_HELP\", \"M:Y_SPECIAL\", \"M:Y_SSCCV\", \n \"M:Y_SSCM\", \"M:Y_SSMA\", \"M:Y_SSMAdvP\", \"M:Y_SSMI\", \"M:Y_SSMN\", \n \"M:Y_SSMP\", \"M:Y_SSMV\", \"M:Y_STQ\"]\n \nsentence = 'The County Court in Nottingham heard that Roger Gedge, 30, had his leg amputated following the incident outside a rock festival in Wollaton Park, Nottingham, five years ago.'\n\ntokens = SignTaggingTokenizer.tokenize(URL(URL(sentence)))\ninputs = URL(sentence, return_tensors=\"pt\")\n\noutputs = SignTaggingModel(inputs)[0]\npredictions = URL(outputs, dim=2)\n\nprint([(token, label_list[prediction]) for token, prediction in zip(tokens, predictions[0].tolist())]) \n\n \n======================================================================"
] | [
37,
823
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04952388256788254,
0.052763525396585464,
-0.008742042817175388,
0.033980391919612885,
0.16650345921516418,
0.031232766807079315,
0.056794650852680206,
0.08634597808122635,
0.05724777653813362,
-0.022096728906035423,
0.12041265517473221,
0.25661665201187134,
-0.04172574356198311,
0.09726614505052567,
-0.09425565600395203,
-0.2964369058609009,
0.072568878531456,
0.08518822491168976,
-0.03685073181986809,
0.10874255001544952,
0.0826217457652092,
-0.09953062981367111,
0.07087896764278412,
-0.021146323531866074,
-0.14565661549568176,
0.03782055899500847,
0.04566175118088722,
-0.12290510535240173,
0.10927622765302658,
0.024675650522112846,
0.195825457572937,
0.0250800009816885,
-0.04914003983139992,
-0.12460087239742279,
0.02292012982070446,
0.012434997595846653,
-0.0567244216799736,
0.05464257672429085,
0.07972842454910278,
-0.09245497733354568,
-0.02643316052854061,
0.0736164003610611,
0.040207646787166595,
0.040536463260650635,
-0.12785013020038605,
-0.130097895860672,
-0.01677129417657852,
0.04410306364297867,
0.06830945611000061,
0.03920266404747963,
0.028024563565850258,
0.19659002125263214,
-0.1535772681236267,
0.1040927842259407,
0.1168038472533226,
-0.2951667606830597,
-0.0032763986382633448,
0.12271767109632492,
0.02307300455868244,
0.002292903373017907,
-0.046535883098840714,
0.026178892701864243,
0.025552578270435333,
0.011873415671288967,
0.027952805161476135,
-0.08463964611291885,
-0.08450421690940857,
0.03911300376057625,
-0.10113436728715897,
-0.025793785229325294,
0.17653688788414001,
-0.04118471220135689,
0.051768165081739426,
0.011256481520831585,
-0.10429224371910095,
-0.06710886210203171,
-0.016446024179458618,
-0.016140753403306007,
-0.029147058725357056,
0.05014093220233917,
0.020645350217819214,
0.04166753962635994,
-0.09745467454195023,
0.026441968977451324,
-0.21823887526988983,
0.2394854873418808,
0.026930933818221092,
0.06543339043855667,
-0.1759624481201172,
0.06742089986801147,
0.002468109829351306,
-0.080185666680336,
0.03979361802339554,
-0.10338287800550461,
0.000461930176243186,
-0.05314353480935097,
-0.03661246970295906,
0.03140731528401375,
0.06317161023616791,
0.17395952343940735,
0.0793117806315422,
0.05008377507328987,
-0.005898697301745415,
0.07871981710195541,
0.034834783524274826,
0.12483206391334534,
0.004537689033895731,
-0.03222474083304405,
0.04727894067764282,
-0.12738829851150513,
-0.022797152400016785,
-0.057654689997434616,
-0.13722378015518188,
-0.035683806985616684,
0.08572584390640259,
0.08669279515743256,
-0.002899006474763155,
0.07131802290678024,
-0.07259770482778549,
-0.04170111194252968,
0.0977427288889885,
-0.06708423793315887,
0.03773060068488121,
0.011165250092744827,
0.01788998953998089,
0.10969475656747818,
-0.03837720677256584,
0.006092004477977753,
-0.038544073700904846,
0.16035914421081543,
-0.06788671761751175,
-0.0086200051009655,
-0.03894633799791336,
-0.07913154363632202,
0.03580477461218834,
-0.1462087333202362,
0.05359502136707306,
-0.16970540583133698,
-0.07705289125442505,
0.03400159999728203,
0.03982843458652496,
0.006068000569939613,
-0.023860258981585503,
0.004609986208379269,
0.0009630053536966443,
0.010919814929366112,
-0.06532157212495804,
-0.059639010578393936,
-0.06180674210190773,
0.0789342075586319,
-0.026356957852840424,
0.05199717730283737,
-0.09772148728370667,
0.06380254030227661,
-0.10777808725833893,
0.014992175623774529,
-0.1314254105091095,
-0.01425449550151825,
-0.07122194766998291,
0.1727602779865265,
-0.012890664860606194,
-0.0652521625161171,
-0.042813826352357864,
0.026928367093205452,
-0.05078446492552757,
0.1189238503575325,
-0.05899157002568245,
-0.11599802225828171,
0.16220137476921082,
-0.10287307947874069,
-0.12483754754066467,
0.07581168413162231,
-0.01511769462376833,
-0.007789896801114082,
0.06788471341133118,
0.1409188061952591,
0.11342629045248032,
-0.021826764568686485,
0.08137305825948715,
0.10294424742460251,
-0.1294003278017044,
-0.13587328791618347,
0.006361662410199642,
0.00798719096928835,
-0.14556561410427094,
0.052592433989048004,
0.05236833915114403,
0.0726967453956604,
-0.07074553519487381,
-0.03274666517972946,
-0.01327072735875845,
-0.016134168952703476,
0.11864199489355087,
0.06209159642457962,
0.11604267358779907,
-0.0657677873969078,
0.00820392556488514,
0.047656312584877014,
0.004204337950795889,
0.036543652415275574,
0.015800083056092262,
-0.08706195652484894,
0.12248360365629196,
-0.04112236201763153,
-0.0011841603554785252,
-0.1986646205186844,
-0.09740031510591507,
0.023004116490483284,
0.07795482873916626,
-0.03679869696497917,
0.13610167801380157,
0.0701465755701065,
-0.04764446243643761,
0.00010037058382295072,
-0.03370752930641174,
0.17777207493782043,
0.04204830154776573,
-0.07011453807353973,
-0.07916951924562454,
0.018525518476963043,
-0.07664067298173904,
-0.030626622959971428,
-0.05736034736037254,
0.004993060603737831,
0.09117806702852249,
0.15110987424850464,
0.013702728785574436,
0.07227712869644165,
-0.04311654344201088,
0.06486255675554276,
-0.07453466951847076,
0.009454507380723953,
0.10986973345279694,
-0.00844109058380127,
-0.05139068886637688,
0.12689460813999176,
-0.1288842260837555,
0.3503526449203491,
0.18760092556476593,
-0.3039218783378601,
-0.0014054732164368033,
-0.04166017845273018,
-0.015616725198924541,
0.008022678084671497,
0.052066002041101456,
0.02243027463555336,
0.055117130279541016,
0.004745377227663994,
0.16756683588027954,
-0.015916254371404648,
-0.05048782005906105,
0.01112330798059702,
-0.05658484622836113,
-0.04292769730091095,
0.0737248957157135,
0.08670081943273544,
-0.20279163122177124,
0.1850307285785675,
0.22442755103111267,
0.0013078266056254506,
0.10093411803245544,
-0.009863444603979588,
0.027319302782416344,
0.03913215547800064,
-0.0403471440076828,
-0.01890004798769951,
-0.015322047285735607,
-0.1852157860994339,
-0.04855918884277344,
0.07947539538145065,
0.031089356169104576,
0.044881563633680344,
-0.12700966000556946,
-0.023778779432177544,
0.019753707572817802,
0.05750936269760132,
-0.006435907445847988,
0.08618827164173126,
0.05749829113483429,
0.08228709548711777,
-0.00042137576383538544,
-0.12340108305215836,
0.11843181401491165,
0.008938729763031006,
-0.06630217283964157,
0.1701919436454773,
-0.13096192479133606,
-0.2919446527957916,
-0.12918199598789215,
-0.21609659492969513,
-0.02221912518143654,
0.04468577727675438,
0.06791006773710251,
-0.09210261702537537,
-0.05621439218521118,
0.07646377384662628,
-0.0043442221358418465,
-0.08893678337335587,
0.06794130802154541,
-0.08020073175430298,
0.05017327144742012,
-0.04635123163461685,
-0.06104189157485962,
-0.06701690703630447,
-0.041732318699359894,
-0.02662169374525547,
0.13805724680423737,
-0.09828965365886688,
0.06120699644088745,
0.17539545893669128,
-0.011075721122324467,
0.06305833160877228,
-0.023022029548883438,
0.17187125980854034,
-0.051036398857831955,
-0.01183983776718378,
0.15369866788387299,
-0.07339499890804291,
0.08695117384195328,
0.16216324269771576,
0.039562564343214035,
-0.05847073718905449,
0.00318810623139143,
-0.03215182200074196,
-0.10039623826742172,
-0.18391156196594238,
-0.12352752685546875,
-0.11139705777168274,
0.03804181143641472,
0.07061707973480225,
0.06962733715772629,
0.135431706905365,
0.09561170637607574,
0.051199477165937424,
0.010959632694721222,
-0.048153605312108994,
0.07466201484203339,
0.2208920270204544,
0.001999937929213047,
0.1435864269733429,
-0.04169393703341484,
-0.1389981210231781,
0.07379768788814545,
0.05560476705431938,
0.13692645728588104,
0.10865066200494766,
-0.01025706622749567,
0.01681477762758732,
0.15626244246959686,
0.18229423463344574,
0.1308237463235855,
0.006019692402333021,
-0.03366916626691818,
-0.00786581914871931,
0.01499095093458891,
-0.05352693051099777,
0.018093694001436234,
0.1126210168004036,
-0.1184353232383728,
-0.05580713599920273,
-0.1544867902994156,
0.06733953207731247,
0.09452182799577713,
0.05705242604017258,
-0.21316885948181152,
0.015583771280944347,
0.07424211502075195,
-0.02289263904094696,
-0.07244562357664108,
0.07719103991985321,
-0.06336886435747147,
-0.14039036631584167,
0.06610569357872009,
-0.05594165250658989,
0.1137172058224678,
-0.07018838077783585,
0.07125633209943771,
0.00436688307672739,
-0.09134659916162491,
0.03966715931892395,
0.09198823571205139,
-0.24242180585861206,
0.2353951781988144,
-0.005842206999659538,
-0.08202855288982391,
-0.07314476370811462,
-0.01116443332284689,
0.040539294481277466,
0.20659157633781433,
0.06959166377782822,
0.015105132944881916,
-0.09668217599391937,
-0.2116534262895584,
-0.01098657213151455,
0.0037440438754856586,
0.10247340053319931,
-0.042362287640571594,
-0.020731588825583458,
-0.04073396697640419,
-0.02784070186316967,
-0.013651788234710693,
-0.025264961645007133,
0.03604467213153839,
-0.12370047718286514,
0.0628918707370758,
0.03108403831720352,
0.0375843271613121,
0.011039070785045624,
-0.053596705198287964,
-0.131315678358078,
0.20352721214294434,
-0.08504047989845276,
-0.057783741503953934,
-0.11823975294828415,
-0.09986942261457443,
0.06580016016960144,
-0.09040012210607529,
0.08111396431922913,
-0.08633793145418167,
0.013077793642878532,
-0.03235287219285965,
-0.1910691112279892,
0.14946654438972473,
-0.11215386539697647,
-0.021799318492412567,
-0.08084782212972641,
0.13647201657295227,
-0.07384097576141357,
0.013447601348161697,
0.013732876628637314,
0.02437661960721016,
-0.08041444420814514,
-0.0828900933265686,
0.006355836056172848,
-0.014282151125371456,
0.031239764764904976,
0.025724230334162712,
-0.06753705441951752,
-0.0018155953148379922,
-0.011079292744398117,
0.043730828911066055,
0.24128669500350952,
0.1798989176750183,
-0.08305246382951736,
0.11879931390285492,
0.15382987260818481,
-0.04894930124282837,
-0.31972965598106384,
-0.07097899168729782,
-0.11553299427032471,
-0.045486610382795334,
-0.038512568920850754,
-0.1360434889793396,
0.1564210206270218,
0.02604006603360176,
-0.04126296564936638,
0.08387403935194016,
-0.14068999886512756,
-0.07977245002985,
0.22926518321037292,
0.0029623862355947495,
0.40268903970718384,
-0.08702389895915985,
-0.08436572551727295,
-0.01582670770585537,
-0.15964074432849884,
0.11801808327436447,
0.04110552370548248,
0.06407788395881653,
-0.02680877409875393,
0.053290240466594696,
0.039342157542705536,
-0.06043728440999985,
0.09375226497650146,
0.031009389087557793,
0.044987753033638,
-0.10373856127262115,
-0.13265928626060486,
0.028911620378494263,
-0.030960315838456154,
-0.014786438085138798,
0.057378023862838745,
0.022116998210549355,
-0.12861059606075287,
-0.025071382522583008,
-0.07482665032148361,
0.08682981133460999,
0.03502606973052025,
-0.06619370728731155,
-0.003656937973573804,
-0.011468438431620598,
-0.010950524359941483,
-0.007217009086161852,
0.25750675797462463,
0.003593818750232458,
0.14098960161209106,
0.10273806750774384,
0.09543769061565399,
-0.17969036102294922,
-0.03609991818666458,
-0.07452508062124252,
-0.06570184230804443,
0.096438467502594,
-0.030126970261335373,
0.07519517093896866,
0.1516796350479126,
-0.04122238606214523,
0.04297369718551636,
0.12120365351438522,
0.047616977244615555,
-0.0461055189371109,
0.14043885469436646,
-0.20969203114509583,
0.03770104795694351,
-0.02787908911705017,
-0.02063949778676033,
0.07864256948232651,
0.10774786025285721,
0.10418630391359329,
0.04115518182516098,
-0.035168662667274475,
0.01394918467849493,
-0.02965446747839451,
-0.03510928899049759,
0.07017704844474792,
0.06717298924922943,
0.043598420917987823,
-0.13562940061092377,
0.03572770580649376,
0.03545781224966049,
-0.15994900465011597,
-0.04561840742826462,
0.0798286497592926,
-0.15686658024787903,
-0.11122603714466095,
-0.02263965830206871,
0.11455298960208893,
-0.14521896839141846,
-0.039632488042116165,
-0.04641811549663544,
-0.13418996334075928,
0.06972779333591461,
0.18159188330173492,
0.13103143870830536,
0.10970540344715118,
-0.05561865121126175,
-0.021705783903598785,
-0.011595118790864944,
-0.01822039857506752,
0.006359242834150791,
0.06845489144325256,
-0.17037172615528107,
0.017899589613080025,
-0.01394583098590374,
0.14702916145324707,
-0.09660974144935608,
-0.07415182143449783,
-0.16891857981681824,
0.04697556421160698,
-0.09303826838731766,
-0.07159677147865295,
-0.08290879428386688,
-0.02118426188826561,
0.028802918270230293,
-0.08412807434797287,
-0.03680410608649254,
-0.03769955411553383,
-0.12591150403022766,
0.058045439422130585,
0.016383599489927292,
0.029256748035550117,
-0.04278605803847313,
-0.049387238919734955,
0.10788699239492416,
-0.040735047310590744,
0.09248634427785873,
0.11333432793617249,
-0.06952280551195145,
0.08524111658334732,
-0.09002139419317245,
-0.12434586137533188,
0.12248294800519943,
0.02376852184534073,
0.11538374423980713,
0.037905752658843994,
0.03187960386276245,
0.07364030182361603,
0.015629447996616364,
0.05559782683849335,
0.05336270108819008,
-0.12224778532981873,
0.02708340249955654,
-0.01786322332918644,
-0.1931128054857254,
-0.02867043949663639,
-0.07598401606082916,
0.12200799584388733,
-0.0023715540301054716,
0.15108288824558258,
-0.006962954066693783,
0.08709387481212616,
-0.04373457655310631,
-0.00553933484479785,
-0.020426053553819656,
-0.20569974184036255,
-0.038188423961400986,
-0.05518931895494461,
0.0054757255129516125,
-0.003793900366872549,
0.25238698720932007,
0.03206604719161987,
0.021465452387928963,
0.04169522970914841,
0.05841078236699104,
0.0034062466584146023,
0.03988324850797653,
0.17441175878047943,
0.1048174500465393,
-0.044596027582883835,
-0.06612113863229752,
0.07449153810739517,
0.014104411005973816,
-0.03425929695367813,
0.10280604660511017,
0.0579635351896286,
-0.022101202979683876,
0.061401285231113434,
0.006350876297801733,
0.03077852725982666,
-0.17194421589374542,
-0.18425245583057404,
-0.0518796369433403,
0.07249171286821365,
0.02366054244339466,
0.06857233494520187,
0.10899512469768524,
-0.026017313823103905,
0.0467560812830925,
-0.04243495315313339,
-0.03868629038333893,
-0.19200804829597473,
-0.11754908412694931,
-0.09499595314264297,
-0.10949105769395828,
0.012144351378083229,
-0.045467864722013474,
-0.025607097893953323,
0.112579844892025,
0.05665137246251106,
-0.02637707255780697,
0.07898925989866257,
0.0068581425584852695,
-0.015829376876354218,
0.035544686019420624,
-0.015088078565895557,
-0.002572772093117237,
-0.008363397791981697,
-0.025032468140125275,
-0.1646868884563446,
-0.01614730805158615,
-0.059904489666223526,
-0.0038963949773460627,
-0.0638962835073471,
0.002605070825666189,
-0.10870174318552017,
-0.11156944185495377,
-0.027449829503893852,
0.031112194061279297,
-0.07530830055475235,
0.08167865127325058,
-0.012178707867860794,
0.031354498118162155,
0.02408476360142231,
0.15556955337524414,
-0.0745464414358139,
-0.05411846935749054,
-0.04471001774072647,
0.26918578147888184,
0.0574658028781414,
0.1188357025384903,
0.007326812483370304,
0.018914537504315376,
-0.0766475573182106,
0.29574328660964966,
0.26567307114601135,
-0.03664971888065338,
0.05142061784863472,
0.04157736524939537,
0.01670730672776699,
0.09608681499958038,
0.1372172236442566,
0.07940847426652908,
0.23908233642578125,
-0.07535005360841751,
-0.04486403986811638,
-0.029895322397351265,
-0.017877299338579178,
-0.10601367056369781,
0.0677277147769928,
0.0556669756770134,
-0.0367586687207222,
-0.08847290277481079,
0.07175103574991226,
-0.16698697209358215,
0.1506146639585495,
0.055220942944288254,
-0.18291416764259338,
-0.07399025559425354,
-0.022136300802230835,
0.1443227231502533,
-0.019863387569785118,
0.0790862888097763,
-0.031677428632974625,
-0.10550876706838608,
0.039427150040864944,
0.01414680015295744,
-0.21330086886882782,
-0.05566805601119995,
0.0937623530626297,
0.0036555929109454155,
0.05017957463860512,
-0.023827895522117615,
0.03583429008722305,
0.08428710699081421,
0.0726693794131279,
-0.04607251659035683,
0.006363187450915575,
0.011929735541343689,
-0.08450563997030258,
-0.03499215841293335,
0.00016984343528747559,
0.013966171070933342,
-0.05488259717822075,
0.03193806856870651,
-0.18189109861850739,
0.04003556817770004,
-0.09101450443267822,
-0.036184389144182205,
-0.019026435911655426,
0.023357758298516273,
-0.029626764357089996,
0.05516811087727547,
0.07363101094961166,
0.009205193258821964,
-0.03664170578122139,
-0.06109684333205223,
-0.025447756052017212,
0.03078463301062584,
-0.11446559429168701,
-0.14089645445346832,
-0.08753776550292969,
-0.06155245006084442,
0.09708955883979797,
-0.01227374467998743,
-0.0782943144440651,
-0.04041222110390663,
-0.07965502887964249,
0.03774513676762581,
-0.14707180857658386,
0.06991016864776611,
0.03579777479171753,
0.04206673055887222,
-0.01093299314379692,
-0.03975704312324524,
0.019534343853592873,
0.054816145449876785,
-0.12402302771806717,
-0.09320621192455292
] |
null | null | null |
# My Awesome Model
| {"tags": ["conversational"]} | text-generation | RTM/ChatBot | [
"conversational",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#conversational #region-us
|
# My Awesome Model
| [
"# My Awesome Model"
] | [
"TAGS\n#conversational #region-us \n",
"# My Awesome Model"
] | [
10,
4
] | [
"passage: TAGS\n#conversational #region-us \n# My Awesome Model"
] | [
-0.03546040877699852,
0.10198262333869934,
-0.009167643263936043,
-0.06873539090156555,
0.09720280766487122,
0.08721881359815598,
0.05205019563436508,
-0.007754879537969828,
0.1307058483362198,
-0.07418710738420486,
0.15552383661270142,
0.09356602281332016,
-0.055710893124341965,
0.054602161049842834,
0.023362739011645317,
-0.24776877462863922,
0.05416174605488777,
-0.018358204513788223,
0.038026146590709686,
0.059432677924633026,
-0.008917242288589478,
-0.06420617550611496,
0.032797642052173615,
-0.054550930857658386,
-0.047955937683582306,
0.05154015123844147,
-0.015393521636724472,
0.022211134433746338,
0.13093532621860504,
-0.025035109370946884,
0.11168728768825531,
0.02722480148077011,
-0.08139488101005554,
-0.2497675120830536,
0.04726335033774376,
-0.03638986125588417,
-0.05304733291268349,
0.00051964569138363,
0.040234215557575226,
-0.0361054390668869,
0.10917626321315765,
0.2034350037574768,
-0.013044852763414383,
0.12713150680065155,
-0.29209578037261963,
-0.05327993258833885,
0.019334495067596436,
-0.017972392961382866,
0.007713220082223415,
0.010176903568208218,
-0.0199898611754179,
0.12909340858459473,
-0.18584373593330383,
-0.03348228707909584,
-0.05190730839967728,
-0.19455693662166595,
0.02133459411561489,
0.195334330201149,
0.03859388828277588,
0.13404928147792816,
-0.03862352669239044,
0.11985649168491364,
-0.001176132122054696,
-0.02696487121284008,
-0.09787412732839584,
-0.083623968064785,
0.05212550610303879,
0.10331118106842041,
-0.045747701078653336,
-0.01321598794311285,
0.2609056234359741,
0.08284381031990051,
0.022602153941988945,
0.052660685032606125,
-0.037852801382541656,
-0.01829439401626587,
0.0014054370112717152,
-0.11398514360189438,
-0.012235384434461594,
0.15790092945098877,
0.0919213518500328,
-0.11842559278011322,
-0.11358761787414551,
0.05386804789304733,
-0.19349870085716248,
0.11392602324485779,
-0.05408310890197754,
0.09635519981384277,
-0.2302265763282776,
-0.03165312856435776,
-0.23591788113117218,
-0.03137180954217911,
0.008059423416852951,
-0.1426098644733429,
-0.019717667251825333,
-0.05638215318322182,
0.06416983157396317,
0.10064838826656342,
0.026653507724404335,
0.05982319638133049,
0.0024214035365730524,
0.03094852901995182,
0.04355722293257713,
0.06075821444392204,
0.13682802021503448,
0.07382544130086899,
0.08105745166540146,
-0.015595734119415283,
-0.13757064938545227,
-0.15028178691864014,
-0.042663972824811935,
-0.0603194423019886,
-0.10847903043031693,
0.1175917387008667,
-0.1348201483488083,
0.056088559329509735,
-0.008282779715955257,
-0.06160689890384674,
-0.1302913874387741,
0.07904011011123657,
-0.009780440479516983,
0.07031673192977905,
-0.0500408336520195,
-0.030323538929224014,
-0.012846678495407104,
0.09941235184669495,
-0.15968181192874908,
0.07661482691764832,
0.08417865633964539,
-0.017821768298745155,
-0.13900868594646454,
-0.0261816568672657,
-0.06725679337978363,
0.07161393761634827,
0.027083612978458405,
-0.05911887437105179,
0.07335446029901505,
-0.1062290146946907,
-0.10800924897193909,
-0.0462496280670166,
0.05760948359966278,
-0.04071643948554993,
0.02777569182217121,
-0.08487419039011002,
0.04323674738407135,
-0.043959055095911026,
-0.011524932458996773,
-0.0954136773943901,
-0.10528495907783508,
0.03591811656951904,
0.07539372146129608,
0.007705247029662132,
-0.16573995351791382,
0.034548647701740265,
-0.0835181400179863,
0.049681901931762695,
-0.02308676578104496,
0.0353124737739563,
0.043504517525434494,
0.1896352767944336,
0.09132891893386841,
0.11749628186225891,
-0.18855012953281403,
0.05541319027543068,
-0.12649250030517578,
0.2574777901172638,
-0.15162399411201477,
-0.0263845045119524,
0.23871201276779175,
-0.028618283569812775,
-0.1612473875284195,
0.031156204640865326,
-0.024838248267769814,
0.21029748022556305,
0.15949539840221405,
0.3231875002384186,
-0.14657290279865265,
-0.040706682950258255,
0.10120458155870438,
0.1641000658273697,
-0.0947515219449997,
-0.05097932368516922,
0.07063271850347519,
-0.08443530648946762,
-0.10856795310974121,
-0.009461409412324429,
0.06791952252388,
0.08725127577781677,
-0.08676911145448685,
-0.057048194110393524,
0.04897143691778183,
-0.05990632250905037,
0.0539734810590744,
0.09225133061408997,
0.019252249971032143,
-0.06463979184627533,
0.03549259155988693,
-0.03487003222107887,
0.06323052942752838,
0.1344345659017563,
-0.07567445188760757,
-0.04621171951293945,
0.05208157002925873,
-0.008416261523962021,
0.011982166208326817,
-0.07247529923915863,
-0.10293032974004745,
-0.09282280504703522,
0.15225498378276825,
0.12907147407531738,
0.24349625408649445,
0.07771562039852142,
-0.08539113402366638,
0.023976441472768784,
0.05230777710676193,
0.02473422884941101,
0.08408135920763016,
0.006961719132959843,
-0.04070304334163666,
0.1336214691400528,
-0.06637702882289886,
0.0505339689552784,
-0.15414337813854218,
-0.06644152104854584,
-0.003572809975594282,
0.03696983680129051,
0.08356471359729767,
-0.04645514488220215,
-0.0019455882720649242,
0.0381651446223259,
0.05654265359044075,
0.010789581574499607,
0.11545533686876297,
-0.04579673334956169,
-0.07804842293262482,
0.17771583795547485,
-0.10181614011526108,
0.1486288160085678,
0.11471833288669586,
-0.2789360582828522,
0.027798451483249664,
-0.09091047197580338,
-0.016147438436746597,
0.03836512193083763,
0.07616107910871506,
-0.04532977193593979,
0.0876641571521759,
0.008135076612234116,
0.045742783695459366,
0.030224641785025597,
0.049904048442840576,
-0.05050472542643547,
-0.038475628942251205,
-0.1468115597963333,
0.11378216743469238,
0.14237768948078156,
-0.1664484292268753,
0.15665550529956818,
0.344457745552063,
0.2189280241727829,
0.24893705546855927,
-0.02391449362039566,
-0.0023995572701096535,
-0.007929098792374134,
-0.05631881207227707,
-0.1310952752828598,
0.1292494386434555,
-0.3013957738876343,
-0.0045976797118783,
0.0019251068588346243,
0.020475171506404877,
0.10425154119729996,
-0.11188157647848129,
-0.11943034082651138,
0.01928076706826687,
0.013840734958648682,
-0.04093893617391586,
0.05837646499276161,
-0.08973170816898346,
0.06690972298383713,
0.044297512620687485,
-0.09705004096031189,
0.12118203192949295,
0.032428398728370667,
-0.027439109981060028,
0.06038731336593628,
-0.13205169141292572,
-0.161821186542511,
-0.014722511172294617,
-0.12305234372615814,
0.03822726011276245,
0.02258457988500595,
-0.0011937115341424942,
-0.11695839464664459,
-0.017612792551517487,
0.08651383221149445,
0.06550532579421997,
-0.21545930206775665,
-0.08663901686668396,
-0.05402546748518944,
0.06747249513864517,
-0.14086408913135529,
-0.005392237100750208,
-0.0580231137573719,
-0.04639870673418045,
-0.015921805053949356,
0.03196299821138382,
-0.1326124668121338,
0.05850570648908615,
0.1952214241027832,
0.11960950493812561,
0.08012472093105316,
0.00031379942083731294,
0.25854313373565674,
-0.15044859051704407,
-0.031102577224373817,
0.04013100266456604,
-0.03211164101958275,
0.08570355176925659,
0.2004045695066452,
0.07713621109724045,
-0.04407672584056854,
-0.032609183341264725,
-0.06265520304441452,
-0.08382013440132141,
-0.17984598875045776,
-0.09944407641887665,
-0.09241506457328796,
0.11593815684318542,
-0.10455609858036041,
0.02279566414654255,
0.1293002963066101,
0.038562655448913574,
0.10591760277748108,
-0.17956066131591797,
-0.08083869516849518,
-0.015997247770428658,
0.10070767998695374,
-0.1476999819278717,
-0.036619413644075394,
-0.057095691561698914,
-0.13042104244232178,
0.09085097908973694,
0.07351890206336975,
-0.07598336786031723,
0.2753245234489441,
0.14356902241706848,
0.06460168957710266,
0.0372491329908371,
0.050815973430871964,
0.03296361491084099,
0.06032256409525871,
-0.08821487426757812,
-0.024852164089679718,
0.008612229488790035,
-0.022380370646715164,
0.05025824159383774,
0.21010570228099823,
-0.24013914167881012,
-0.010044138878583908,
-0.12094947695732117,
0.058911196887493134,
-0.09226593375205994,
0.15273147821426392,
-0.005419398192316294,
0.07938405126333237,
0.13775718212127686,
0.017697615548968315,
-0.08790077269077301,
0.10226619243621826,
0.06094779446721077,
-0.12483128160238266,
-0.00920578371733427,
0.10987824946641922,
0.08385234326124191,
-0.016504161059856415,
0.11643458902835846,
-0.21195663511753082,
-0.13761650025844574,
0.033488254994153976,
0.10529548674821854,
-0.1958140879869461,
0.3039077818393707,
0.009235309436917305,
-0.1351068764925003,
-0.0639132410287857,
-0.11496353149414062,
-0.012014171108603477,
0.10743112862110138,
0.10711206495761871,
0.042469725012779236,
-0.07393775135278702,
-0.026096675544977188,
0.009214960969984531,
-0.007742607034742832,
0.09298452734947205,
-0.08414001762866974,
-0.12013377249240875,
0.010150929912924767,
0.03940318152308464,
-0.048703476786613464,
0.1009429320693016,
-0.08256068825721741,
-0.07715889811515808,
-0.009262125939130783,
-0.013167787343263626,
0.013363508507609367,
0.0613013356924057,
0.09955485910177231,
-0.03248724341392517,
-0.016322879120707512,
0.17398953437805176,
0.06143142655491829,
-0.028618335723876953,
-0.15928512811660767,
-0.0019777673296630383,
-0.04513169452548027,
-0.039428479969501495,
-0.06899980455636978,
-0.07661525160074234,
-0.11753853410482407,
-0.09040885418653488,
0.12060512602329254,
-0.1011100485920906,
0.08252820372581482,
-0.07234728336334229,
0.1556718945503235,
0.0449947789311409,
0.027362238615751266,
0.0420842170715332,
0.010504845529794693,
-0.0637444332242012,
-0.06514158844947815,
0.13852131366729736,
-0.17852531373500824,
-0.03856880962848663,
0.10458311438560486,
0.07282499223947525,
0.011911713518202305,
0.01789284311234951,
-0.12299755960702896,
0.19862310588359833,
0.21264135837554932,
-0.00729813938960433,
0.16878525912761688,
0.24475444853305817,
-0.06593596935272217,
-0.205520361661911,
-0.06422454118728638,
-0.23529553413391113,
-0.07331079244613647,
0.15935871005058289,
-0.17814143002033234,
0.09661445021629333,
-0.008739825338125229,
-0.0568917840719223,
0.14693109691143036,
-0.32532361149787903,
-0.005068281665444374,
0.19984132051467896,
-0.03098643198609352,
0.5961567759513855,
-0.067476287484169,
-0.10163161158561707,
-0.020260926336050034,
-0.08047869801521301,
0.2328169047832489,
-0.10700056701898575,
0.022178582847118378,
0.05551070347428322,
0.10260957479476929,
0.06909338384866714,
0.020471658557653427,
0.1812010407447815,
-0.05112580955028534,
-0.06011022627353668,
-0.11829929798841476,
-0.20441202819347382,
0.01321274135261774,
-0.002565407194197178,
-0.09609947353601456,
0.06216618791222572,
-0.06233842670917511,
-0.17662794888019562,
0.015285306610167027,
-0.08876541256904602,
-0.03281906247138977,
0.012494787573814392,
-0.04522540047764778,
-0.03335950896143913,
0.03184078261256218,
-0.1269286721944809,
0.02083008363842964,
0.15761429071426392,
-0.09231042861938477,
0.21183836460113525,
-0.09038146585226059,
0.11980026215314865,
-0.1715039312839508,
-0.07381453365087509,
-0.0892910361289978,
-0.0739288181066513,
0.022808637470006943,
-0.05498562753200531,
0.039651405066251755,
0.1229073703289032,
-0.06014255806803703,
0.13346922397613525,
0.04274857044219971,
-0.07362601161003113,
-0.009332284331321716,
0.14846959710121155,
-0.19871611893177032,
-0.2910851538181305,
-0.11566967517137527,
0.05923045426607132,
0.2028619349002838,
0.007538134697824717,
0.0777674987912178,
0.11478970944881439,
-0.016507035121321678,
0.015079355798661709,
0.010267493315041065,
-0.11431148648262024,
-0.10444751381874084,
0.06896253675222397,
0.0007569619920104742,
-0.08558188378810883,
0.10230734944343567,
0.019022267311811447,
-0.18386487662792206,
-0.11988023668527603,
0.20762711763381958,
-0.0200219564139843,
-0.09439916163682938,
-0.0994015783071518,
0.1823630928993225,
-0.060141727328300476,
-0.026388145983219147,
0.060486629605293274,
-0.0971314087510109,
-0.027692077681422234,
0.17358030378818512,
0.05215360224246979,
0.07331566512584686,
0.043052736669778824,
-0.019896553829312325,
0.19111433625221252,
-0.07610252499580383,
-0.08666720986366272,
-0.11411479860544205,
-0.1113981381058693,
-0.09765390306711197,
-0.022090891376137733,
0.17170315980911255,
-0.1043042540550232,
-0.1465844213962555,
-0.23367111384868622,
0.08566625416278839,
-0.07618606090545654,
-0.14835943281650543,
-0.12351330369710922,
-0.09960491955280304,
0.07022807002067566,
-0.005836328491568565,
-0.025358503684401512,
-0.09784673154354095,
-0.1479180008172989,
0.10302628576755524,
0.09353149682283401,
0.02215663343667984,
0.03276374191045761,
0.06490205228328705,
0.16425716876983643,
0.006264934781938791,
0.11560661345720291,
0.09335105121135712,
0.004334130324423313,
0.12737876176834106,
-0.24313320219516754,
-0.03612852096557617,
0.061543241143226624,
-0.02008756995201111,
0.03869541361927986,
0.1556338667869568,
-0.07101669907569885,
-0.008599703200161457,
0.07346312701702118,
0.05884246155619621,
-0.06158248707652092,
-0.07029277831315994,
-0.020444681867957115,
0.15146341919898987,
-0.21854759752750397,
-0.010464908555150032,
-0.13543701171875,
0.08618341386318207,
-0.06382738798856735,
0.026516791433095932,
0.07620655745267868,
0.08659784495830536,
0.003671627026051283,
0.051998503506183624,
0.02891702763736248,
-0.10376659035682678,
0.11479650437831879,
-0.1011483371257782,
-0.010525095276534557,
-0.04059837758541107,
0.3260350227355957,
0.008407027460634708,
0.01702333241701126,
0.04124368727207184,
0.1525338888168335,
0.036274950951337814,
0.002469088416546583,
0.11112259328365326,
0.1318541020154953,
-0.05502143129706383,
-0.1530759334564209,
0.1053222268819809,
-0.03983991593122482,
0.017480194568634033,
0.12883120775222778,
-0.017984678968787193,
0.05133776366710663,
0.0598396472632885,
0.03326093778014183,
0.06138930097222328,
0.08058228343725204,
-0.2519816756248474,
0.05864633992314339,
-0.008193781599402428,
-0.10156036913394928,
0.14093659818172455,
0.12776172161102295,
-0.04358195886015892,
0.03643115237355232,
-0.08332061767578125,
-0.017144199460744858,
-0.13900910317897797,
-0.08347687125205994,
0.011046548373997211,
-0.0890955775976181,
0.024407757446169853,
-0.02158970944583416,
-0.01773250661790371,
0.20654316246509552,
0.00810841005295515,
-0.08619078248739243,
0.024125345051288605,
-0.03246486932039261,
-0.1350407898426056,
-0.028485149145126343,
0.004192930646240711,
0.07534855604171753,
-0.10631464421749115,
-0.009459893219172955,
-0.1966288685798645,
-0.03228876367211342,
-0.10370618849992752,
0.031585320830345154,
-0.13690978288650513,
-0.056601304560899734,
-0.1394949108362198,
-0.046161238104104996,
-0.07070588320493698,
0.0341620109975338,
-0.10124900937080383,
0.14852306246757507,
-0.02862531691789627,
0.04669342562556267,
0.001221607206389308,
0.21473883092403412,
-0.0055917128920555115,
0.1111997663974762,
-0.03376127779483795,
0.025253819301724434,
-0.07697580754756927,
0.12172159552574158,
-0.06678933650255203,
-0.016789207234978676,
-0.0435773991048336,
0.26328104734420776,
0.3666163384914398,
-0.13708895444869995,
-0.03541192784905434,
-0.029750946909189224,
0.032959096133708954,
0.055593449622392654,
0.09368465840816498,
-0.04153439402580261,
0.29192063212394714,
-0.11245544999837875,
0.09509938210248947,
0.017580494284629822,
0.02136683464050293,
0.05382193997502327,
0.03408944979310036,
0.0992191731929779,
0.02480826899409294,
-0.06201104819774628,
0.22126658260822296,
-0.28489235043525696,
0.12929458916187286,
-0.09419834613800049,
-0.1977759450674057,
-0.024422185495495796,
-0.09278329461812973,
0.1075873076915741,
0.009720941074192524,
0.1459415853023529,
-0.059647753834724426,
-0.14828547835350037,
-0.10750431567430496,
0.04621806740760803,
-0.34202659130096436,
-0.19561326503753662,
0.13985638320446014,
0.041433196514844894,
0.09628590941429138,
-0.006398599129170179,
0.017934424802660942,
0.040109071880578995,
0.005008349195122719,
0.011764837428927422,
0.06299147754907608,
0.06230494752526283,
-0.03965628892183304,
-0.18101933598518372,
0.038948025554418564,
0.03522862121462822,
-0.13118009269237518,
0.08088650554418564,
-0.19435498118400574,
0.03574736788868904,
0.1191537082195282,
-0.08429282158613205,
0.052640412002801895,
0.12771375477313995,
-0.12840406596660614,
0.03900361806154251,
0.03826753795146942,
0.04374406486749649,
-0.039075564593076706,
0.019395308569073677,
0.00933300144970417,
-0.021677589043974876,
-0.11719156801700592,
-0.12927356362342834,
0.06891030073165894,
-0.07255057990550995,
0.15803246200084686,
-0.043637361377477646,
-0.06318710744380951,
0.02716391533613205,
-0.05997319892048836,
0.09305576235055923,
-0.026282401755452156,
0.04433848708868027,
0.20267623662948608,
0.046981945633888245,
0.005474291741847992,
-0.09724274277687073,
0.07344987988471985,
0.0035022953525185585,
0.002335904398933053,
-0.08806760609149933
] |
null | null | null |
# Lucky
| {"tags": ["conversational"]} | text-generation | RTM/Lucky | [
"conversational",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#conversational #region-us
|
# Lucky
| [
"# Lucky"
] | [
"TAGS\n#conversational #region-us \n",
"# Lucky"
] | [
10,
2
] | [
"passage: TAGS\n#conversational #region-us \n# Lucky"
] | [
0.014208397828042507,
0.012948853895068169,
-0.010771617293357849,
-0.02331152744591236,
0.07578682154417038,
0.07689684629440308,
-0.04980022460222244,
0.04912425950169563,
0.06727931648492813,
-0.04341219365596771,
0.16035979986190796,
0.07512760162353516,
-0.03894086182117462,
-0.1104687824845314,
-0.04001366347074509,
-0.21570296585559845,
0.014215167611837387,
0.020784389227628708,
0.15819379687309265,
0.061528608202934265,
-0.06763969361782074,
-0.040894582867622375,
-0.01625886559486389,
-0.08255337178707123,
-0.031484950333833694,
0.12727153301239014,
0.07758078724145889,
0.05770527943968773,
0.14264963567256927,
-0.007653323467820883,
0.10309035331010818,
0.023356782272458076,
-0.12317921966314316,
-0.30886998772621155,
0.03956448659300804,
-0.08525627106428146,
-0.0013138661161065102,
-0.021998563781380653,
0.03998832777142525,
-0.10069682449102402,
-0.10873887687921524,
0.16490015387535095,
-0.010713073424994946,
0.10420301556587219,
-0.25696542859077454,
-0.17602591216564178,
-0.04711494222283363,
-0.03352037072181702,
-0.09493307769298553,
0.028398357331752777,
-0.0020550815388560295,
0.19165995717048645,
-0.1941443234682083,
-0.06459483504295349,
0.03443315252661705,
-0.21697494387626648,
0.019669456407427788,
0.14648228883743286,
0.054950371384620667,
0.13274957239627838,
-0.06363332271575928,
0.083808533847332,
0.051141660660505295,
0.012395520694553852,
-0.20832955837249756,
-0.05822700262069702,
-0.02513730339705944,
0.2246781438589096,
-0.0369003601372242,
-0.09817253798246384,
0.2732020616531372,
0.04643453285098076,
0.015203820541501045,
0.027859173715114594,
-0.032852936536073685,
-0.10694285482168198,
0.048226602375507355,
-0.07654671370983124,
-0.015914155170321465,
0.17782826721668243,
0.13239657878875732,
0.016732119023799896,
-0.1688646525144577,
0.0621553435921669,
-0.1731516718864441,
0.14539149403572083,
-0.009520678780972958,
0.07952660322189331,
-0.24920792877674103,
-0.028162669390439987,
-0.17207758128643036,
-0.02257595583796501,
0.04031476378440857,
-0.087863028049469,
0.042376305907964706,
-0.06758762896060944,
-0.01833149418234825,
0.12159695476293564,
0.07302500307559967,
0.12666212022304535,
-0.0515761561691761,
0.032030247151851654,
-0.10725041478872299,
0.11190037429332733,
0.09816062450408936,
-0.04749457165598869,
0.16945096850395203,
0.05676667019724846,
-0.1329461634159088,
-0.1476850062608719,
0.014250979758799076,
0.014019805938005447,
-0.003427651710808277,
0.07698433101177216,
-0.17059102654457092,
0.1057099923491478,
-0.0006626598769798875,
-0.05826354771852493,
-0.1458015739917755,
0.06824808567762375,
-0.052655428647994995,
0.04020918533205986,
-0.1064608097076416,
0.0081467991694808,
-0.022651011124253273,
0.034946538507938385,
-0.20745849609375,
0.049818411469459534,
0.10425692796707153,
-0.02574801817536354,
-0.10739479213953018,
0.013695935718715191,
-0.01695086620748043,
-0.00813980307430029,
0.023982424288988113,
0.036483973264694214,
0.06456796824932098,
-0.1469273865222931,
-0.03373013436794281,
-0.011658884584903717,
0.0552382655441761,
-0.02998761460185051,
0.08388678729534149,
-0.010765273123979568,
0.028486713767051697,
-0.0026236670091748238,
-0.01100982166826725,
-0.19850511848926544,
-0.10505019873380661,
0.0757528617978096,
0.058976948261260986,
-0.00007326574996113777,
-0.04005548357963562,
0.023861737921833992,
-0.03952426090836525,
0.027764514088630676,
-0.03503832593560219,
0.031196460127830505,
0.05233516916632652,
0.12790217995643616,
0.10090824216604233,
0.07063838094472885,
-0.18432199954986572,
0.03502991050481796,
-0.15410606563091278,
0.3019653856754303,
-0.12400870025157928,
-0.03644933924078941,
0.2342241257429123,
-0.04360359162092209,
-0.057587556540966034,
0.027995746582746506,
0.008102879859507084,
0.07705723494291306,
0.1288706511259079,
0.3469896614551544,
-0.03810127079486847,
-0.0033019986003637314,
0.03901291266083717,
0.17521275579929352,
-0.1414957046508789,
0.027194691821932793,
0.0829504057765007,
-0.09552624076604843,
-0.09199142456054688,
0.008478788658976555,
0.17764192819595337,
0.14655523002147675,
-0.0938212126493454,
-0.03491126373410225,
0.05612789839506149,
-0.03506344556808472,
0.11971167474985123,
0.08076204359531403,
0.033353861421346664,
-0.12700322270393372,
0.07807417213916779,
-0.04456080496311188,
0.07477220147848129,
0.16647663712501526,
-0.017082123085856438,
-0.06964847445487976,
-0.062421929091215134,
0.03855883330106735,
0.0342741496860981,
-0.0660962462425232,
-0.03807622566819191,
-0.014401338994503021,
0.09689708799123764,
0.05984197556972504,
0.18766091763973236,
0.08365516364574432,
-0.10575929284095764,
-0.019213471561670303,
-0.021491141989827156,
0.10057292878627777,
0.046905506402254105,
0.00011073349742218852,
-0.048534736037254333,
0.16377022862434387,
-0.042138468474149704,
0.09681009501218796,
-0.06736186891794205,
-0.03200594335794449,
0.1858135461807251,
0.00807358231395483,
0.0981999859213829,
-0.05413338914513588,
0.007977182045578957,
0.01525687426328659,
0.061120353639125824,
-0.0064527299255132675,
0.11697857826948166,
0.00317009212449193,
-0.0774998888373375,
0.24036182463169098,
-0.06958679109811783,
0.03148322179913521,
0.09689193964004517,
-0.2377268671989441,
-0.03154190629720688,
-0.059291623532772064,
-0.0511348657310009,
0.020920194685459137,
0.06482235342264175,
-0.07322387397289276,
0.09165532886981964,
-0.009475002065300941,
0.03443554416298866,
0.05576179176568985,
0.04672101140022278,
-0.056444402784109116,
-0.019070643931627274,
-0.10953469574451447,
0.15419495105743408,
-0.14059023559093475,
-0.16485273838043213,
0.1427680402994156,
0.20513935387134552,
0.25920966267585754,
0.2239118218421936,
-0.016406366601586342,
0.030909936875104904,
-0.037164416164159775,
0.09770847856998444,
-0.002573091071099043,
0.09552062302827835,
-0.2260630577802658,
-0.010448671877384186,
0.010065320879220963,
0.039800211787223816,
0.08284208923578262,
-0.11777900159358978,
-0.12099163979291916,
0.014998527243733406,
0.03363293409347534,
0.006722659338265657,
0.07845015078783035,
-0.07768482714891434,
0.07871758937835693,
0.05189543962478638,
-0.045858558267354965,
0.1529507040977478,
0.004022344946861267,
-0.05861908569931984,
0.009752352721989155,
-0.1569940745830536,
-0.15998318791389465,
-0.06598193198442459,
-0.09588976204395294,
0.03552130237221718,
0.024698948487639427,
0.047423046082258224,
-0.12032154947519302,
-0.017118480056524277,
0.11263149231672287,
0.04269000142812729,
-0.2412528544664383,
-0.06449033319950104,
-0.09874277561903,
0.032632976770401,
-0.1191239282488823,
0.02927957847714424,
-0.03702973946928978,
-0.06511110067367554,
-0.035546623170375824,
0.10912452638149261,
-0.11056956648826599,
-0.0100927222520113,
0.1736803948879242,
0.1795298159122467,
0.05026118829846382,
-0.031509000808000565,
0.1239829808473587,
-0.15883632004261017,
-0.10683958977460861,
0.03209415450692177,
-0.01704561896622181,
0.06085443124175072,
0.22946029901504517,
0.08431050181388855,
-0.05280603840947151,
0.03323952853679657,
-0.029085516929626465,
-0.04210979864001274,
-0.2417670488357544,
-0.07489673793315887,
-0.07268412411212921,
0.20989780128002167,
-0.12147445976734161,
0.03922249749302864,
0.09032745659351349,
-0.06145903840661049,
0.0370093435049057,
-0.18123120069503784,
-0.07600358128547668,
0.01289682649075985,
0.2838434875011444,
-0.1490280032157898,
-0.02694082073867321,
-0.05137713998556137,
-0.09935573488473892,
0.10060778260231018,
-0.04478105157613754,
-0.14735771715641022,
0.2479087859392166,
0.15494535863399506,
0.08445495367050171,
0.07070119678974152,
0.06277637928724289,
-0.061618249863386154,
0.07401648163795471,
-0.08449695259332657,
-0.010577364824712276,
0.020408667623996735,
-0.0033067043405026197,
-0.015256496146321297,
0.1388583481311798,
-0.13417744636535645,
0.04534648358821869,
-0.20118604600429535,
0.032825857400894165,
-0.12569132447242737,
0.1988302320241928,
0.20301711559295654,
0.06212207302451134,
0.12575910985469818,
-0.03119378350675106,
-0.02600102685391903,
0.15625566244125366,
0.088678739964962,
-0.09330585598945618,
-0.06175544857978821,
0.06825490295886993,
0.05995264649391174,
-0.03442695736885071,
0.1233288124203682,
-0.21458500623703003,
-0.19273975491523743,
0.011960448697209358,
0.07980191707611084,
-0.2249092161655426,
0.21306107938289642,
0.038308531045913696,
-0.1915109008550644,
-0.007490408141165972,
-0.07937006652355194,
-0.005129698198288679,
0.028218451887369156,
0.08349568396806717,
0.04155370220541954,
0.03556389361619949,
-0.08410023152828217,
0.07465460151433945,
-0.05110645294189453,
0.19311733543872833,
-0.059408124536275864,
-0.11243300884962082,
-0.025321446359157562,
0.05144035443663597,
-0.053015194833278656,
0.17395751178264618,
-0.10510199517011642,
-0.004097044933587313,
0.025792837142944336,
0.06403609365224838,
0.034162864089012146,
0.010743977501988411,
0.07288701832294464,
0.02091040275990963,
-0.07492422312498093,
0.021208588033914566,
0.09047559648752213,
-0.049568768590688705,
-0.08027492463588715,
-0.02472003921866417,
-0.017201635986566544,
-0.0860576406121254,
-0.04225475341081619,
-0.11154439300298691,
-0.098983533680439,
-0.027230164036154747,
0.08438467234373093,
-0.03095201402902603,
0.09698589891195297,
-0.051194049417972565,
0.14598363637924194,
0.021534405648708344,
0.08626532554626465,
-0.06743216514587402,
-0.0034263725392520428,
0.02137875184416771,
-0.039708953350782394,
0.14279137551784515,
-0.2791241705417633,
-0.015874482691287994,
0.1700662076473236,
0.03400410711765289,
-0.07093699276447296,
-0.0478055365383625,
-0.1587391495704651,
0.21249154210090637,
0.29515305161476135,
-0.012591246515512466,
0.1837339699268341,
0.16429045796394348,
-0.012118089944124222,
-0.2096094936132431,
-0.07021220028400421,
-0.2361641824245453,
-0.08995669335126877,
0.1263258010149002,
-0.13363318145275116,
0.09155945479869843,
-0.056959785521030426,
-0.06818201392889023,
0.12278545647859573,
-0.3350037932395935,
0.029221031814813614,
0.14976613223552704,
-0.09574738889932632,
0.5408190488815308,
-0.11959360539913177,
-0.15729805827140808,
0.010984204709529877,
-0.10366964340209961,
0.11286478489637375,
-0.06592274457216263,
0.1023457869887352,
0.0069206953048706055,
-0.00779061671346426,
0.057461969554424286,
0.0031386553309857845,
0.1323348432779312,
-0.03540368378162384,
0.013396111316978931,
-0.08151596039533615,
-0.25041699409484863,
-0.024535253643989563,
0.004463013261556625,
-0.18686974048614502,
0.06884130090475082,
-0.06116972491145134,
-0.20695793628692627,
0.027754999697208405,
-0.10259152948856354,
-0.04155322164297104,
0.059090472757816315,
-0.011492874473333359,
-0.1060439944267273,
-0.02256043255329132,
-0.09134044498205185,
-0.0029378319159150124,
0.0959836095571518,
-0.1069106012582779,
0.24086451530456543,
-0.11128109693527222,
0.04066438972949982,
-0.11608907580375671,
-0.11748884618282318,
-0.11157595366239548,
-0.03328244388103485,
0.06801523268222809,
-0.0008197873830795288,
0.01254559401422739,
0.1324455291032791,
-0.02538159117102623,
0.17242063581943512,
0.05434064194560051,
-0.05114676430821419,
0.049499738961458206,
0.08604036271572113,
-0.2135801762342453,
-0.14891847968101501,
-0.10550229996442795,
0.0892777144908905,
0.24271990358829498,
-0.09416433423757553,
0.07362081110477448,
0.1214100569486618,
0.004525518510490656,
0.04907843843102455,
-0.02230340614914894,
-0.09731807559728622,
-0.051430150866508484,
-0.0032557877711951733,
0.006577889434993267,
-0.08442249149084091,
0.11830569058656693,
0.02167518623173237,
-0.2325950264930725,
-0.12743985652923584,
0.2270573228597641,
0.0007281540893018246,
-0.06623603403568268,
-0.1109895184636116,
0.13559970259666443,
-0.054217711091041565,
-0.03903944417834282,
0.0631493479013443,
-0.07393738627433777,
-0.03159411996603012,
0.23588578402996063,
0.0777217373251915,
0.1712498813867569,
0.028612986207008362,
-0.017332786694169044,
0.18905679881572723,
-0.07080905139446259,
-0.09053867310285568,
-0.1329076588153839,
-0.09724126756191254,
-0.08503163605928421,
0.007344319950789213,
0.0983661636710167,
-0.0715244710445404,
-0.16086556017398834,
-0.23076274991035461,
0.08796922862529755,
-0.01812134124338627,
-0.071486696600914,
-0.003901006421074271,
-0.044124260544776917,
0.06904798001050949,
0.004504912532866001,
-0.035945821553468704,
-0.11467461287975311,
-0.13418284058570862,
0.08938323706388474,
0.15365847945213318,
0.10425952821969986,
0.042406149208545685,
-0.01706162467598915,
0.16742505133152008,
0.008247495628893375,
0.10005965828895569,
0.13823889195919037,
-0.0013784496113657951,
0.13357725739479065,
-0.15436296164989471,
-0.09223917126655579,
0.09053295105695724,
-0.02624472603201866,
0.009195341728627682,
0.19649021327495575,
-0.13368266820907593,
-0.01605507731437683,
-0.014175803400576115,
0.07586214691400528,
-0.028213730081915855,
-0.03965848311781883,
-0.04759075492620468,
0.09878092259168625,
-0.18817992508411407,
-0.017758606001734734,
-0.20435655117034912,
0.1637558788061142,
-0.032598383724689484,
0.015821954235434532,
0.030717981979250908,
0.04669393226504326,
-0.0027440753765404224,
0.00012507475912570953,
0.03785129263997078,
-0.13294127583503723,
0.010541573166847229,
-0.0035228347405791283,
-0.014003809541463852,
0.01527718361467123,
0.19696848094463348,
-0.12132900953292847,
-0.06854558736085892,
0.07944177836179733,
0.19541968405246735,
0.005647948477417231,
-0.025047490373253822,
-0.008897081948816776,
0.1786767989397049,
-0.0806000605225563,
-0.17796024680137634,
0.09149844944477081,
-0.03575778007507324,
-0.0836782306432724,
0.10178837180137634,
0.023531462997198105,
0.07600878179073334,
0.024008918553590775,
-0.0796474888920784,
0.01026707049459219,
0.13459284603595734,
-0.21715307235717773,
-0.018603600561618805,
-0.04230055958032608,
-0.06971000134944916,
0.1876557320356369,
0.06887483596801758,
0.0014082773122936487,
0.043348170816898346,
-0.1593773514032364,
0.04382585361599922,
-0.10523799806833267,
0.08704905211925507,
0.06470876187086105,
-0.12525790929794312,
0.011430170387029648,
0.005282036028802395,
0.03895450383424759,
0.22975513339042664,
-0.007129085715860128,
-0.005098382942378521,
0.01691594161093235,
-0.015418661758303642,
-0.11437925696372986,
-0.043251462280750275,
-0.04734334349632263,
0.07091054320335388,
-0.14201126992702484,
-0.09022623300552368,
-0.14456790685653687,
-0.09985926002264023,
-0.08625718951225281,
0.04668433964252472,
-0.01604253053665161,
-0.08666278421878815,
-0.20313230156898499,
-0.04068072885274887,
-0.015791265293955803,
0.06944023817777634,
-0.09616135060787201,
0.09374037384986877,
-0.0235535129904747,
0.0674872025847435,
0.0269392691552639,
0.19319826364517212,
0.055477142333984375,
0.07876574993133545,
0.02789909951388836,
0.04542158171534538,
-0.10434144735336304,
0.12024562060832977,
-0.07091451436281204,
-0.008228208869695663,
-0.035393159836530685,
0.21428148448467255,
0.23531541228294373,
-0.05644851550459862,
-0.0076698679476976395,
0.015320463106036186,
0.03111964464187622,
0.1529913991689682,
0.07471919804811478,
-0.009537079371511936,
0.27021679282188416,
-0.1289253681898117,
0.08337492495775223,
-0.0008838532958179712,
0.061481114476919174,
0.061853617429733276,
0.08419399708509445,
0.12472782284021378,
0.03289391100406647,
-0.11249449849128723,
0.12542401254177094,
-0.2430373877286911,
0.24477533996105194,
-0.0057268752716481686,
-0.23387083411216736,
0.052962858229875565,
-0.16221407055854797,
0.23393657803535461,
-0.03451300039887428,
0.15702901780605316,
-0.03308279067277908,
-0.14235852658748627,
-0.2949789762496948,
-0.0005482526030391455,
-0.3481545150279999,
-0.2176307588815689,
0.12433848530054092,
-0.004226834513247013,
0.03762535750865936,
-0.07793233543634415,
0.01783565618097782,
0.04706740379333496,
0.05040746554732323,
0.031742166727781296,
0.02135380730032921,
0.04711056873202324,
-0.049247946590185165,
-0.22459879517555237,
0.060386575758457184,
0.009768731892108917,
-0.09254167228937149,
0.05435645580291748,
-0.165263369679451,
0.0008223438635468483,
0.14727792143821716,
0.00032367161475121975,
0.03895079717040062,
0.0779576376080513,
-0.0690726488828659,
-0.021535363048315048,
0.017634518444538116,
0.0716787800192833,
-0.00429919408634305,
0.020749496296048164,
-0.08975857496261597,
-0.009351206012070179,
-0.09010306745767593,
-0.06750942766666412,
0.015382910147309303,
-0.014548107981681824,
0.11540389060974121,
-0.0992681086063385,
-0.028513513505458832,
-0.031303636729717255,
0.005832617171108723,
0.14093801379203796,
-0.07417679578065872,
-0.006260727532207966,
0.13044771552085876,
0.06861814856529236,
0.03518044576048851,
-0.19695031642913818,
0.07976949959993362,
0.0031237704679369926,
0.07932330667972565,
-0.08178830146789551
] |
null | null | transformers |
# TIMBOT DialoGPT model | {"tags": ["conversational"]} | text-generation | RTurk/DialoGPT-small-TIMBOT | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# TIMBOT DialoGPT model | [
"# TIMBOT DialoGPT model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# TIMBOT DialoGPT model"
] | [
51,
9
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# TIMBOT DialoGPT model"
] | [
-0.0023099458776414394,
0.02660573646426201,
-0.006103469058871269,
-0.001843482255935669,
0.143808051943779,
-0.018679561093449593,
0.14136084914207458,
0.11456913501024246,
-0.06906240433454514,
-0.052102230489254,
0.09434926509857178,
0.1758992075920105,
-0.0068955812603235245,
0.12612131237983704,
-0.07063500583171844,
-0.30578798055648804,
0.052857257425785065,
0.047000233083963394,
-0.024699470028281212,
0.12367017567157745,
0.10651455819606781,
-0.03468906506896019,
0.07754695415496826,
0.019200388342142105,
-0.12414956092834473,
0.017268668860197067,
0.037944015115499496,
-0.14746522903442383,
0.10432316362857819,
0.05573539435863495,
0.03526446968317032,
0.013296122662723064,
-0.04712897911667824,
-0.12487456202507019,
0.02655240148305893,
0.0037871929816901684,
-0.026412466540932655,
0.030803179368376732,
0.02948855608701706,
-0.08069716393947601,
0.18223951756954193,
0.08650121837854385,
0.015566053800284863,
0.05522974580526352,
-0.14724627137184143,
0.03131973743438721,
0.02440095879137516,
0.05665644258260727,
0.0819760262966156,
0.10451953113079071,
-0.04917977377772331,
0.12769711017608643,
-0.10844046622514725,
0.1269204467535019,
0.10840320587158203,
-0.3132908046245575,
-0.025263410061597824,
0.09782044589519501,
0.053383227437734604,
0.04960740730166435,
-0.017406128346920013,
0.07349444925785065,
0.012524980120360851,
0.005757234990596771,
-0.024108856916427612,
-0.090741828083992,
-0.10804229229688644,
0.0025802780874073505,
-0.12207242846488953,
-0.0312507227063179,
0.2753126621246338,
-0.05321979895234108,
0.05421370267868042,
-0.07260841876268387,
-0.09948404133319855,
-0.0012321870308369398,
-0.051414262503385544,
-0.031007586047053337,
-0.07413844764232635,
0.070799820125103,
-0.04290544241666794,
-0.07663960009813309,
-0.11865077912807465,
-0.014100334607064724,
-0.17957398295402527,
0.11452561616897583,
0.02624872885644436,
0.03543620556592941,
-0.2516584098339081,
0.08936139941215515,
-0.0031427196227014065,
-0.06874584406614304,
0.027805466204881668,
-0.08314155787229538,
0.02367394231259823,
-0.010902779176831245,
-0.026146618649363518,
-0.053208667784929276,
0.07552628219127655,
0.15447166562080383,
0.041121721267700195,
0.03664322942495346,
-0.05812865495681763,
0.040009625256061554,
0.038118090480566025,
0.081952765583992,
-0.0011447365395724773,
-0.09269990026950836,
0.03392736613750458,
-0.10638965666294098,
-0.0028007226064801216,
-0.062189023941755295,
-0.16093836724758148,
-0.018293924629688263,
0.07673272490501404,
0.05105277895927429,
0.045463450253009796,
0.1360863894224167,
-0.02268514409661293,
-0.06596532464027405,
0.03300569951534271,
-0.022528842091560364,
-0.01710580848157406,
-0.001994533697143197,
-0.01239493303000927,
0.13846659660339355,
0.020891990512609482,
0.05952373147010803,
-0.13955402374267578,
0.020700091496109962,
-0.046363718807697296,
-0.010944338515400887,
-0.016103606671094894,
-0.056066013872623444,
-0.006348932161927223,
-0.06065972149372101,
0.011246575973927975,
-0.15107929706573486,
-0.14972269535064697,
0.016733307391405106,
0.0023195105604827404,
-0.042483843863010406,
-0.10236800462007523,
-0.10350798070430756,
-0.020273249596357346,
0.041524577885866165,
-0.05777978152036667,
-0.01827893778681755,
-0.048734650015830994,
0.0775013193488121,
-0.010689393617212772,
0.10422955453395844,
-0.09213235229253769,
0.07134869694709778,
-0.1116650253534317,
-0.030721645802259445,
-0.10715924203395844,
0.12671136856079102,
0.003884911071509123,
0.0795627161860466,
-0.03922916203737259,
-0.014204226434230804,
-0.08544698357582092,
0.05862467736005783,
-0.03185536339879036,
0.21963199973106384,
-0.09248260408639908,
-0.1127559244632721,
0.3300940990447998,
-0.056874074041843414,
-0.10825072973966599,
0.12993735074996948,
0.005319781135767698,
0.1112266406416893,
0.14707110822200775,
0.19123336672782898,
-0.018906287848949432,
-0.015039702877402306,
0.09462311863899231,
0.08223094046115875,
-0.08552078157663345,
0.002656049095094204,
0.007933031767606735,
-0.027577396482229233,
-0.09085771441459656,
0.032219596207141876,
0.09730471670627594,
0.062094539403915405,
-0.043527692556381226,
-0.02042439579963684,
-0.0023574780207127333,
-0.012878184206783772,
0.09896224737167358,
-0.03455539792776108,
0.11841995269060135,
-0.04976245015859604,
-0.051429327577352524,
0.04371537268161774,
0.026646610349416733,
-0.03803396224975586,
0.021768843755126,
-0.09643037617206573,
0.06687277555465698,
-0.01603199541568756,
0.06295531988143921,
-0.14634500443935394,
-0.016781672835350037,
-0.03127778321504593,
0.18542690575122833,
0.09437859058380127,
0.11419198662042618,
0.06560567766427994,
-0.05135628953576088,
-0.0030475594103336334,
0.041968684643507004,
0.16946348547935486,
-0.012865735217928886,
-0.06476165354251862,
-0.10030792653560638,
0.11083060503005981,
-0.03504772484302521,
0.15532106161117554,
-0.075778067111969,
0.009210016578435898,
0.020914696156978607,
0.11573626101016998,
0.0055211391299963,
0.03273232281208038,
0.012462086975574493,
-0.00728465523570776,
-0.06750065833330154,
-0.002542224247008562,
0.09162965416908264,
-0.004304667469114065,
-0.07811814546585083,
0.21730560064315796,
-0.18604367971420288,
0.0896584540605545,
0.1948823779821396,
-0.24728497862815857,
-0.00042127398774027824,
-0.08694037050008774,
-0.02404654398560524,
-0.002207459881901741,
0.05579051375389099,
-0.04229608178138733,
0.196646586060524,
-0.025955643504858017,
0.17084665596485138,
-0.04512740299105644,
-0.036841753870248795,
-0.01774219051003456,
-0.07603107392787933,
0.013623842969536781,
0.08293913304805756,
0.12080909311771393,
-0.19053524732589722,
0.13681869208812714,
0.09785664081573486,
0.033617034554481506,
0.1873069703578949,
0.027562135830521584,
0.001004849560558796,
0.05986462160944939,
0.014113259501755238,
-0.010991341434419155,
-0.08407854288816452,
-0.3049880862236023,
-0.029557131230831146,
0.07170934975147247,
0.033082012087106705,
0.11650973558425903,
-0.09844337403774261,
-0.0029504061676561832,
0.004346542991697788,
-0.01827302947640419,
0.011005264706909657,
0.10354851186275482,
0.022237805649638176,
0.12154003977775574,
-0.016782639548182487,
-0.06537976861000061,
0.059408463537693024,
0.027555182576179504,
-0.09076374769210815,
0.16473320126533508,
-0.12582539021968842,
-0.3819532096385956,
-0.10481125861406326,
-0.17308494448661804,
-0.06760270893573761,
0.05354464426636696,
0.09581878781318665,
-0.10960988700389862,
-0.027113651856780052,
-0.010027620941400528,
0.07040185481309891,
-0.08909596502780914,
0.0190082136541605,
-0.046499043703079224,
0.01570839248597622,
-0.10057121515274048,
-0.09177613258361816,
-0.057424817234277725,
-0.025149455294013023,
-0.06755081564188004,
0.12857185304164886,
-0.16990280151367188,
0.04386628419160843,
0.2439245581626892,
0.031471699476242065,
0.05025913938879967,
-0.05238485708832741,
0.22669653594493866,
-0.10430474579334259,
0.017450738698244095,
0.16814713180065155,
-0.03645051643252373,
0.04997694864869118,
0.13166801631450653,
-0.029833650216460228,
-0.08878935873508453,
0.049876391887664795,
-0.008052094839513302,
-0.07126931846141815,
-0.2120329886674881,
-0.12451908737421036,
-0.12196944653987885,
0.10322420299053192,
0.03977802395820618,
0.040038418024778366,
0.20542964339256287,
0.05805192142724991,
-0.033322934061288834,
0.06450275331735611,
0.07440955191850662,
0.08175086975097656,
0.2440272569656372,
-0.06557337939739227,
0.15339559316635132,
-0.02594643086194992,
-0.1776258647441864,
0.08364897966384888,
0.049746979027986526,
0.10428023338317871,
0.06392671912908554,
0.11822064965963364,
0.014749612659215927,
0.027166366577148438,
0.1312617063522339,
0.08865731954574585,
0.01744505949318409,
-0.030479565262794495,
-0.03965257853269577,
-0.04055566340684891,
-0.03758636862039566,
0.033622078597545624,
0.0765833705663681,
-0.15789736807346344,
-0.015502721071243286,
-0.003301248885691166,
0.0711374580860138,
0.07245378196239471,
0.0826130285859108,
-0.17823588848114014,
-0.0027189217507839203,
0.07385151088237762,
-0.040918342769145966,
-0.10204386711120605,
0.08303989470005035,
0.008077952079474926,
-0.14240607619285583,
0.062041737139225006,
-0.016901547089219093,
0.12493360042572021,
-0.08656609058380127,
0.07170086354017258,
-0.12904703617095947,
-0.058955952525138855,
0.0045429496094584465,
0.11004102975130081,
-0.32539528608322144,
0.15622134506702423,
-0.013229679316282272,
-0.038909684866666794,
-0.13322731852531433,
-0.011760580353438854,
0.0056118834763765335,
0.09222495555877686,
0.07077004760503769,
-0.006424888037145138,
0.006824800744652748,
-0.012909552082419395,
-0.07980544120073318,
0.01536946278065443,
0.11428610235452652,
-0.01162651740014553,
0.009509209543466568,
-0.07166467607021332,
-0.022994529455900192,
-0.004635294899344444,
-0.12579673528671265,
-0.025411171838641167,
-0.17828980088233948,
0.08268708735704422,
0.07852242887020111,
0.06366796046495438,
0.029781974852085114,
-0.010858302935957909,
-0.018558554351329803,
0.2510567009449005,
-0.01115710660815239,
-0.08417841047048569,
-0.08984120190143585,
-0.05778167024254799,
0.06942864507436752,
-0.0603160597383976,
0.04275839030742645,
-0.04671494662761688,
0.016987435519695282,
-0.039977915585041046,
-0.17433667182922363,
0.12834778428077698,
-0.10418187081813812,
-0.03887781500816345,
-0.022470606490969658,
0.18463316559791565,
-0.02958020195364952,
0.03407106548547745,
0.04692304879426956,
-0.007855504751205444,
-0.09004218131303787,
-0.0825444906949997,
-0.04328109323978424,
0.039218224585056305,
-0.044182173907756805,
0.036801859736442566,
-0.0014952905476093292,
-0.06898144632577896,
-0.07128775864839554,
-0.01729397103190422,
0.320848673582077,
0.12186089903116226,
-0.02232980728149414,
0.1786716878414154,
0.0790676400065422,
-0.06033547222614288,
-0.265683650970459,
-0.0826214924454689,
-0.07087855041027069,
-0.04194710776209831,
-0.08143045008182526,
-0.14157144725322723,
0.07591742277145386,
-0.04431021213531494,
-0.014462728053331375,
0.13048425316810608,
-0.3026347756385803,
-0.11498863995075226,
0.1780327558517456,
-0.021354172378778458,
0.3410193920135498,
-0.10013137012720108,
-0.07898551225662231,
-0.053295549005270004,
-0.11049683392047882,
0.15186837315559387,
-0.05729467421770096,
0.10780692100524902,
0.023142775520682335,
0.16142535209655762,
0.05598701536655426,
-0.0069459122605621815,
0.06120319664478302,
0.0012066310737282038,
-0.06871819496154785,
-0.10683507472276688,
-0.047224510461091995,
0.05032733082771301,
0.03465444594621658,
0.04318351298570633,
-0.0569368340075016,
0.021075952798128128,
-0.1052132397890091,
-0.06498857587575912,
-0.07215135544538498,
0.04641323536634445,
0.02302243933081627,
-0.07137392461299896,
0.007292182184755802,
-0.0637187510728836,
-0.005515751428902149,
0.013674437999725342,
0.12224247306585312,
-0.09033751487731934,
0.11800409108400345,
0.11036355793476105,
0.1687176376581192,
-0.1259620636701584,
0.009467369876801968,
-0.05308152735233307,
-0.053612858057022095,
0.057875439524650574,
-0.06638520956039429,
0.03255081921815872,
0.0960923582315445,
-0.02717808447778225,
0.09560807794332504,
0.08308769762516022,
-0.010722557082772255,
0.01575128547847271,
0.10988503694534302,
-0.23825854063034058,
-0.06012040749192238,
-0.07570823282003403,
-0.00317056174390018,
0.06118251383304596,
0.10773532092571259,
0.20682403445243835,
-0.01595160737633705,
-0.032028548419475555,
0.006161438766866922,
0.025081193074584007,
-0.051426712423563004,
0.10175301134586334,
-0.00850438978523016,
0.009306540712714195,
-0.1542714387178421,
0.052623700350522995,
0.04047539830207825,
-0.09609611332416534,
0.04305865615606308,
0.1389177441596985,
-0.10646723210811615,
-0.10669606178998947,
-0.01962330751121044,
0.0910460501909256,
-0.11303894221782684,
-0.02294110506772995,
-0.0457809679210186,
-0.11326159536838531,
0.06771966814994812,
0.10676945000886917,
0.04026808589696884,
0.06627380847930908,
-0.09423818439245224,
-0.011365479789674282,
-0.05125509202480316,
0.006403600797057152,
0.06916820257902145,
-0.012939929962158203,
-0.06269463896751404,
0.09851591289043427,
-0.01830287091434002,
0.09664984792470932,
-0.0924367681145668,
-0.10987381637096405,
-0.15888582170009613,
0.04101009666919708,
-0.12641514837741852,
-0.07627630233764648,
-0.09730367362499237,
-0.03739850968122482,
0.0058367811143398285,
-0.008786050602793694,
-0.03754468262195587,
-0.03609686717391014,
-0.10068465024232864,
0.04032355546951294,
-0.04539433866739273,
0.003346766112372279,
-0.08381801843643188,
0.035338759422302246,
0.07231274247169495,
-0.033457085490226746,
0.13417546451091766,
0.14116661250591278,
-0.11637914925813675,
0.09046107530593872,
-0.09213077276945114,
-0.0685385912656784,
0.11235214024782181,
0.016551129519939423,
0.06272143870592117,
0.07092241942882538,
0.005486485082656145,
0.0644557923078537,
0.07693806290626526,
0.03860459849238396,
0.020590703934431076,
-0.08144107460975647,
0.0346941202878952,
-0.05244690924882889,
-0.14118261635303497,
-0.044581733644008636,
-0.010089093819260597,
0.019179321825504303,
0.018675396218895912,
0.10204403847455978,
-0.0609918013215065,
0.07132266461849213,
-0.043223533779382706,
0.026843715459108353,
-0.0013395235873758793,
-0.1537885069847107,
-0.027840128168463707,
-0.07761774957180023,
0.03960075229406357,
0.0007268866756930947,
0.1744314283132553,
0.03836313635110855,
0.04899048060178757,
0.007268973626196384,
0.04579585790634155,
0.043670959770679474,
0.008297844789922237,
0.19502878189086914,
0.12569840252399445,
-0.06398509442806244,
-0.11328578740358353,
0.07648259401321411,
0.04158511757850647,
0.025973070412874222,
0.11374230682849884,
-0.02357502095401287,
-0.09419481456279755,
0.08241459727287292,
-0.00551952887326479,
0.04556173086166382,
-0.18973831832408905,
-0.1218472272157669,
-0.041987039148807526,
0.06343447417020798,
-0.05499844253063202,
0.15843304991722107,
0.15773314237594604,
-0.023186124861240387,
0.006000488996505737,
-0.008139461278915405,
-0.07137639820575714,
-0.17805713415145874,
-0.2197026014328003,
-0.08489017933607101,
-0.14077261090278625,
-0.0036345769185572863,
-0.13743042945861816,
0.04820366948843002,
0.008843403309583664,
0.09339353442192078,
-0.06589148938655853,
0.08298983424901962,
0.010393290780484676,
-0.12201486527919769,
0.07560597360134125,
-0.040533825755119324,
0.0666273832321167,
-0.04280650615692139,
0.0027543455362319946,
-0.10044848918914795,
0.06075272709131241,
0.03327653184533119,
0.039157260209321976,
-0.07089508324861526,
0.025202501565217972,
-0.11633367091417313,
-0.07882300764322281,
-0.052585139870643616,
0.062323637306690216,
-0.008791645057499409,
0.11548975110054016,
0.03309089317917824,
-0.05045447126030922,
0.02711884304881096,
0.22849680483341217,
-0.08333233743906021,
-0.1600600779056549,
-0.07880116254091263,
0.20572257041931152,
-0.00860708300024271,
0.11137383431196213,
-0.025939971208572388,
-0.007036937400698662,
-0.09459050744771957,
0.3766510486602783,
0.29616108536720276,
-0.07818101346492767,
0.00529971020296216,
-0.007820661179721355,
0.03948690742254257,
0.09469813108444214,
0.11991974711418152,
0.08580390363931656,
0.33686068654060364,
-0.05757066607475281,
-0.04474802687764168,
-0.006766926497220993,
-0.04308084398508072,
-0.08298500627279282,
0.030343279242515564,
0.06408356875181198,
-0.06446751207113266,
-0.012421103194355965,
0.1145060658454895,
-0.2720162868499756,
0.12966600060462952,
-0.19977232813835144,
-0.1537483185529709,
-0.08169860392808914,
-0.005203427281230688,
0.06630567461252213,
0.03637923300266266,
0.0865963026881218,
-0.002816703636199236,
-0.07224106043577194,
0.06227362900972366,
0.03009752184152603,
-0.21256989240646362,
0.017692655324935913,
0.08258357644081116,
-0.034081704914569855,
-0.05942942202091217,
-0.034341778606176376,
0.06390594691038132,
0.07758176326751709,
0.0639234185218811,
-0.006368638016283512,
0.010090604424476624,
-0.0036463970318436623,
-0.03553808480501175,
0.054850734770298004,
0.04378902167081833,
0.0073180850595235825,
-0.06993220746517181,
0.09067346900701523,
-0.14202576875686646,
0.042458564043045044,
-0.0003038155846297741,
0.01839541271328926,
-0.03462358936667442,
0.022495876997709274,
-0.07232709228992462,
0.07424923777580261,
0.1062895655632019,
-0.015886763110756874,
0.007781245280057192,
-0.04361455887556076,
-0.02026071399450302,
-0.028770674020051956,
-0.09261920303106308,
-0.09954306483268738,
-0.17496353387832642,
-0.14447878301143646,
0.06155117601156235,
-0.001919364556670189,
-0.18733331561088562,
0.023164093494415283,
-0.12315784394741058,
0.06466542929410934,
-0.14240014553070068,
0.10682518035173416,
0.10610894113779068,
0.017361830919981003,
-0.011512926779687405,
0.02455800399184227,
0.02503582462668419,
0.10739293694496155,
-0.13242177665233612,
-0.07190727442502975
] |
null | null | transformers |
!!!
At the moment only a distilled version, taken from one of the first checkpoints, is available for download.
We plan to publish the full model in the next few days.
!!!
This is a distilled HRBert model for an MLM (masked language modelling) task.
The model can be used with the fill-mask pipeline as shown below; sentence embeddings can also be produced from the encoder outputs (a sketch follows the pipeline call inside the code block).
```python
# pip install transformers
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model='RabotaRu/HRBert-mini',
tokenizer='RabotaRu/HRBert-mini'
)
# Predict candidates for the masked word, here a job title ('<mask> на склад' ≈ '<mask> for the warehouse').
fill_mask('<mask> на склад')
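# --- Not part of the original card: a minimal sketch of producing sentence
# --- embeddings by mean-pooling the encoder's last hidden states. The pooling
# --- strategy and the example phrases are assumptions, not the authors' recipe.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('RabotaRu/HRBert-mini')
model = AutoModel.from_pretrained('RabotaRu/HRBert-mini')

sentences = ['кладовщик на склад', 'менеджер по продажам']  # example job phrases (Russian)
encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    outputs = model(**encoded)

# Mean pooling over token positions, ignoring padding.
mask = encoded['attention_mask'].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # (number of sentences, hidden size)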
``` | {"language": ["ru", "en", "be", "bg", "uk", "ro", "kz", "tg", "tat", "sv", "sl", "sr", "uz", "es", "fi"], "license": "mit", "tags": ["russian", "fill-mask", "pretraining", "embeddings", "masked-lm"], "widget": [{"text": "<mask> \u043d\u0430 \u0441\u043a\u043b\u0430\u0434"}]} | fill-mask | RabotaRu/HRBert-mini | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"russian",
"pretraining",
"embeddings",
"masked-lm",
"ru",
"en",
"be",
"bg",
"uk",
"ro",
"kz",
"tg",
"tat",
"sv",
"sl",
"sr",
"uz",
"es",
"fi",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"ru",
"en",
"be",
"bg",
"uk",
"ro",
"kz",
"tg",
"tat",
"sv",
"sl",
"sr",
"uz",
"es",
"fi"
] | TAGS
#transformers #pytorch #safetensors #roberta #fill-mask #russian #pretraining #embeddings #masked-lm #ru #en #be #bg #uk #ro #kz #tg #tat #sv #sl #sr #uz #es #fi #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
!!!
At the moment only a distilled version, taken from one of the first checkpoints, is available for download.
We plan to publish the full model in the next few days.
!!!
This is a distilled HRBert model for an MLM (masked language modelling) task.
Sentence embeddings can be produced from the encoder outputs.
| [] | [
"TAGS\n#transformers #pytorch #safetensors #roberta #fill-mask #russian #pretraining #embeddings #masked-lm #ru #en #be #bg #uk #ro #kz #tg #tat #sv #sl #sr #uz #es #fi #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
94
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #roberta #fill-mask #russian #pretraining #embeddings #masked-lm #ru #en #be #bg #uk #ro #kz #tg #tat #sv #sl #sr #uz #es #fi #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.04216274991631508,
-0.0928809717297554,
-0.00817276630550623,
0.0315968319773674,
0.1095675528049469,
0.006932755932211876,
0.22864870727062225,
0.04690135270357132,
0.12196759879589081,
0.015653865411877632,
0.20346660912036896,
0.10710650682449341,
-0.02667253650724888,
0.15467435121536255,
0.04303202033042908,
-0.21714915335178375,
0.055045731365680695,
-0.050217486917972565,
-0.11501538008451462,
0.11166274547576904,
0.11573226004838943,
-0.03373008966445923,
0.06190505623817444,
-0.05324360355734825,
-0.0445837639272213,
0.032900042831897736,
0.04956303909420967,
-0.08116867393255234,
0.12685376405715942,
0.06130464747548103,
0.1374531239271164,
0.060348909348249435,
0.03854480758309364,
-0.12722350656986237,
0.05585508421063423,
-0.018811363726854324,
-0.09937695413827896,
0.026258738711476326,
-0.011800896376371384,
-0.07162278145551682,
0.09830017387866974,
-0.05840744823217392,
-0.031011993065476418,
0.041641268879175186,
-0.1545911729335785,
-0.16875648498535156,
-0.026796478778123856,
0.11136926710605621,
0.04278126358985901,
0.1129353865981102,
-0.012760083191096783,
0.16430461406707764,
-0.14087869226932526,
0.09234453737735748,
0.15716461837291718,
-0.3429279029369354,
-0.024841414764523506,
0.03510609269142151,
-0.00666961632668972,
0.04541458934545517,
-0.0450759120285511,
0.0746159702539444,
0.03262133151292801,
-0.037148617208004,
-0.00383109413087368,
-0.027582423761487007,
0.008345888927578926,
0.014563209377229214,
-0.12341289222240448,
-0.051013946533203125,
0.15581592917442322,
-0.03322504460811615,
0.03179693594574928,
0.04533540457487106,
-0.02998652681708336,
-0.10348842293024063,
-0.00453732255846262,
0.04089144617319107,
-0.039526987820863724,
0.007259439677000046,
-0.04739749804139137,
-0.008313215337693691,
-0.1347334384918213,
-0.031512051820755005,
-0.12276607006788254,
0.2947717308998108,
0.03425104171037674,
0.0591108612716198,
-0.11087820678949356,
0.052269406616687775,
-0.03405153378844261,
-0.10808845609426498,
0.06004074588418007,
-0.04538703337311745,
0.026176534593105316,
0.045781057327985764,
0.0012772275367751718,
-0.020137103274464607,
0.1105991080403328,
0.16978716850280762,
-0.021172238513827324,
0.05573341250419617,
0.08411496877670288,
0.16073693335056305,
-0.07208698987960815,
0.04645983502268791,
0.009707546792924404,
0.017896635457873344,
-0.03495735302567482,
-0.03675509989261627,
0.04025445505976677,
0.005749206990003586,
-0.14808230102062225,
-0.09246376901865005,
-0.020598191767930984,
0.0949004590511322,
-0.039636559784412384,
0.11123451590538025,
-0.03463525325059891,
0.10394630581140518,
0.011025534942746162,
-0.050844114273786545,
-0.030959002673625946,
-0.0027875404339283705,
0.02708894945681095,
-0.08837928622961044,
-0.005872307810932398,
-0.0182323195040226,
0.02179799973964691,
0.226754292845726,
-0.04965725168585777,
0.039327822625637054,
-0.018917197361588478,
-0.06910610944032669,
0.0438484326004982,
-0.17889750003814697,
0.06032341718673706,
-0.18945933878421783,
-0.09594222158193588,
0.030157333239912987,
0.03944559767842293,
0.017649350687861443,
0.05107923969626427,
-0.025758013129234314,
-0.0879402905702591,
0.007429538760334253,
-0.05072978883981705,
-0.05823809280991554,
-0.0829235166311264,
0.09533735364675522,
-0.04789428785443306,
0.09728632122278214,
-0.05330090597271919,
-0.005144186783581972,
-0.1330384761095047,
-0.005063800141215324,
-0.09668023884296417,
-0.06310927122831345,
-0.12132063508033752,
0.11822135001420975,
0.03508575260639191,
-0.05992354080080986,
-0.08467163145542145,
0.06509814411401749,
0.039297740906476974,
0.1657661348581314,
-0.10441885888576508,
-0.10101525485515594,
0.281254380941391,
-0.13521064817905426,
-0.04191845655441284,
0.09142369031906128,
0.00206286390312016,
-0.007376085966825485,
0.02681546099483967,
0.13893571496009827,
-0.007057603448629379,
-0.15847961604595184,
-0.015184978954494,
0.07399941980838776,
-0.14846940338611603,
0.005107802804559469,
0.08096230775117874,
0.0020215201657265425,
-0.04059821367263794,
0.049616970121860504,
-0.024432756006717682,
0.10232473909854889,
-0.06830938905477524,
-0.0921456441283226,
0.0038100096862763166,
-0.05250765010714531,
0.15904846787452698,
0.007551113609224558,
0.08000147342681885,
-0.1593778431415558,
-0.09587673097848892,
-0.10937643051147461,
0.06381461769342422,
0.08948927372694016,
0.004183429758995771,
-0.14113585650920868,
0.07291164994239807,
0.06525272130966187,
-0.019504614174365997,
-0.08183681964874268,
0.002896852558478713,
-0.06962193548679352,
0.08123841881752014,
0.05864528566598892,
0.08436933904886246,
0.12702012062072754,
0.0189957395195961,
-0.06941241025924683,
-0.06557735055685043,
-0.01781633496284485,
-0.012996591627597809,
-0.010862167924642563,
-0.22588476538658142,
0.06597159057855606,
-0.04451600834727287,
0.031446076929569244,
-0.07180812209844589,
0.025034960359334946,
0.0487825870513916,
0.10729875415563583,
-0.007635527290403843,
0.0689430832862854,
-0.09304288774728775,
0.03687724098563194,
-0.03815965726971626,
0.04481726139783859,
0.09947281330823898,
-0.03810616582632065,
-0.1004142016172409,
0.11008398979902267,
-0.050530385226011276,
0.27371394634246826,
0.17205855250358582,
-0.15589594841003418,
-0.07580877095460892,
0.07513006031513214,
-0.071835957467556,
0.029398808255791664,
0.08172160387039185,
0.032719969749450684,
-0.018662748858332634,
-0.0246370080858469,
0.09462577104568481,
-0.0370924137532711,
-0.013058433309197426,
0.04467325285077095,
-0.11916367709636688,
-0.08788961917161942,
0.11290603131055832,
0.08102160692214966,
-0.18547990918159485,
0.21006597578525543,
0.2223185896873474,
-0.015137393027544022,
0.2578302025794983,
-0.014843766577541828,
0.01772143691778183,
-0.03275422751903534,
0.032687149941921234,
-0.022387171164155006,
0.2005940079689026,
-0.11567161232233047,
-0.033824171870946884,
-0.005050081759691238,
-0.020411629229784012,
0.018413562327623367,
-0.14635339379310608,
-0.11105180531740189,
-0.03657850995659828,
-0.04983166232705116,
-0.09876016527414322,
0.13226568698883057,
-0.06590688228607178,
0.12318871915340424,
-0.0029596658423542976,
-0.10830020159482956,
0.06604540348052979,
-0.026529386639595032,
-0.06572859734296799,
0.130674347281456,
-0.14965255558490753,
-0.17337779700756073,
-0.10967585444450378,
-0.062329769134521484,
0.0103905089199543,
0.016705133020877838,
0.05494071915745735,
-0.18257644772529602,
-0.022161301225423813,
0.04922903701663017,
0.047344621270895004,
-0.03103874810039997,
0.04601354897022247,
-0.08139711618423462,
0.09753121435642242,
-0.03298055753111839,
-0.032041098922491074,
-0.06811399012804031,
-0.06778241693973541,
-0.012123234570026398,
0.14869864284992218,
-0.028950542211532593,
0.04153570905327797,
0.01661490648984909,
-0.023584984242916107,
0.02251654490828514,
-0.06697668880224228,
0.14047464728355408,
-0.12062997370958328,
-0.006834435276687145,
0.06072899326682091,
-0.04172338917851448,
0.057004351168870926,
0.19263707101345062,
0.05866912007331848,
-0.021242549642920494,
0.009356915950775146,
-0.03868938982486725,
-0.08592980355024338,
-0.1511395275592804,
-0.12568405270576477,
-0.04929337278008461,
0.09549576789140701,
-0.033567048609256744,
0.06090868264436722,
0.033728525042533875,
0.06797726452350616,
-0.02375655248761177,
-0.13328823447227478,
-0.0030829012393951416,
0.02914903126657009,
0.02346910536289215,
-0.046031802892684937,
0.09564296901226044,
-0.07819268852472305,
-0.07810071855783463,
0.05034422501921654,
-0.07317950576543808,
0.10497833788394928,
0.07495336979627609,
-0.09672318398952484,
0.12591159343719482,
0.18867354094982147,
0.11625386029481888,
0.06647304445505142,
0.0690750777721405,
-0.07671086490154266,
0.03560395911335945,
-0.05871084704995155,
-0.03096526488661766,
0.03946765512228012,
-0.017650797963142395,
-0.01009557768702507,
-0.024693602696061134,
-0.06888259947299957,
0.10486733913421631,
0.07078462839126587,
0.04208517074584961,
-0.1130901575088501,
-0.07521339505910873,
0.0159293320029974,
0.051653116941452026,
-0.00898486003279686,
0.04297464340925217,
0.04494595527648926,
-0.10109259188175201,
0.07003520429134369,
-0.01882951892912388,
0.030078941956162453,
0.1410231739282608,
0.08318781107664108,
-0.03719716519117355,
-0.03089308924973011,
-0.04685463383793831,
0.09607867151498795,
-0.33151382207870483,
0.35654523968696594,
-0.0019024207722395658,
0.03500673547387123,
-0.05244586989283562,
-0.07850014418363571,
0.05207740515470505,
0.18380571901798248,
0.1729598194360733,
0.05881670489907265,
-0.13294054567813873,
-0.18600624799728394,
0.005056384950876236,
0.01666044257581234,
0.1407092809677124,
0.001654540654271841,
0.026297109201550484,
-0.030336668714880943,
-0.022161005064845085,
-0.0045659346505999565,
0.021025726571679115,
-0.08233693987131119,
-0.08422769606113434,
0.024276889860630035,
0.0389380156993866,
0.01567683182656765,
-0.07571181654930115,
-0.058755241334438324,
-0.12514042854309082,
0.061146654188632965,
-0.17090237140655518,
-0.046009697020053864,
-0.046774186193943024,
-0.054579418152570724,
0.018221000209450722,
-0.10367505252361298,
0.03720616176724434,
-0.009491312317550182,
-0.03363831341266632,
-0.09106315672397614,
-0.059614118188619614,
0.10552771389484406,
-0.11874478310346603,
-0.1315990537405014,
-0.04875140264630318,
0.1500759869813919,
0.05083192512392998,
0.06698308140039444,
-0.0639285147190094,
0.019804252311587334,
-0.010310208424925804,
-0.06479466706514359,
0.07221835106611252,
-0.06743606925010681,
0.002505122683942318,
0.0743694081902504,
-0.12178574502468109,
-0.09902047365903854,
-0.07508102804422379,
-0.09154132753610611,
0.1532316952943802,
0.41153398156166077,
-0.09733826667070389,
0.14751172065734863,
0.14600315690040588,
0.000010946499060082715,
-0.34842821955680847,
-0.0801204964518547,
-0.11515400558710098,
0.05665302276611328,
0.019868483766913414,
-0.10790199786424637,
-0.06345194578170776,
-0.028444841504096985,
-0.06512797623872757,
0.11533840000629425,
-0.1880580335855484,
-0.11590726673603058,
0.19380345940589905,
0.008312428370118141,
0.3651951849460602,
-0.06846386939287186,
-0.008414963260293007,
-0.036645688116550446,
0.030660556629300117,
-0.011334758251905441,
-0.01676701009273529,
0.09497509896755219,
-0.009942453354597092,
0.0007467482937499881,
0.03258373960852623,
-0.058485910296440125,
0.10874690115451813,
-0.07297410815954208,
0.051540691405534744,
-0.14541840553283691,
-0.09507627040147781,
0.1294444501399994,
0.01670559123158455,
-0.04816266521811485,
-0.1096421629190445,
-0.002179150702431798,
-0.006932826247066259,
-0.006925022695213556,
-0.07722146809101105,
0.118341825902462,
-0.024160027503967285,
-0.08939778804779053,
-0.06060710549354553,
0.053607165813446045,
-0.02485564723610878,
-0.012526591308414936,
0.20826806128025055,
-0.041713617742061615,
0.1363145411014557,
0.017528794705867767,
0.05233561247587204,
-0.02441921830177307,
0.03804021328687668,
0.030468974262475967,
-0.06275925040245056,
0.059075478464365005,
-0.060722578316926956,
-0.035319652408361435,
0.08534355461597443,
0.011965613812208176,
0.046506118029356,
0.05261971428990364,
-0.07472167164087296,
-0.011903837323188782,
0.17459626495838165,
-0.2213040590286255,
-0.05736182630062103,
-0.00664388295263052,
-0.02882634662091732,
0.09420161694288254,
0.04307859390974045,
0.08613251894712448,
-0.05729867145419121,
0.037913404405117035,
-0.004669899120926857,
0.02176806330680847,
-0.05996214598417282,
0.08707964420318604,
0.06610842794179916,
0.029357383027672768,
-0.052735425531864166,
0.06726760417222977,
-0.06790006160736084,
-0.09260588139295578,
0.015344226732850075,
0.09313029795885086,
-0.1574379950761795,
-0.11551212519407272,
-0.053215641528367996,
0.11666864156723022,
-0.04768587648868561,
-0.0800132229924202,
-0.08815418183803558,
-0.11178963631391525,
0.03342195972800255,
0.19306938350200653,
0.08057495206594467,
0.019967911764979362,
0.03514450415968895,
-0.0012076651910319924,
0.010558288544416428,
0.08138465136289597,
0.008647088892757893,
-0.006191776599735022,
-0.11659767478704453,
0.09818428754806519,
-0.0077012465335428715,
0.14377351105213165,
-0.07211829721927643,
0.023080358281731606,
-0.14438989758491516,
0.05687975883483887,
0.005857642274349928,
-0.08225405961275101,
-0.045367877930402756,
-0.0681682899594307,
-0.002344811335206032,
-0.11598087102174759,
-0.03502599522471428,
-0.04081571847200394,
-0.10442466288805008,
0.034988027065992355,
0.04155272990465164,
0.02962944470345974,
-0.05341295152902603,
-0.014308998361229897,
0.09859564155340195,
-0.019542135298252106,
0.0871891900897026,
0.17853261530399323,
0.024136686697602272,
0.1671745777130127,
-0.15338873863220215,
0.010783485136926174,
0.052980124950408936,
-0.018393823876976967,
0.02420044131577015,
0.11338101327419281,
-0.01189830806106329,
-0.037209779024124146,
0.02617805078625679,
0.09125678241252899,
0.08369970321655273,
-0.07335502654314041,
0.1333492398262024,
0.062404222786426544,
-0.06922750920057297,
-0.018582766875624657,
0.005617889109998941,
0.048932913690805435,
-0.010831732302904129,
0.19047249853610992,
-0.12328016757965088,
0.06300707161426544,
-0.028434257954359055,
0.055130597203969955,
0.005300808697938919,
-0.14231345057487488,
-0.04786846414208412,
-0.06202811002731323,
0.0003955175634473562,
-0.08893020451068878,
0.14514251053333282,
0.019356966018676758,
-0.09641513228416443,
0.06988122314214706,
-0.0015556523576378822,
-0.05124244838953018,
0.028158308938145638,
0.11164385080337524,
0.03165213018655777,
-0.037297867238521576,
-0.19054414331912994,
0.052862562239170074,
-0.027391057461500168,
-0.1654556393623352,
0.10355295240879059,
0.1316302865743637,
0.08359628915786743,
0.06256790459156036,
0.04169538989663124,
-0.029068080708384514,
-0.03753534331917763,
-0.11580790579319,
-0.052817344665527344,
-0.04540599510073662,
-0.01657559908926487,
0.16503643989562988,
0.27403524518013,
-0.022133316844701767,
0.0341283455491066,
-0.025418056175112724,
-0.022478794679045677,
-0.15614137053489685,
-0.13768909871578217,
-0.03468099981546402,
-0.06408336013555527,
0.06180070713162422,
0.019246293231844902,
-0.008971082046627998,
0.020205488428473473,
0.049823082983493805,
-0.04547715187072754,
0.17559538781642914,
-0.036357998847961426,
-0.025708211585879326,
0.04762258753180504,
-0.0017562081338837743,
-0.011582816950976849,
-0.05431315302848816,
-0.052282240241765976,
-0.05251150578260422,
-0.129381000995636,
-0.04948197677731514,
0.00361832557246089,
-0.09090673923492432,
-0.0009591050329618156,
-0.04534509405493736,
-0.0695207267999649,
-0.029499191790819168,
0.07748117297887802,
0.058902814984321594,
0.06750007718801498,
-0.004369222559034824,
0.03139238432049751,
-0.022614989429712296,
0.17850634455680847,
0.007428046315908432,
-0.12264250963926315,
0.02929014340043068,
0.10498861223459244,
0.045359108597040176,
0.11085151135921478,
-0.02678297460079193,
0.009361154399812222,
-0.02058342844247818,
0.22571930289268494,
0.3736037015914917,
-0.010582460090517998,
0.10498169809579849,
-0.021403836086392403,
0.047827545553445816,
0.00012466037878766656,
0.03313031420111656,
0.05198505148291588,
0.18680483102798462,
-0.1027141660451889,
0.02323015034198761,
-0.08689805120229721,
0.02718927338719368,
-0.08221442252397537,
-0.004662640392780304,
0.04242286458611488,
-0.04160156473517418,
-0.09259185940027237,
0.052532751113176346,
-0.1558876931667328,
-0.008269647136330605,
0.07148939371109009,
-0.17001889646053314,
-0.020007047802209854,
0.010189815424382687,
0.14870768785476685,
0.08947690576314926,
0.1049252450466156,
-0.03240976482629776,
-0.03841728717088699,
-0.11614852398633957,
0.05285806581377983,
-0.11749711632728577,
-0.07891321927309036,
0.0376921072602272,
0.05861015245318413,
0.1559581309556961,
-0.053455691784620285,
0.05668066814541817,
0.1065647155046463,
0.018949873745441437,
-0.0015756010543555021,
0.10711246728897095,
0.053325049579143524,
-0.027945175766944885,
-0.10236397385597229,
-0.03265950456261635,
0.011210307478904724,
-0.05580620467662811,
0.07009276747703552,
-0.08745362609624863,
0.08703210204839706,
-0.02850935235619545,
-0.07821676880121231,
-0.03369593247771263,
0.19687511026859283,
-0.0589541457593441,
0.05397064983844757,
0.10718441009521484,
0.00449249567463994,
-0.061567001044750214,
-0.04716335982084274,
-0.003061803290620446,
0.07110399007797241,
-0.10533430427312851,
-0.037016861140728,
-0.07868562638759613,
-0.01416501123458147,
0.027615070343017578,
0.014184366911649704,
-0.15521353483200073,
-0.042544346302747726,
-0.11323636770248413,
0.0447077639400959,
-0.0956864133477211,
0.00008783398516243324,
0.10056833177804947,
0.07519993185997009,
0.00818649772554636,
-0.1949012279510498,
0.045567587018013,
0.055651675909757614,
-0.13937239348888397,
-0.09313148260116577
] |
null | null | transformers |
### T5 for question-generation
This is a [t5-base](https://arxiv.org/abs/1910.10683) model trained for the answer-aware question generation task. The answer spans are highlighted within the text with special highlight tokens.
You can play with the model using the inference API: just highlight the answer spans with `<hl>` tokens and end the text with `</s>`. For example:
`<hl> 42 <hl> is the answer to life, the universe and everything. </s>`
For more details see [this](https://github.com/patil-suraj/question_generation) repo.
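As a rough, unofficial illustration (not from the original card), the model can also be loaded locally with the `text2text-generation` pipeline; the highlighted-answer input format shown above is assumed to carry over unchanged, and generation settings are left at their defaults:

```python
# pip install transformers
from transformers import pipeline

# Sketch only: the linked repo contains the authors' own question-generation pipelines.
qg = pipeline("text2text-generation", model="Rachneet/t5-base-qg-hl-squadv2")

text = "<hl> 42 <hl> is the answer to life, the universe and everything. </s>"
print(qg(text))  # e.g. a question asking for the highlighted answer
```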
| {"license": "mit", "tags": ["question-generation"], "datasets": ["squad"], "widget": [{"text": "<hl> 42 <hl> is the answer to life, the universe and everything. </s>"}, {"text": "Python is a programming language. It is developed by <hl> Guido Van Rossum <hl>. </s>"}, {"text": "Although <hl> practicality <hl> beats purity </s>"}]} | text2text-generation | Rachneet/t5-base-qg-hl-squadv2 | [
"transformers",
"pytorch",
"jax",
"t5",
"text2text-generation",
"question-generation",
"dataset:squad",
"arxiv:1910.10683",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1910.10683"
] | [] | TAGS
#transformers #pytorch #jax #t5 #text2text-generation #question-generation #dataset-squad #arxiv-1910.10683 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
### T5 for question-generation
This is a t5-base model trained for the answer-aware question generation task. The answer spans are highlighted within the text with special highlight tokens.
You can play with the model using the inference API: just highlight the answer spans with '<hl>' tokens and end the text with '</s>'. For example
'<hl> 42 <hl> is the answer to life, the universe and everything. </s>'
For more details see this repo.
| [
"### T5 for question-generation\r\nThis is t5-base model trained for answer aware question generation task. The answer spans are highlighted within the text with special highlight tokens. \r\n\r\nYou can play with the model using the inference API, just highlight the answer spans with '<hl>' tokens and end the text with '</s>'. For example\r\n\r\n'<hl> 42 <hl> is the answer to life, the universe and everything. </s>'\r\n\r\nFor more deatils see this repo."
] | [
"TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #question-generation #dataset-squad #arxiv-1910.10683 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### T5 for question-generation\r\nThis is t5-base model trained for answer aware question generation task. The answer spans are highlighted within the text with special highlight tokens. \r\n\r\nYou can play with the model using the inference API, just highlight the answer spans with '<hl>' tokens and end the text with '</s>'. For example\r\n\r\n'<hl> 42 <hl> is the answer to life, the universe and everything. </s>'\r\n\r\nFor more deatils see this repo."
] | [
77,
108
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #question-generation #dataset-squad #arxiv-1910.10683 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### T5 for question-generation\r\nThis is t5-base model trained for answer aware question generation task. The answer spans are highlighted within the text with special highlight tokens. \r\n\r\nYou can play with the model using the inference API, just highlight the answer spans with '<hl>' tokens and end the text with '</s>'. For example\r\n\r\n'<hl> 42 <hl> is the answer to life, the universe and everything. </s>'\r\n\r\nFor more deatils see this repo."
] | [
-0.05855387821793556,
0.014719372615218163,
-0.003103340743109584,
0.06638116389513016,
0.10403589904308319,
0.04301740229129791,
0.1288074254989624,
0.14109894633293152,
0.04808737337589264,
0.004110090434551239,
0.13126756250858307,
0.13840089738368988,
0.012349419295787811,
0.12809313833713531,
-0.09196875989437103,
-0.21031475067138672,
0.005878914147615433,
0.024573449045419693,
0.04600443318486214,
0.12953504920005798,
0.02146303653717041,
-0.10025931894779205,
0.09837902337312698,
-0.01306245569139719,
-0.10471833497285843,
0.07049505412578583,
0.06045473366975784,
-0.07246022671461105,
0.15470558404922485,
0.05199044942855835,
-0.0014522848650813103,
0.02574204094707966,
0.026149310171604156,
-0.05219639837741852,
0.057846494019031525,
0.028697937726974487,
-0.03933753818273544,
0.08067994564771652,
-0.025673389434814453,
-0.047550689429044724,
0.11785509437322617,
0.07072271406650543,
-0.020588437095284462,
0.0609232597053051,
-0.07475616782903671,
-0.08674249053001404,
-0.01516745239496231,
0.05383886769413948,
0.056516729295253754,
0.03900305554270744,
0.0005454010097309947,
0.25279131531715393,
-0.04600157588720322,
0.10569436103105545,
0.24646809697151184,
-0.27458539605140686,
-0.004916558042168617,
0.09429213404655457,
-0.016467811539769173,
0.05823293328285217,
-0.024079464375972748,
0.07903429865837097,
0.09459897875785828,
0.026512587442994118,
0.07173026353120804,
-0.09826245158910751,
-0.19423384964466095,
0.05778234079480171,
-0.06790849566459656,
-0.04274497926235199,
0.26882874965667725,
-0.014519580639898777,
-0.018578344956040382,
0.03087094984948635,
-0.12200941890478134,
-0.02523832954466343,
-0.027999764308333397,
-0.041026219725608826,
0.017859628424048424,
-0.01577160693705082,
0.03948143497109413,
-0.030811874195933342,
-0.07670222222805023,
-0.09627146273851395,
-0.12987712025642395,
0.15926694869995117,
0.046053629368543625,
0.08700592070817947,
-0.19959402084350586,
0.07244245707988739,
-0.039249345660209656,
-0.09537161886692047,
-0.04999624565243721,
-0.01905086822807789,
-0.04447291046380997,
-0.032628390938043594,
-0.08026821166276932,
-0.11483382433652878,
0.08942767232656479,
0.08587387204170227,
0.007705236319452524,
0.003137418534606695,
-0.05373571440577507,
0.03873913735151291,
0.05202290415763855,
-0.0032042416278272867,
-0.054880570620298386,
-0.053335342556238174,
0.09727457165718079,
-0.11450009793043137,
-0.029850801452994347,
-0.07098017632961273,
-0.11721639335155487,
-0.04271451756358147,
0.08647116273641586,
0.06801757961511612,
0.008921164087951183,
0.13354630768299103,
0.012084487825632095,
-0.05257155001163483,
-0.10391727834939957,
-0.12788771092891693,
-0.04795286804437637,
0.010632134042680264,
-0.08839060366153717,
0.0786304697394371,
-0.021287625655531883,
-0.03829503804445267,
-0.14647476375102997,
-0.04258374869823456,
-0.1017213761806488,
-0.04199009761214256,
-0.06310391426086426,
-0.09337249398231506,
0.04394804313778877,
0.007115419488400221,
-0.012387756258249283,
-0.10640915483236313,
-0.20271290838718414,
-0.020082104951143265,
0.033157508820295334,
-0.004202182870358229,
-0.01781821995973587,
-0.025247201323509216,
-0.028969641774892807,
-0.00010850158287212253,
-0.07737261801958084,
0.01928933709859848,
-0.044618766754865646,
0.08336493372917175,
-0.013087482191622257,
0.14543890953063965,
-0.13357119262218475,
0.06213686242699623,
-0.12591798603534698,
-0.01839805208146572,
-0.10876724123954773,
0.004788939841091633,
0.03105350211262703,
0.05801268666982651,
-0.0517972931265831,
-0.06623785942792892,
0.04778033122420311,
0.037538353353738785,
0.003105334471911192,
0.14528705179691315,
-0.14181563258171082,
-0.029804827645421028,
0.14370359480381012,
-0.0598202608525753,
-0.21471312642097473,
0.05871449410915375,
-0.012708169408142567,
0.10023347288370132,
0.09668254107236862,
0.1417999416589737,
0.06352701783180237,
-0.077751524746418,
0.0284226406365633,
0.0257849283516407,
-0.1144121065735817,
-0.10945377498865128,
0.00220079580321908,
0.09541193395853043,
-0.1325264722108841,
0.06517930328845978,
-0.07565157115459442,
-0.043997880071401596,
-0.046228352934122086,
-0.035306334495544434,
-0.019288143143057823,
0.006192085798829794,
0.018985774368047714,
0.005900274030864239,
0.06256453692913055,
-0.0646243542432785,
0.011778554879128933,
0.12606465816497803,
0.040537964552640915,
-0.020759517326951027,
0.022313052788376808,
-0.10465433448553085,
0.13175418972969055,
-0.0064759766682982445,
0.067562036216259,
-0.1927342712879181,
-0.0667971819639206,
-0.033482734113931656,
0.15025648474693298,
0.04838256165385246,
0.09715671837329865,
0.01945764757692814,
-0.04003976285457611,
0.004727797117084265,
0.021611329168081284,
0.08986492455005646,
-0.036500558257102966,
-0.04606577008962631,
-0.15169084072113037,
-0.03821214288473129,
-0.02580334059894085,
-0.03769904002547264,
-0.023279739543795586,
-0.0021838543470948935,
-0.002191374311223626,
0.03576226159930229,
-0.02435053512454033,
0.05671403184533119,
0.0022242844570428133,
-0.04550855979323387,
-0.04430009797215462,
0.027336211875081062,
0.04578566551208496,
-0.0342751145362854,
-0.017280729487538338,
0.10357540845870972,
-0.15100538730621338,
0.11320920288562775,
0.15078860521316528,
-0.1094130352139473,
-0.04374763369560242,
0.005542372819036245,
-0.029561784118413925,
0.004816906061023474,
-0.07308172434568405,
-0.005316487513482571,
0.15252798795700073,
0.022595923393964767,
0.10629133880138397,
-0.10685200989246368,
-0.08813566714525223,
-0.011143920011818409,
-0.04874357953667641,
0.05733683705329895,
0.010279959067702293,
0.017021292820572853,
-0.18392403423786163,
0.09858846664428711,
0.023376023396849632,
-0.01974502019584179,
0.16913250088691711,
-0.010478274896740913,
-0.05905511975288391,
0.009677375666797161,
0.05192890018224716,
-0.04066208004951477,
0.04737364128232002,
-0.12044769525527954,
-0.03607028350234032,
0.050796475261449814,
-0.006926143076270819,
-0.009971779771149158,
-0.11500684916973114,
-0.02108057774603367,
-0.06904982030391693,
-0.03752921521663666,
-0.06837243586778641,
0.09114322811365128,
0.037757985293865204,
0.1673116236925125,
-0.0023694599512964487,
-0.04994567111134529,
-0.006507429294288158,
-0.02300923876464367,
-0.12717154622077942,
0.21056799590587616,
-0.05530204996466637,
-0.2314857840538025,
0.00225925724953413,
-0.1066332757472992,
-0.035921383649110794,
0.018944086506962776,
0.07609067857265472,
-0.03495985269546509,
-0.04625881463289261,
-0.035818085074424744,
0.010160119272768497,
0.03600648045539856,
0.007748481351882219,
-0.035579659044742584,
-0.017279155552387238,
-0.03765732795000076,
-0.12403342872858047,
-0.04134945571422577,
-0.061980217695236206,
-0.017038166522979736,
0.12094461172819138,
-0.034762099385261536,
0.023728275671601295,
0.142201766371727,
-0.10760703682899475,
0.050637904554605484,
-0.05202709510922432,
0.14468954503536224,
-0.032398756593465805,
0.061111051589250565,
0.16809162497520447,
-0.0059929233975708485,
0.04297834634780884,
0.12145037204027176,
0.009095879271626472,
-0.0882146954536438,
0.08477359265089035,
0.04865623638033867,
-0.08409765362739563,
-0.17637795209884644,
-0.059859585016965866,
-0.11688967049121857,
0.06614565849304199,
0.07885590940713882,
0.04806593060493469,
0.08514553308486938,
0.03725487366318703,
-0.07621718943119049,
0.07506320625543594,
-0.04591581970453262,
0.1049959734082222,
0.1609601378440857,
-0.028702879324555397,
0.09505883604288101,
-0.02001476101577282,
-0.05198121815919876,
0.08578604459762573,
0.09554655849933624,
0.09265601634979248,
-0.0488094799220562,
0.003069571452215314,
0.07583171129226685,
0.050202783197164536,
0.03208422288298607,
0.13099154829978943,
-0.05881063640117645,
-0.0012095400597900152,
-0.04228287190198898,
-0.08624380826950073,
-0.03863988444209099,
0.08815252780914307,
-0.09131114929914474,
-0.010243255645036697,
-0.0640643984079361,
0.017179518938064575,
0.02785095013678074,
0.16087770462036133,
0.12438564002513885,
-0.23344899713993073,
-0.05856453254818916,
0.06306460499763489,
-0.05247762054204941,
-0.06358316540718079,
0.09133268147706985,
0.04622212052345276,
-0.09584313631057739,
0.0389123298227787,
-0.0030667553655803204,
0.13093706965446472,
0.06717178970575333,
0.07464271783828735,
-0.0870918408036232,
0.018754517659544945,
-0.029777824878692627,
0.11003215610980988,
-0.31661486625671387,
0.1306196004152298,
-0.0006828733021393418,
-0.060013074427843094,
-0.08242673426866531,
-0.02185913361608982,
0.025245554745197296,
0.08428965508937836,
0.10787937790155411,
0.02897154539823532,
0.018996357917785645,
-0.01580289751291275,
0.05922995135188103,
0.08184851706027985,
-0.0009148063836619258,
0.0255307387560606,
0.05481770262122154,
-0.05173704773187637,
0.006259513087570667,
0.004457179922610521,
0.044553812593221664,
-0.012560521252453327,
0.021945539861917496,
0.026586826890707016,
0.044478170573711395,
0.051986537873744965,
-0.0073038795962929726,
-0.03589345142245293,
-0.00943796243518591,
0.11664600670337677,
0.07486329227685928,
-0.09575830399990082,
-0.14356890320777893,
0.10196705162525177,
0.008227607235312462,
-0.08648530393838882,
-0.03150037303566933,
0.0038160572294145823,
0.05612117424607277,
0.00826183520257473,
-0.18337707221508026,
0.14473803341388702,
-0.0379810594022274,
-0.06085212156176567,
-0.02293623797595501,
0.06761964410543442,
-0.0155143728479743,
0.046087924391031265,
0.04854338988661766,
-0.03535667806863785,
-0.09519419819116592,
-0.09666406363248825,
-0.01643010787665844,
-0.07162778079509735,
0.09315743297338486,
-0.012304526753723621,
-0.056919898837804794,
-0.07635268568992615,
0.011695103719830513,
0.09924045205116272,
0.22577184438705444,
0.09344218671321869,
-0.04757948964834213,
0.12595948576927185,
0.13157843053340912,
-0.04192047193646431,
-0.28485509753227234,
-0.03453106805682182,
-0.042596157640218735,
-0.0032905542757362127,
-0.05603010579943657,
-0.11189626902341843,
0.15648506581783295,
-0.027984973043203354,
-0.017950020730495453,
0.026504674926400185,
-0.19243557751178741,
-0.07450360804796219,
0.14904962480068207,
0.12072349339723587,
0.38396239280700684,
-0.07860146462917328,
-0.020059572532773018,
-0.06479928642511368,
-0.22422532737255096,
0.1541605144739151,
-0.20050930976867676,
0.06736019998788834,
-0.07881411164999008,
0.1587960571050644,
0.011794662103056908,
-0.026726311072707176,
-0.02035265602171421,
0.03752885386347771,
0.07245844602584839,
-0.06550561636686325,
-0.052099183201789856,
0.10886934399604797,
-0.0364665724337101,
0.16548892855644226,
-0.13397112488746643,
0.09072750061750412,
-0.08519507944583893,
-0.004075641743838787,
-0.07356712967157364,
0.11424873024225235,
-0.008479829877614975,
-0.12363222986459732,
-0.0196059662848711,
-0.009738985449075699,
0.03956308588385582,
-0.027033265680074692,
0.13022080063819885,
-0.05965167656540871,
0.14915575087070465,
0.09851662814617157,
0.1239023432135582,
-0.16019244492053986,
0.04049735143780708,
-0.0045069437474012375,
-0.0030625034123659134,
0.13850583136081696,
-0.21901144087314606,
0.07546663284301758,
0.11065221577882767,
0.07935357838869095,
0.0668136328458786,
0.08407623320817947,
-0.0073240818455815315,
0.012486488558351994,
0.048880815505981445,
-0.19794286787509918,
-0.04122987389564514,
-0.015893422067165375,
0.0012937074061483145,
-0.012863443233072758,
0.13165980577468872,
0.12441995739936829,
-0.07095081359148026,
-0.06271132826805115,
0.04312412813305855,
-0.01781676709651947,
-0.008171838708221912,
0.07346043735742569,
0.03234764561057091,
0.05756044760346413,
-0.15431776642799377,
-0.006820676848292351,
-0.025153465569019318,
0.04843226447701454,
0.06661700457334518,
0.13139885663986206,
-0.13406379520893097,
-0.08554743230342865,
0.07938908040523529,
0.08712372183799744,
-0.12978018820285797,
-0.02243511751294136,
-0.06519121676683426,
-0.10770163685083389,
0.08261959999799728,
0.22445444762706757,
0.043845076113939285,
0.0070006647147238255,
-0.024357907474040985,
-0.06045891344547272,
-0.0399242602288723,
0.0773281529545784,
-0.07632094621658325,
0.03910385072231293,
-0.02930433861911297,
0.0652562826871872,
-0.027600806206464767,
0.11380220204591751,
-0.08615721762180328,
-0.1096818596124649,
-0.14430993795394897,
0.044377993792295456,
-0.21156567335128784,
-0.020236043259501457,
-0.08568406850099564,
-0.004609873052686453,
-0.02233845926821232,
-0.03242480382323265,
-0.06676028668880463,
0.03926463797688484,
-0.10480403155088425,
0.021568190306425095,
0.026949739083647728,
0.05291347950696945,
-0.14463719725608826,
0.017994683235883713,
0.023646345362067223,
0.011876066215336323,
0.1404772698879242,
0.0838543102145195,
-0.12991034984588623,
0.13515889644622803,
-0.10300516337156296,
-0.06197216361761093,
0.056391049176454544,
0.07311946898698807,
0.14239449799060822,
0.013341736979782581,
0.01308133453130722,
0.03258742019534111,
0.04012906178832054,
0.028544893488287926,
0.02362683415412903,
-0.0686350017786026,
0.012429683469235897,
-0.10999012738466263,
-0.06579701602458954,
-0.06980718672275543,
0.003601126605644822,
-0.0378856398165226,
0.029752269387245178,
0.07361093163490295,
-0.02113419584929943,
0.046212099492549896,
-0.16173973679542542,
-0.0023153165820986032,
0.044375695288181305,
-0.1228874921798706,
-0.16116593778133392,
-0.06316515803337097,
0.03780823200941086,
-0.08564458787441254,
0.09908820688724518,
0.012366639450192451,
0.033763185143470764,
0.006095633842051029,
0.15024499595165253,
0.024510305374860764,
-0.04097108915448189,
0.24307820200920105,
-0.00024292251328006387,
-0.017709819599986076,
0.00710700498893857,
0.07389193773269653,
0.006944017950445414,
-0.04625421762466431,
0.24424031376838684,
-0.03385317325592041,
0.04042286053299904,
0.034334950149059296,
0.0630335807800293,
0.02667749486863613,
-0.09049979597330093,
-0.14467047154903412,
0.008385485038161278,
0.06462268531322479,
0.007112491875886917,
0.026752620935440063,
0.2306324988603592,
-0.04511580988764763,
0.06747361272573471,
-0.049403149634599686,
-0.030745932832360268,
-0.1500423401594162,
-0.1894989013671875,
-0.05772583559155464,
-0.16367846727371216,
0.015394114889204502,
-0.17894046008586884,
-0.007185094524174929,
-0.014617823995649815,
0.07744075357913971,
-0.02824360318481922,
0.0900716558098793,
0.0389491505920887,
-0.10347335785627365,
-0.002769185695797205,
0.008404284715652466,
0.04485413804650307,
0.010955903679132462,
0.06402863562107086,
0.013769938610494137,
-0.028628846630454063,
-0.0015616026939824224,
-0.00030264825909398496,
0.08132653683423996,
0.010353447869420052,
-0.13956277072429657,
-0.05540318414568901,
-0.0032692276872694492,
0.0310369823127985,
-0.04216521605849266,
0.06739408522844315,
0.032032933086156845,
-0.021182721480727196,
0.006286873482167721,
0.25976693630218506,
-0.10179809480905533,
-0.10602045804262161,
-0.07922260463237762,
0.3370479941368103,
0.03805859759449959,
0.02904898300766945,
0.04053504019975662,
-0.018333733081817627,
-0.046350687742233276,
0.23173026740550995,
0.19089366495609283,
-0.01790383830666542,
0.03164207190275192,
-0.026706356555223465,
0.008814437314867973,
0.022162118926644325,
0.1628604531288147,
0.1147223487496376,
0.23577067255973816,
-0.0014374812599271536,
0.03935258835554123,
-0.011784310452640057,
-0.014588347636163235,
-0.021233905106782913,
0.07020188868045807,
0.05957600474357605,
-0.1112746000289917,
0.0026856197509914637,
0.069378562271595,
-0.14132830500602722,
-0.0420808345079422,
-0.08739995211362839,
-0.017135143280029297,
-0.08442183583974838,
-0.03210440278053284,
0.0032177905086427927,
0.04284625127911568,
0.04450594261288643,
-0.02050141990184784,
0.027550702914595604,
0.13689261674880981,
0.022369123995304108,
-0.20832858979701996,
-0.0834355279803276,
0.14021217823028564,
-0.1763816773891449,
0.032192159444093704,
-0.025678008794784546,
0.10553108900785446,
0.04168041795492172,
0.00476519949734211,
-0.04407073184847832,
0.10399357229471207,
0.03698539361357689,
-0.07896026223897934,
0.008585305884480476,
0.043651383370161057,
0.03530043736100197,
-0.006701597478240728,
0.0707099512219429,
-0.05934721976518631,
-0.055715449154376984,
-0.0348522812128067,
0.052619267255067825,
-0.1302931159734726,
0.024999404326081276,
-0.04456163942813873,
0.09516032785177231,
0.08469855040311813,
-0.041959915310144424,
-0.0070451428182423115,
-0.10312753170728683,
0.008647626265883446,
0.024758966639637947,
-0.10345365852117538,
-0.041347257792949677,
-0.09125057607889175,
-0.01669342629611492,
0.013333247043192387,
-0.015168349258601665,
-0.22209571301937103,
0.027585357427597046,
-0.06440187245607376,
-0.02534385770559311,
-0.06957263499498367,
0.05607329308986664,
0.11114940792322159,
0.0482705719769001,
-0.02915792167186737,
-0.11230220645666122,
0.0020718351006507874,
0.1372334212064743,
-0.13679511845111847,
-0.1116495206952095
] |
null | null | transformers | # radical DialoGPT Model | {"tags": ["conversational"]} | text-generation | Radicalkiddo/DialoGPT-small-Radical | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| # radical DialoGPT Model | [
"# radical DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# radical DialoGPT Model"
] | [
51,
7
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# radical DialoGPT Model"
] | [
-0.034180980175733566,
0.03909493610262871,
-0.00620402954518795,
0.008990895003080368,
0.10764410346746445,
-0.009701896458864212,
0.11361576616764069,
0.12469834834337234,
0.0018112309044227004,
-0.052081480622291565,
0.12135562300682068,
0.1748208850622177,
-0.02698943391442299,
0.14774322509765625,
-0.06762948632240295,
-0.2914566695690155,
0.05099174752831459,
0.06250428408384323,
0.02228834107518196,
0.1146172434091568,
0.10854628682136536,
-0.03452562913298607,
0.0781412124633789,
0.019230075180530548,
-0.11983131617307663,
0.012776502408087254,
0.004152384120970964,
-0.11216682940721512,
0.130338653922081,
0.059451907873153687,
0.029754769057035446,
0.022346846759319305,
-0.06220738962292671,
-0.1307293027639389,
0.04261567443609238,
-0.007957530207931995,
-0.048231225460767746,
0.036511074751615524,
0.025553502142429352,
-0.09707273542881012,
0.1635899841785431,
0.09750567376613617,
-0.013071686960756779,
0.03624037653207779,
-0.18160781264305115,
-0.020725896582007408,
-0.00916975736618042,
0.057090409100055695,
0.08019068092107773,
0.10477028042078018,
-0.04709723964333534,
0.10804180800914764,
-0.06540225446224213,
0.12338969111442566,
0.10412324219942093,
-0.2986808121204376,
-0.021219773218035698,
0.07150686532258987,
0.03491653874516487,
0.03973175957798958,
-0.03422418609261513,
0.09245222061872482,
0.023892555385828018,
0.007993386127054691,
-0.04270695522427559,
-0.09104422479867935,
-0.12018366158008575,
-0.0055518923327326775,
-0.1104622483253479,
-0.005826484877616167,
0.20811957120895386,
-0.03888198733329773,
0.06665406376123428,
-0.08052365481853485,
-0.1195201501250267,
-0.036241643130779266,
-0.057720039039850235,
-0.053409114480018616,
-0.08764414489269257,
0.08634606003761292,
0.012424577958881855,
-0.10214896500110626,
-0.10572709143161774,
-0.012990795075893402,
-0.18493033945560455,
0.1539258360862732,
0.03498859703540802,
0.030492760241031647,
-0.2013023942708969,
0.08912080526351929,
-0.04761291295289993,
-0.060309037566185,
0.021721836179494858,
-0.10356717556715012,
0.024238625541329384,
0.00593806616961956,
-0.05004994198679924,
-0.03867997229099274,
0.04999978095293045,
0.13125364482402802,
-0.03257790952920914,
0.002532562240958214,
-0.0110221141949296,
0.04024872928857803,
0.04466303810477257,
0.05021997541189194,
-0.016498522832989693,
-0.05001678317785263,
0.02530200593173504,
-0.07241027802228928,
0.017975101247429848,
-0.05850129947066307,
-0.17184193432331085,
-0.044346340000629425,
0.05612388625741005,
0.05729413032531738,
0.02864736318588257,
0.14158295094966888,
-0.02295053005218506,
-0.04647859185934067,
0.038688432425260544,
-0.02006501890718937,
-0.018168481066823006,
-0.001012651715427637,
0.006325470749288797,
0.1173335537314415,
0.03993126004934311,
0.043768260627985,
-0.12904764711856842,
-0.017169859260320663,
-0.05015145614743233,
0.00442335894331336,
-0.0210418738424778,
-0.03861456736922264,
-0.007312283851206303,
-0.024263780564069748,
0.0005866221617907286,
-0.12439344078302383,
-0.14210958778858185,
0.011783142574131489,
0.002636553253978491,
-0.035260047763586044,
-0.13173212110996246,
-0.10429800301790237,
-0.011281359940767288,
0.035438813269138336,
-0.06078976020216942,
-0.04032537341117859,
-0.05395360291004181,
0.07330063730478287,
-0.0008017968502826989,
0.08792635798454285,
-0.11219053715467453,
0.07707318663597107,
-0.09424932301044464,
-0.015245533548295498,
-0.10115662217140198,
0.12574616074562073,
0.010948487557470798,
0.05442643165588379,
-0.024427097290754318,
-0.024446802213788033,
-0.08880944550037384,
0.08291138708591461,
-0.026460666209459305,
0.2486076056957245,
-0.0933738648891449,
-0.10003045201301575,
0.31365013122558594,
-0.05147821828722954,
-0.11647789180278778,
0.15415579080581665,
-0.006056609563529491,
0.08754834532737732,
0.13130824267864227,
0.1908656358718872,
-0.013100091367959976,
-0.006673917640000582,
0.08982770144939423,
0.10955308377742767,
-0.09472962468862534,
0.012636171653866768,
0.017334094271063805,
-0.03329669311642647,
-0.05904386565089226,
0.02473336271941662,
0.10546907037496567,
0.061276715248823166,
-0.046345580369234085,
-0.03618478775024414,
-0.0018945119809359312,
0.012174902483820915,
0.10501832515001297,
-0.03913574665784836,
0.10704003274440765,
-0.03861748054623604,
-0.06385812908411026,
-0.01671566441655159,
0.026131698861718178,
-0.032275229692459106,
0.04702121391892433,
-0.09283764660358429,
0.07197632640600204,
-0.011564256623387337,
0.0698319524526596,
-0.14069770276546478,
-0.061981696635484695,
-0.04380907490849495,
0.18132610619068146,
0.06415282934904099,
0.1245296522974968,
0.046352170407772064,
-0.0692560002207756,
-0.028459247201681137,
0.03820379450917244,
0.1577269732952118,
-0.03158455342054367,
-0.09457271546125412,
-0.07566750794649124,
0.09589444845914841,
-0.05663331598043442,
0.09368612617254257,
-0.09861121326684952,
0.010366409085690975,
0.03861995413899422,
0.09218165278434753,
-0.015496868640184402,
0.04948118329048157,
0.022402342408895493,
-0.006686381530016661,
-0.03941335529088974,
0.011733770370483398,
0.10680205374956131,
-0.010868696495890617,
-0.09992388635873795,
0.22249092161655426,
-0.1717512309551239,
0.15324412286281586,
0.1693725883960724,
-0.2613601088523865,
-0.00321327056735754,
-0.11238537728786469,
-0.043088942766189575,
0.009295504540205002,
0.07862338423728943,
-0.019248012453317642,
0.18846017122268677,
-0.024704594165086746,
0.1547020524740219,
-0.0420604906976223,
-0.019167570397257805,
-0.053339675068855286,
-0.053526509553194046,
0.010963832028210163,
0.0955488309264183,
0.07422846555709839,
-0.17057794332504272,
0.14995352923870087,
0.11448977887630463,
0.025826450437307358,
0.18554601073265076,
0.0344071090221405,
0.009024953469634056,
0.05566373094916344,
0.0012798765674233437,
-0.05079333484172821,
-0.09551405161619186,
-0.2473660111427307,
-0.03599588945508003,
0.07057926803827286,
0.039327144622802734,
0.11155766993761063,
-0.095262311398983,
-0.0323038250207901,
-0.00345270661637187,
-0.019096819683909416,
0.04389463737607002,
0.1383575201034546,
0.027028661221265793,
0.12271708995103836,
-0.004939850885421038,
-0.06384488940238953,
0.08437176793813705,
0.024072833359241486,
-0.08150897175073624,
0.17643330991268158,
-0.12510459125041962,
-0.3433491587638855,
-0.08589529991149902,
-0.17596058547496796,
-0.04937490075826645,
0.043006785213947296,
0.09271597117185593,
-0.11150427907705307,
-0.009940613992512226,
-0.0035708497744053602,
0.1051664650440216,
-0.09237244725227356,
0.0024620809126645327,
-0.025451648980379105,
-0.0017266932409256697,
-0.10948625952005386,
-0.09645318984985352,
-0.05898051708936691,
-0.039460230618715286,
-0.09123558551073074,
0.10820658504962921,
-0.15372101962566376,
0.02821517363190651,
0.22129233181476593,
0.06313106417655945,
0.0407613068819046,
-0.04518655687570572,
0.20467345416545868,
-0.12483115494251251,
0.005580754950642586,
0.19118843972682953,
-0.04867216944694519,
0.06629189848899841,
0.1332334727048874,
-0.01658903993666172,
-0.08516550064086914,
0.03706439957022667,
-0.000261458772001788,
-0.06249634176492691,
-0.18450161814689636,
-0.14060066640377045,
-0.12094302475452423,
0.08389969170093536,
0.011857498437166214,
0.04520833119750023,
0.16159583628177643,
0.06057247519493103,
-0.0450085885822773,
0.007878337986767292,
0.05405242741107941,
0.06697170436382294,
0.293707937002182,
-0.07373307645320892,
0.14900828897953033,
-0.028586676344275475,
-0.16960962116718292,
0.074469193816185,
0.06986881047487259,
0.10203137248754501,
0.04394988715648651,
0.0770474523305893,
0.026077017188072205,
0.04837460443377495,
0.12428250163793564,
0.05111200734972954,
0.01961067132651806,
-0.04381952062249184,
-0.04017775505781174,
-0.037657808512449265,
-0.03982086107134819,
0.03808734193444252,
0.07314322888851166,
-0.15096083283424377,
-0.024468468502163887,
-0.002897705649957061,
0.09878864884376526,
0.07096045464277267,
0.07674040645360947,
-0.17593248188495636,
-0.01765606552362442,
0.05360311642289162,
-0.04604111984372139,
-0.11184624582529068,
0.08820207417011261,
-0.030954157933592796,
-0.13542738556861877,
0.07341893762350082,
-0.03778993338346481,
0.11149277538061142,
-0.08933088928461075,
0.0957326740026474,
-0.10486369580030441,
-0.053554024547338486,
0.021180037409067154,
0.10393543541431427,
-0.31328290700912476,
0.21571700274944305,
-0.006923637818545103,
-0.045571692287921906,
-0.11855340003967285,
-0.018137745559215546,
0.026456603780388832,
0.1353229433298111,
0.12192675471305847,
-0.016341740265488625,
0.02904888615012169,
0.03761445730924606,
-0.03362042084336281,
0.028738338500261307,
0.10161621123552322,
-0.028257593512535095,
-0.014658739790320396,
-0.05249655246734619,
0.0066593000665307045,
0.00009383261203765869,
-0.014238893985748291,
0.0012637064792215824,
-0.20400594174861908,
0.09488482773303986,
0.04011421650648117,
0.04654265195131302,
0.00895311776548624,
-0.035868071019649506,
-0.07308115810155869,
0.2732709050178528,
0.048476699739694595,
-0.09600204974412918,
-0.09144194424152374,
-0.05702851340174675,
0.05416436493396759,
-0.0621357299387455,
0.025628237053751945,
-0.05805616453289986,
0.004636483732610941,
-0.08118614554405212,
-0.17482192814350128,
0.14575712382793427,
-0.09916405379772186,
-0.03829802945256233,
-0.011890866793692112,
0.21226760745048523,
-0.017771797254681587,
0.012484155595302582,
0.04555041715502739,
0.004826509393751621,
-0.09381820261478424,
-0.09027477353811264,
-0.006360442377626896,
0.021848930045962334,
-0.045456722378730774,
0.019532501697540283,
-0.03919267654418945,
-0.041411593556404114,
-0.054533980786800385,
-0.034763891249895096,
0.3110589385032654,
0.16747529804706573,
-0.02552909590303898,
0.1798096001148224,
0.1004147082567215,
-0.06727021932601929,
-0.30141958594322205,
-0.10648101568222046,
-0.07545971125364304,
-0.05633239820599556,
-0.06410694867372513,
-0.18927551805973053,
0.060708023607730865,
-0.038157787173986435,
-0.015610725618898869,
0.0888005867600441,
-0.2935272753238678,
-0.09966711699962616,
0.1876988559961319,
-0.05170860514044762,
0.3958381414413452,
-0.09540335088968277,
-0.08028489351272583,
-0.04786328971385956,
-0.14677423238754272,
0.18063992261886597,
0.003420506836846471,
0.1062668040394783,
-0.00041852760477922857,
0.18559814989566803,
0.0542595311999321,
0.021689191460609436,
0.07531560957431793,
-0.01869187131524086,
-0.04778243601322174,
-0.09565197676420212,
-0.09530628472566605,
0.010400352999567986,
0.024499943479895592,
0.027723656967282295,
-0.03655565530061722,
0.014363480731844902,
-0.12359220534563065,
-0.07119839638471603,
-0.08280103653669357,
0.056618012487888336,
0.019700733944773674,
-0.07585794478654861,
-0.015263987705111504,
-0.04859965667128563,
-0.009876762516796589,
0.017245465889573097,
0.20224687457084656,
-0.08194630593061447,
0.16302897036075592,
0.06403183937072754,
0.16547024250030518,
-0.11617128551006317,
0.02271180972456932,
-0.054197315126657486,
-0.05556511506438255,
0.05544591695070267,
-0.04547008499503136,
0.0273482333868742,
0.10746752470731735,
-0.046175502240657806,
0.06462696194648743,
0.08875371515750885,
-0.0031208842992782593,
0.007982239127159119,
0.08786991983652115,
-0.2514045834541321,
-0.12804128229618073,
-0.07625124603509903,
0.024830663576722145,
0.054238636046648026,
0.11714198440313339,
0.216729074716568,
-0.002820422174409032,
-0.04560040310025215,
0.007681878283619881,
0.03550894558429718,
-0.055795446038246155,
0.04568938910961151,
-0.0038420376367866993,
0.010851256549358368,
-0.13173311948776245,
0.05599440261721611,
0.0029524399433285,
-0.11300373822450638,
0.04240945726633072,
0.17089655995368958,
-0.1004059687256813,
-0.13336066901683807,
-0.08897822350263596,
0.10241923481225967,
-0.13650865852832794,
-0.004143760073930025,
-0.01850394904613495,
-0.13032782077789307,
0.07053431123495102,
0.10052203387022018,
0.06100240722298622,
0.06272612512111664,
-0.08836527168750763,
-0.012628531083464622,
0.006049397401511669,
-0.003408020129427314,
0.02794834040105343,
-0.026183146983385086,
-0.024670520797371864,
0.07999970763921738,
-0.05538715794682503,
0.13509750366210938,
-0.10011227428913116,
-0.09411034733057022,
-0.15237297117710114,
0.040847763419151306,
-0.09952858835458755,
-0.11145863682031631,
-0.11520515382289886,
-0.037667956203222275,
-0.002469573402777314,
-0.022214902564883232,
-0.03853351250290871,
-0.04364999011158943,
-0.1192995011806488,
0.05336318910121918,
-0.041509151458740234,
0.011677378788590431,
-0.0737106129527092,
0.03380920737981796,
0.054678186774253845,
0.0007113275933079422,
0.16130119562149048,
0.1249273344874382,
-0.11543772369623184,
0.09334101527929306,
-0.12217098474502563,
-0.06641712784767151,
0.08916585147380829,
0.017182914540171623,
0.0409182645380497,
0.05962600186467171,
-0.007270091213285923,
0.06728872656822205,
0.0632074847817421,
0.052301134914159775,
0.02011493779718876,
-0.08153489232063293,
0.02785392664372921,
-0.014357281848788261,
-0.13426019251346588,
-0.032892849296331406,
-0.005032397340983152,
0.014016501605510712,
0.032063987106084824,
0.08124170452356339,
-0.07059010863304138,
0.06338285654783249,
-0.055573754012584686,
0.04329277575016022,
0.012656007893383503,
-0.16143986582756042,
0.004632053896784782,
-0.07864459604024887,
0.05625574663281441,
0.0043294369243085384,
0.22082220017910004,
0.001480978331528604,
-0.029055673629045486,
0.03273403272032738,
0.032853614538908005,
0.019589342176914215,
0.00013346056221053004,
0.17621509730815887,
0.0976133793592453,
-0.04978656768798828,
-0.09647376835346222,
0.0696677565574646,
0.026065897196531296,
0.04815475270152092,
0.1282365471124649,
-0.0263964980840683,
0.016006695106625557,
0.09942839294672012,
-0.027164921164512634,
0.011567702516913414,
-0.10813530534505844,
-0.14448125660419464,
-0.023236362263560295,
0.05299312248826027,
-0.05937531590461731,
0.1340596079826355,
0.1506727933883667,
0.005373812280595303,
0.03864869475364685,
-0.014830082654953003,
-0.06479159742593765,
-0.19100980460643768,
-0.2214026302099228,
-0.07200463116168976,
-0.13652077317237854,
-0.007615959271788597,
-0.1324627846479416,
0.03572588786482811,
0.014769898727536201,
0.0991395115852356,
-0.07480844110250473,
0.12303148210048676,
0.006129086948931217,
-0.10985352098941803,
0.08524668961763382,
-0.04467283561825752,
0.09468524158000946,
-0.011191202327609062,
0.00012174008588772267,
-0.07349327951669693,
0.0442722849547863,
0.011514784768223763,
0.03213267773389816,
-0.07713673263788223,
0.016140056774020195,
-0.1085866391658783,
-0.06172790378332138,
-0.05480597913265228,
0.06497702747583389,
-0.0008966407622210681,
0.14206331968307495,
0.016211898997426033,
-0.0383906252682209,
0.029286250472068787,
0.23416996002197266,
-0.0662141889333725,
-0.08684176951646805,
-0.07056921720504761,
0.23664815723896027,
-0.009686049073934555,
0.1155640184879303,
-0.038282610476017,
0.02538086473941803,
-0.07774124294519424,
0.3280858099460602,
0.30641859769821167,
-0.12570367753505707,
0.015680436044931412,
0.024147972464561462,
0.047731075435876846,
0.1053573414683342,
0.07562167942523956,
0.10922856628894806,
0.27657872438430786,
-0.06123781204223633,
-0.026866821572184563,
-0.010253844782710075,
-0.031208543106913567,
-0.07407443970441818,
0.051081180572509766,
0.05543908849358559,
-0.03432699292898178,
-0.013873335905373096,
0.10116352885961533,
-0.25697511434555054,
0.09419649839401245,
-0.2098301649093628,
-0.17210549116134644,
-0.0925576388835907,
0.002015189966186881,
0.07609374821186066,
0.035248033702373505,
0.09225982427597046,
0.0019511099671944976,
-0.06707517057657242,
0.08019427955150604,
0.03811857849359512,
-0.16900122165679932,
-0.006992659531533718,
0.09643405675888062,
-0.055753618478775024,
-0.06994890421628952,
-0.014585437253117561,
0.04138186573982239,
0.0828985646367073,
0.03475894033908844,
-0.0018503045430406928,
0.04409753158688545,
0.004729387350380421,
-0.04275057837367058,
0.01692810095846653,
0.03762208670377731,
0.024687325581908226,
-0.09145068377256393,
0.09066849201917648,
-0.13502103090286255,
0.054129283875226974,
0.0164783988147974,
-0.04289356991648674,
-0.02182120271027088,
0.04256850853562355,
-0.09791437536478043,
0.06900252401828766,
0.1116136908531189,
-0.011501231230795383,
-0.006458559539169073,
-0.03087715245783329,
-0.017005881294608116,
-0.021021248772740364,
-0.0851270854473114,
-0.10273642092943192,
-0.1591300070285797,
-0.09813956916332245,
0.0923658087849617,
0.0022800746373832226,
-0.11596924811601639,
0.007124061696231365,
-0.12847644090652466,
0.061121366918087006,
-0.13749761879444122,
0.0960918590426445,
0.07789473980665207,
0.01885366253554821,
0.014243635348975658,
-0.023159818723797798,
0.043173275887966156,
0.06946318596601486,
-0.1400681585073471,
-0.06052631884813309
] |
null | null | transformers |
# Model Trained Using AutoNLP
- Problem type: Summarization
- Model ID: 14502562
## Usage
You can use cURL to access this model:
```
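# Note: replace YOUR_HUGGINGFACE_API_KEY with a valid Inference API token; "max_length" caps the length of the generated summary.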
$ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP", "parameters":{"max_length":1000}}' https://api-inference.huggingface.co/Radvian/autonlp-indo_summarization-14502562
``` | {"language": "unk", "tags": "autonlp", "datasets": ["Radvian/autonlp-data-indo_summarization"], "widget": [{"text": "I love AutoNLP \ud83e\udd17"}]} | text2text-generation | Radvian/t5_liputan6_finetuned_indonesia_summarization | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autonlp",
"unk",
"dataset:Radvian/autonlp-data-indo_summarization",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"unk"
] | TAGS
#transformers #pytorch #t5 #text2text-generation #autonlp #unk #dataset-Radvian/autonlp-data-indo_summarization #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Trained Using AutoNLP
- Problem type: Summarization
- Model ID: 14502562
## Usage
You can use cURL to access this model:
| [
"# Model Trained Using AutoNLP\n\n- Problem type: Summarization\n- Model ID: 14502562",
"## Usage\n\nYou can use cURL to access this model:"
] | [
"TAGS\n#transformers #pytorch #t5 #text2text-generation #autonlp #unk #dataset-Radvian/autonlp-data-indo_summarization #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Trained Using AutoNLP\n\n- Problem type: Summarization\n- Model ID: 14502562",
"## Usage\n\nYou can use cURL to access this model:"
] | [
73,
24,
13
] | [
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #autonlp #unk #dataset-Radvian/autonlp-data-indo_summarization #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Trained Using AutoNLP\n\n- Problem type: Summarization\n- Model ID: 14502562## Usage\n\nYou can use cURL to access this model:"
] | [
-0.07716599106788635,
0.10927131026983261,
-0.001351655344478786,
0.06050832197070122,
0.15804219245910645,
0.02250809594988823,
0.09200165420770645,
0.06684646010398865,
-0.04762311279773712,
-0.01340350043028593,
0.18563899397850037,
0.22855159640312195,
-0.003100347938016057,
0.16925105452537537,
-0.07516823709011078,
-0.26727405190467834,
0.029166672378778458,
0.041526325047016144,
0.08320815861225128,
0.11803768575191498,
0.1260174661874771,
-0.04577678069472313,
0.1159071996808052,
0.0126888332888484,
-0.1697675883769989,
0.010071821510791779,
0.009454882703721523,
-0.0955394059419632,
0.11248055100440979,
0.10014817863702774,
0.09546032547950745,
0.03306019306182861,
0.09702374041080475,
-0.11515882611274719,
0.026903267949819565,
-0.007007820997387171,
-0.03287913277745247,
0.09566602855920792,
0.09044360369443893,
-0.06668131798505783,
0.1664971262216568,
-0.026818685233592987,
-0.0012974189594388008,
0.02278210036456585,
-0.0797322615981102,
0.055095281451940536,
-0.0019206585129722953,
0.0934717059135437,
0.13172762095928192,
0.10034317523241043,
-0.007608120795339346,
0.16288381814956665,
-0.1011466458439827,
0.12546658515930176,
-0.003644991898909211,
-0.28153109550476074,
-0.023182231932878494,
0.148525208234787,
0.006816245149821043,
-0.06428732722997665,
0.00665676686912775,
0.061402786523103714,
0.059685979038476944,
0.01979968510568142,
-0.013944335281848907,
-0.06047823280096054,
-0.1839662492275238,
0.01653946377336979,
-0.10790770500898361,
-0.029596835374832153,
0.21593225002288818,
0.0007674195221625268,
0.01317487470805645,
-0.026003066450357437,
-0.09460542351007462,
0.005564227234572172,
-0.04280726984143257,
-0.023595327511429787,
-0.05447923764586449,
0.021987397223711014,
-0.025970177724957466,
0.0038954662159085274,
-0.09466001391410828,
-0.025072045624256134,
-0.15446783602237701,
0.05075596645474434,
0.005362963769584894,
0.04388868808746338,
-0.12871494889259338,
0.07263249158859253,
-0.03818802163004875,
-0.07482234388589859,
0.04804886877536774,
-0.07721585035324097,
-0.03819519281387329,
-0.07447441667318344,
-0.06686338037252426,
-0.09824112057685852,
0.05648285523056984,
0.1447475552558899,
0.09496250003576279,
0.03869021311402321,
0.02097938396036625,
0.03775172308087349,
0.061424627900123596,
0.10908970236778259,
-0.11786908656358719,
-0.049503497779369354,
0.08650222420692444,
-0.0792762041091919,
0.015500271692872047,
-0.02330673113465309,
-0.13739681243896484,
-0.04465315490961075,
0.05875367298722267,
0.04606664553284645,
0.020212413743138313,
0.12182523310184479,
-0.021830210462212563,
-0.04816097021102905,
0.016407225281000137,
-0.05663945898413658,
0.00115001923404634,
-0.007247071713209152,
-0.04878561571240425,
0.08587785810232162,
0.07642240077257156,
0.03946968913078308,
-0.104188933968544,
0.06540726125240326,
-0.0770307257771492,
-0.03787599503993988,
-0.054389938712120056,
-0.08512257784605026,
0.0392407551407814,
0.0020856796763837337,
0.04050478711724281,
-0.17669707536697388,
-0.2611427903175354,
-0.019663119688630104,
0.057126134634017944,
-0.0546344630420208,
-0.1030900627374649,
-0.08415880799293518,
0.022542933002114296,
0.03518487513065338,
-0.03301157057285309,
0.08461526781320572,
-0.030367709696292877,
0.04055371135473251,
-0.04123082011938095,
0.08207135647535324,
-0.12438434362411499,
0.05395709350705147,
-0.10278242081403732,
-0.014136696234345436,
-0.0408034585416317,
0.12085842341184616,
-0.00001202925432153279,
0.0860249251127243,
-0.12055455893278122,
-0.023683303967118263,
-0.028348052874207497,
0.030911190435290337,
0.05896315723657608,
0.16898342967033386,
-0.12838168442249298,
-0.04863185063004494,
0.1232367530465126,
-0.06896189600229263,
-0.11593171209096909,
0.10283166170120239,
-0.040156230330467224,
0.09469874203205109,
0.0931316390633583,
0.07667448371648788,
0.08671026676893234,
0.0022973972372710705,
-0.011993277817964554,
0.09532401710748672,
-0.1129847913980484,
-0.10847281664609909,
0.005272245034575462,
0.06754657626152039,
-0.2305394411087036,
0.04900474473834038,
0.1205303743481636,
0.08481863141059875,
-0.07057689875364304,
-0.07250472903251648,
-0.04966749623417854,
-0.058904435485601425,
0.01644681580364704,
0.021581239998340607,
0.1276036947965622,
0.03195876628160477,
-0.02723664417862892,
-0.020296458154916763,
0.0863630548119545,
0.010596174746751785,
-0.024624425917863846,
-0.07501336187124252,
0.08931012451648712,
-0.10775873810052872,
0.03303072229027748,
-0.20427724719047546,
-0.06152289733290672,
-0.05954718962311745,
0.11618880182504654,
0.018956517800688744,
0.0050450656563043594,
0.03994017466902733,
0.0126182334497571,
0.009905119426548481,
0.022787434980273247,
0.21052907407283783,
0.0039239199832081795,
-0.12147054076194763,
-0.08847459405660629,
0.008748253807425499,
-0.04064551740884781,
0.1782664805650711,
-0.1798747330904007,
-0.024956967681646347,
-0.08893200755119324,
0.13069464266300201,
-0.03133991360664368,
0.054414052516222,
0.014295942150056362,
0.030717909336090088,
-0.0633997842669487,
0.004570294171571732,
0.07823627442121506,
-0.011468905955553055,
-0.11422276496887207,
0.09536918997764587,
-0.09060657024383545,
0.1093311607837677,
0.15860526263713837,
-0.21021641790866852,
-0.07176921516656876,
-0.02774837240576744,
-0.02642972394824028,
-0.007463272660970688,
-0.01835690811276436,
0.007765650283545256,
0.1136452704668045,
0.02366781421005726,
0.15431009232997894,
-0.01988331414759159,
-0.0012003652518615127,
0.0040673003531992435,
-0.08151514828205109,
-0.006231563165783882,
0.12901747226715088,
0.18348100781440735,
-0.24785324931144714,
0.09628786146640778,
0.045602697879076004,
-0.07974641770124435,
0.09514237195253372,
0.03809206932783127,
-0.029150573536753654,
0.03164638951420784,
-0.032342296093702316,
-0.0033971064258366823,
-0.055455662310123444,
-0.07897917926311493,
-0.003539987141266465,
0.10221491754055023,
-0.06015733629465103,
0.07291792333126068,
-0.08157572895288467,
-0.0037181905936449766,
0.005114198662340641,
0.032182298600673676,
-0.07863450050354004,
0.09258120507001877,
0.02334536425769329,
0.12024439126253128,
0.010621733963489532,
-0.09496980905532837,
0.09161041676998138,
0.023200131952762604,
-0.12685437500476837,
0.2117786407470703,
-0.07562320679426193,
-0.2827230989933014,
-0.13787078857421875,
-0.11336639523506165,
-0.056192900985479355,
0.006395883858203888,
0.03015969507396221,
-0.08990715444087982,
-0.08363284915685654,
-0.00027070901705883443,
-0.02788545936346054,
-0.02117816172540188,
-0.007261284161359072,
-0.050520334392786026,
-0.009438174776732922,
-0.023706411942839622,
-0.08289880305528641,
-0.027731619775295258,
-0.026934094727039337,
-0.0239882729947567,
0.140844464302063,
-0.14549176394939423,
0.09613502025604248,
0.17115743458271027,
-0.03449395298957825,
0.043508004397153854,
0.0021710945293307304,
0.20732946693897247,
-0.045354463160037994,
0.0015576925361528993,
0.1605772078037262,
-0.04414215683937073,
0.04674617946147919,
0.14427673816680908,
0.007265905849635601,
-0.06462686508893967,
0.030854158103466034,
-0.0061108930967748165,
-0.0720146894454956,
-0.20619231462478638,
-0.19226959347724915,
-0.08644825965166092,
0.015194308012723923,
0.09109978377819061,
0.040007010102272034,
0.07165998220443726,
0.14368823170661926,
0.039718735963106155,
0.10215356945991516,
-0.05030856654047966,
0.08409052342176437,
0.19679486751556396,
-0.023504583165049553,
0.1374131739139557,
-0.06056889891624451,
-0.15112629532814026,
0.07758784294128418,
-0.021957281976938248,
0.12467845529317856,
0.04850287735462189,
-0.021572766825556755,
0.007267593406140804,
0.057081855833530426,
0.08942509442567825,
0.17463907599449158,
0.021087761968374252,
-0.0375666618347168,
-0.02413882315158844,
-0.02598966844379902,
-0.10096899420022964,
0.07441599667072296,
-0.006680469028651714,
-0.07274135947227478,
-0.06355393677949905,
0.0910244733095169,
0.014181491918861866,
0.09262504428625107,
0.09380332380533218,
-0.37534379959106445,
-0.022518770769238472,
0.010647508315742016,
-0.03993471711874008,
-0.09894061833620071,
0.04015343636274338,
-0.09929578751325607,
-0.12773436307907104,
0.06898198276758194,
-0.025431480258703232,
0.1365356743335724,
-0.07141663134098053,
0.04037865251302719,
-0.08749260753393173,
0.019825207069516182,
-0.0036236834712326527,
0.14612141251564026,
-0.2560102343559265,
0.2053668051958084,
0.02267972007393837,
-0.020396802574396133,
-0.12320563197135925,
-0.011728583835065365,
0.027393579483032227,
0.11255533993244171,
0.09973667562007904,
-0.0017366341780871153,
-0.09360913932323456,
-0.13042226433753967,
-0.08999623358249664,
0.059369321912527084,
0.006451777648180723,
0.04935837909579277,
0.03421887755393982,
-0.03843874856829643,
-0.024698032066226006,
-0.01155005767941475,
0.0059934076853096485,
-0.10155873745679855,
-0.13981249928474426,
0.021930715069174767,
0.08831097185611725,
0.049811337143182755,
0.01760931871831417,
-0.0588068850338459,
-0.011590482667088509,
0.23357753455638885,
0.11345355957746506,
-0.07650453597307205,
-0.14228659868240356,
0.06706427037715912,
0.05972456559538841,
-0.09909947961568832,
0.026751764118671417,
-0.04854936897754669,
0.03674367815256119,
-0.021860117092728615,
-0.17460189759731293,
0.09158290177583694,
-0.08716654032468796,
0.010328476317226887,
-0.04399385303258896,
0.056965410709381104,
-0.03088505193591118,
0.0072609963826835155,
0.07144571095705032,
0.014541100710630417,
-0.09105680137872696,
-0.08855843544006348,
-0.04448772221803665,
0.04577309638261795,
-0.009295593947172165,
0.06224079057574272,
-0.051058243960142136,
-0.08449038863182068,
0.0159299373626709,
-0.0032182286959141493,
0.2924164831638336,
0.014686346985399723,
-0.06963726878166199,
0.05597541481256485,
0.1329413503408432,
-0.05643439665436745,
-0.30572962760925293,
-0.09746910631656647,
0.001109944423660636,
0.027400514110922813,
-0.08508066833019257,
-0.12906543910503387,
0.1541750431060791,
0.004559645429253578,
-0.021949125453829765,
-0.002671025926247239,
-0.18219488859176636,
-0.10081436485052109,
0.24334190785884857,
0.011396299116313457,
0.2764066755771637,
-0.05885988473892212,
-0.03790297359228134,
-0.12542398273944855,
-0.10730487108230591,
0.16201390326023102,
-0.053630728274583817,
0.08320767432451248,
-0.03128399699926376,
0.17568255960941315,
0.055208172649145126,
-0.05982007086277008,
0.04685712978243828,
0.05717714875936508,
-0.024971352890133858,
-0.09158559143543243,
-0.05737931281328201,
-0.025812257081270218,
-0.015106900595128536,
0.14241547882556915,
-0.05183497816324234,
0.07973407208919525,
-0.20978903770446777,
-0.04698094725608826,
-0.02799292467534542,
0.06528914719820023,
0.017892971634864807,
-0.0528583787381649,
0.024281134828925133,
-0.02612767554819584,
0.02437049336731434,
-0.0023822584189474583,
0.02715485170483589,
-0.027634272351861,
0.05011240020394325,
0.14004583656787872,
0.14441092312335968,
-0.09354153275489807,
0.06400700658559799,
-0.015282929874956608,
-0.07838559150695801,
0.08339086920022964,
-0.12704871594905853,
0.053183138370513916,
0.13920699059963226,
-0.012115078046917915,
0.0764862522482872,
0.0667099729180336,
0.03214012086391449,
-0.010549765080213547,
0.1170034185051918,
-0.1876377910375595,
0.015064140781760216,
-0.05631212145090103,
-0.03433860465884209,
-0.01792924664914608,
0.010519301518797874,
0.14652319252490997,
-0.04331439733505249,
-0.03241374343633652,
0.01704748161137104,
0.01213581021875143,
-0.05447673425078392,
0.12471765279769897,
0.04891101270914078,
0.03072548285126686,
-0.13343344628810883,
0.014630884863436222,
0.0591900497674942,
-0.034951869398355484,
0.03055386058986187,
0.07788509875535965,
-0.1353675276041031,
-0.13079838454723358,
-0.01029165368527174,
0.18490096926689148,
-0.21745118498802185,
-0.06067543849349022,
-0.011504650115966797,
-0.10845424234867096,
0.07788269221782684,
0.07789535820484161,
0.055789679288864136,
0.02600737474858761,
-0.06436922401189804,
-0.09381654113531113,
-0.1364690214395523,
0.04714952036738396,
0.140534907579422,
0.033167462795972824,
-0.09843505918979645,
0.043313875794410706,
-0.0504315122961998,
0.10366401076316833,
-0.07802953571081161,
-0.013963976874947548,
-0.16758160293102264,
0.013387092389166355,
-0.23380500078201294,
0.00036204056232236326,
-0.07912102341651917,
0.00978279858827591,
0.0029810848645865917,
0.017121996730566025,
-0.043882668018341064,
-0.03318821266293526,
-0.08509942889213562,
0.009578987956047058,
-0.002582188230007887,
0.057713259011507034,
-0.06220196932554245,
-0.02617202140390873,
0.033065419644117355,
0.014674441888928413,
0.09527871757745743,
0.10454345494508743,
-0.04924703389406204,
0.060460325330495834,
-0.1512877494096756,
-0.06652439385652542,
0.07179266959428787,
0.05704876780509949,
0.08633062988519669,
-0.025773944333195686,
0.055486395955085754,
0.07001104205846786,
0.014954045414924622,
0.038835883140563965,
0.0528620146214962,
-0.10125056654214859,
0.02331378310918808,
-0.0630546510219574,
-0.07365027815103531,
-0.07087226212024689,
0.01826656609773636,
0.009336904622614384,
0.04862639680504799,
0.12896867096424103,
-0.09088172763586044,
0.08463691174983978,
-0.10347660630941391,
0.029383713379502296,
-0.0364236906170845,
-0.0823761597275734,
-0.07710421085357666,
-0.11117320507764816,
0.05231931433081627,
-0.026859339326620102,
0.17648592591285706,
0.06873778998851776,
0.059651996940374374,
-0.0063997721299529076,
0.12950778007507324,
0.05094064772129059,
-0.025779500603675842,
0.16435971856117249,
0.09194902330636978,
0.01277807354927063,
-0.06404192000627518,
0.09376630932092667,
0.05367613956332207,
0.023413993418216705,
0.055777657777071,
-0.011191231198608875,
-0.058730266988277435,
0.08505097031593323,
0.020416615530848503,
-0.021355358883738518,
-0.056357283145189285,
-0.05386139824986458,
-0.10811150074005127,
0.07313654571771622,
-0.03278125450015068,
0.01732591539621353,
0.09927568584680557,
-0.01125633530318737,
-0.02493963949382305,
-0.03594103455543518,
-0.06517980992794037,
-0.18949218094348907,
-0.1834087371826172,
-0.13281023502349854,
-0.08821499347686768,
0.023271469399333,
-0.10396160185337067,
-0.021660130470991135,
0.03812354430556297,
0.05746045336127281,
-0.04023444280028343,
0.03410376235842705,
-0.021234937012195587,
-0.024829113855957985,
0.025797031819820404,
-0.03319689631462097,
0.03500448539853096,
-0.05822509154677391,
0.011309918947517872,
-0.05480855703353882,
0.03658775985240936,
0.003857966512441635,
0.009855217300355434,
0.013198926113545895,
0.07784530520439148,
-0.05086697265505791,
-0.0628780797123909,
-0.06740639358758926,
0.0653998851776123,
0.04040057584643364,
0.08718279749155045,
0.020480094477534294,
-0.0006096803699620068,
-0.006280668079853058,
0.155672088265419,
-0.060210637748241425,
-0.10657979547977448,
-0.12601840496063232,
0.3354245722293854,
-0.020854201167821884,
0.04987813159823418,
0.01219434104859829,
-0.015525550581514835,
-0.013273220509290695,
0.3324041962623596,
0.26748064160346985,
-0.07944655418395996,
-0.0020577297545969486,
-0.0055857086554169655,
0.00991417933255434,
0.03754260018467903,
0.10925213247537613,
0.08064628392457962,
0.22567768394947052,
-0.09114823490381241,
-0.017512451857328415,
-0.06668875366449356,
-0.029526391997933388,
-0.040829043835401535,
0.07423262298107147,
0.06455628573894501,
-0.0647890567779541,
-0.011957146227359772,
0.12947580218315125,
-0.2460930198431015,
0.10903483629226685,
-0.09244632720947266,
-0.08275004476308823,
-0.12173581123352051,
-0.016532789915800095,
-0.01573672890663147,
0.05410125479102135,
0.0742136612534523,
-0.07645037770271301,
-0.027418872341513634,
0.08482533693313599,
-0.012507285922765732,
-0.1804255247116089,
-0.0030249699484556913,
0.049501530826091766,
0.020793624222278595,
0.08726957440376282,
0.008067674934864044,
0.09802086651325226,
0.11864051967859268,
0.045586299151182175,
-0.10249968618154526,
0.03952939435839653,
-0.014258156530559063,
-0.039206139743328094,
0.11773081123828888,
-0.004875532351434231,
-0.012633809819817543,
0.010843847878277302,
0.03716582432389259,
-0.1608375608921051,
0.017880799248814583,
-0.020550785586237907,
0.0205924641340971,
-0.06661097705364227,
0.07666011154651642,
-0.04555860534310341,
0.11845848709344864,
0.18594099581241608,
-0.04072296619415283,
0.005452845711261034,
-0.07994213700294495,
0.054465051740407944,
-0.007475967984646559,
-0.10703632980585098,
-0.05611070245504379,
-0.11092258244752884,
-0.04126592352986336,
-0.02528579905629158,
-0.002086870837956667,
-0.1991511434316635,
0.0077368649654090405,
-0.13175001740455627,
-0.01744033209979534,
-0.08704886585474014,
0.10366678982973099,
0.13608567416667938,
0.0106267798691988,
-0.012968990951776505,
-0.03760337829589844,
0.0168712567538023,
0.05697926878929138,
-0.09913398325443268,
-0.11048130691051483
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4229
- Wer: 0.2386
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
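
For orientation, these settings map roughly onto a `TrainingArguments` sketch. This is an illustrative reconstruction, not the exact configuration used: the output directory and any option not listed above are assumptions, and the Adam betas/epsilon from the card are simply the library defaults.

```python
# Rough reconstruction of the listed hyperparameters (output_dir is assumed; unlisted options stay at defaults).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo-colab",  # assumption, not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,  # corresponds to "Native AMP" mixed-precision training
)
```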
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.5486 | 4.0 | 500 | 2.1672 | 0.9876 |
| 0.6819 | 8.0 | 1000 | 0.4502 | 0.3301 |
| 0.2353 | 12.0 | 1500 | 0.4352 | 0.2841 |
| 0.1427 | 16.0 | 2000 | 0.4237 | 0.2584 |
| 0.0945 | 20.0 | 2500 | 0.4409 | 0.2545 |
| 0.0671 | 24.0 | 3000 | 0.4257 | 0.2413 |
| 0.0492 | 28.0 | 3500 | 0.4229 | 0.2386 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
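
Since the card does not include a usage snippet, a minimal inference sketch is given below. It assumes the checkpoint is published on the Hub under the repository id shown in this row's metadata (`Rafat/wav2vec2-base-timit-demo-colab`) and that a local 16 kHz audio file is available; the file name is a placeholder.

```python
# Minimal ASR inference sketch (assumes the Hub repo id below and a local 16 kHz WAV file).
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "Rafat/wav2vec2-base-timit-demo-colab"  # repo id taken from this row's metadata
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

speech, _ = librosa.load("example.wav", sr=16_000)  # placeholder audio path
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```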
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-base-timit-demo-colab", "results": []}]} | automatic-speech-recognition | Rafat/wav2vec2-base-timit-demo-colab | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-base-timit-demo-colab
==============================
This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4229
* Wer: 0.2386
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 32
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 30
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.10.0+cu111
* Datasets 1.18.3
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.10.3"
] | [
56,
130,
4,
35
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.10.3"
] | [
-0.10833754390478134,
0.10381042957305908,
-0.003447136841714382,
0.05311182141304016,
0.10943093150854111,
-0.02224290370941162,
0.12992502748966217,
0.1490924060344696,
-0.11156157404184341,
0.07095726579427719,
0.12520445883274078,
0.1469612419605255,
0.044384390115737915,
0.1459488868713379,
-0.05123339593410492,
-0.2855369746685028,
0.048294976353645325,
0.03550826758146286,
-0.020840534940361977,
0.12408933788537979,
0.08524196594953537,
-0.1261489987373352,
0.05181831493973732,
0.03754477575421333,
-0.1591220498085022,
-0.001641957787796855,
-0.008117086254060268,
-0.10824380815029144,
0.11797899007797241,
0.013362843543291092,
0.07320088893175125,
0.048765409737825394,
0.06339815258979797,
-0.21467654407024384,
0.008721605874598026,
0.045480094850063324,
0.027293900027871132,
0.07399290800094604,
0.06101059168577194,
-0.0253707654774189,
0.12154541909694672,
-0.07785171270370483,
0.08432452380657196,
0.03452400863170624,
-0.10040441900491714,
-0.295693039894104,
-0.0883895605802536,
0.047700464725494385,
0.07843475788831711,
0.08981457352638245,
-0.00999368354678154,
0.1470525562763214,
-0.057681191712617874,
0.11329855024814606,
0.2798192799091339,
-0.31179121136665344,
-0.04599817469716072,
-0.05289574712514877,
0.05597834661602974,
0.05841030925512314,
-0.0901239812374115,
-0.02046792581677437,
0.010743708349764347,
0.046851977705955505,
0.13231885433197021,
-0.01715417020022869,
-0.06198609992861748,
-0.008344883099198341,
-0.1534324288368225,
-0.06298980861902237,
0.11046526581048965,
0.017656773328781128,
-0.042628876864910126,
-0.09404584765434265,
-0.05194579064846039,
-0.2004159539937973,
-0.06980933248996735,
-0.01500130258500576,
0.039956334978342056,
-0.04952618107199669,
-0.10413790494203568,
-0.019491255283355713,
-0.06758825480937958,
-0.07009370625019073,
-0.03837838023900986,
0.19532173871994019,
0.06178545951843262,
-0.0007504495442844927,
-0.04200323671102524,
0.06930477917194366,
-0.014736226759850979,
-0.13804151117801666,
-0.023672964423894882,
0.036250852048397064,
-0.022838842123746872,
-0.01682872325181961,
-0.04348614066839218,
-0.06593196094036102,
0.018360575661063194,
0.1567915380001068,
-0.1088852807879448,
0.09793650358915329,
-0.01537051610648632,
0.03874713182449341,
-0.10357552021741867,
0.20873264968395233,
-0.04153716564178467,
0.03293122723698616,
-0.005830306094139814,
0.055414408445358276,
0.033529847860336304,
-0.026014741510152817,
-0.09795874357223511,
0.034854013472795486,
0.11659786105155945,
0.053310833871364594,
-0.04302902892231941,
0.05821622163057327,
-0.027089765295386314,
-0.009910321794450283,
0.011593430303037167,
-0.11522748321294785,
0.03396046161651611,
0.0198811162263155,
-0.06172381713986397,
0.0008120397687889636,
0.019153296947479248,
0.004520639777183533,
-0.06453731656074524,
0.08428143709897995,
-0.056282371282577515,
0.033882591873407364,
-0.05637597292661667,
-0.12755036354064941,
0.02759174071252346,
-0.10902155190706253,
-0.001358338282443583,
-0.10306747257709503,
-0.09193158149719238,
-0.010619371198117733,
0.036999672651290894,
-0.03549756854772568,
-0.03275611996650696,
-0.07301835715770721,
-0.09623170644044876,
0.04175805300474167,
-0.03570253774523735,
0.0764346495270729,
-0.07133547961711884,
0.09405636042356491,
0.03081537038087845,
0.08494142442941666,
-0.01318286918103695,
0.062569260597229,
-0.06405647844076157,
0.029027704149484634,
-0.20785629749298096,
0.078687384724617,
-0.09378376603126526,
0.058948077261447906,
-0.12374458461999893,
-0.1170143187046051,
0.03827769681811333,
-0.004956687800586224,
0.10379257798194885,
0.0937594622373581,
-0.16922356188297272,
-0.08996674418449402,
0.2025158554315567,
-0.08362291753292084,
-0.08466292172670364,
0.12438537180423737,
-0.023574335500597954,
-0.012047374621033669,
0.05270986631512642,
0.25722435116767883,
0.0563923642039299,
-0.12386839836835861,
0.01153150387108326,
-0.03621745854616165,
0.047043293714523315,
-0.04501413181424141,
0.05954015627503395,
-0.02173132449388504,
0.07572626322507858,
0.01326675247400999,
-0.006562749855220318,
0.042281605303287506,
-0.08780118823051453,
-0.07798930257558823,
-0.040403641760349274,
-0.07652655988931656,
0.013507777824997902,
0.034905679523944855,
0.06404134631156921,
-0.11733686923980713,
-0.11073767393827438,
0.04709266126155853,
0.08484742790460587,
-0.10454373061656952,
0.07569947093725204,
-0.11945994943380356,
0.08855628222227097,
-0.012427026405930519,
-0.0042078010737895966,
-0.19148027896881104,
0.033684469759464264,
0.03369207680225372,
-0.027014397084712982,
0.03843504935503006,
-0.06565430760383606,
0.07286848872900009,
0.04831041023135185,
-0.024084001779556274,
-0.04726380854845047,
-0.008630751632153988,
0.012781241908669472,
-0.09038025140762329,
-0.20807726681232452,
-0.040402818471193314,
-0.04182978719472885,
0.07309912890195847,
-0.13454800844192505,
0.034716520458459854,
0.07227864861488342,
0.09292402863502502,
0.02967613935470581,
-0.028521638363599777,
0.0027323609683662653,
0.09046582877635956,
-0.017737697809934616,
-0.06717314571142197,
0.05653621628880501,
0.023511258885264397,
-0.08707185834646225,
0.048796478658914566,
-0.1481570303440094,
0.127961665391922,
0.14512650668621063,
-0.008458556607365608,
-0.0681370198726654,
0.0027188167441636324,
-0.05006382241845131,
-0.0315980389714241,
-0.0025538518093526363,
0.04147781804203987,
0.22176256775856018,
0.01608957350254059,
0.14620628952980042,
-0.09077949076890945,
-0.04409495368599892,
0.049091413617134094,
-0.02334122359752655,
-0.009143802337348461,
0.12483556568622589,
0.04845994710922241,
-0.05674070864915848,
0.11428955942392349,
0.08967925608158112,
-0.08586719632148743,
0.11837322264909744,
-0.06838078796863556,
-0.07681573182344437,
-0.016253173351287842,
0.006750784814357758,
0.028568439185619354,
0.09584370255470276,
-0.15449927747249603,
-0.04031454026699066,
0.02691691555082798,
0.020981546491384506,
0.02508392371237278,
-0.20947007834911346,
0.014041672460734844,
0.03178508207201958,
-0.08192425966262817,
-0.043465156108140945,
-0.0011847163550555706,
0.012034800834953785,
0.09432540088891983,
0.013446008786559105,
-0.09667441248893738,
0.009430745616555214,
0.0037322519347071648,
-0.07600316405296326,
0.17992286384105682,
-0.12140516191720963,
-0.17771458625793457,
-0.10324431955814362,
-0.0862940177321434,
-0.032839421182870865,
-0.006773955188691616,
0.0887315422296524,
-0.09486573934555054,
-0.044363152235746384,
-0.08358942717313766,
-0.023079875856637955,
-0.03151819482445717,
0.04283427074551582,
0.03156427666544914,
-0.01136570330709219,
0.06314032524824142,
-0.11243854463100433,
-0.019515544176101685,
-0.041744768619537354,
0.004032604396343231,
0.05496735870838165,
0.03658017888665199,
0.10614565014839172,
0.1565544754266739,
-0.015423845499753952,
0.04914018139243126,
-0.04671413451433182,
0.1867409497499466,
-0.07426898181438446,
-0.041470639407634735,
0.1136881560087204,
-0.007811855059117079,
0.06949979066848755,
0.10878996551036835,
0.04568083956837654,
-0.09368357807397842,
-0.013869465328752995,
-0.000707953586243093,
-0.04555567353963852,
-0.22215522825717926,
-0.036037545651197433,
-0.04656601697206497,
-0.00568003486841917,
0.10165924578905106,
0.040871743112802505,
0.02505088411271572,
0.018389305099844933,
0.028121553361415863,
0.00035212599323131144,
0.0012278348440304399,
0.09916964918375015,
0.1341795027256012,
0.0387304350733757,
0.1326872706413269,
-0.043069735169410706,
-0.03335773944854736,
0.03271381929516792,
-0.0015795581275597215,
0.23355889320373535,
0.014797404408454895,
0.18411597609519958,
0.05663689598441124,
0.16338348388671875,
0.04172950237989426,
0.06686992943286896,
-0.004308757837861776,
-0.011605213396251202,
0.012266881763935089,
-0.051825493574142456,
-0.042994026094675064,
0.022489888593554497,
0.0273785088211298,
0.004465919919312,
-0.1159159392118454,
0.0005170528893359005,
0.04267645999789238,
0.3521466553211212,
0.026302076876163483,
-0.33115461468696594,
-0.0937834158539772,
-0.011363771744072437,
-0.09160836786031723,
-0.029828879982233047,
0.04430842027068138,
0.08963862806558609,
-0.07562659680843353,
0.06577971577644348,
-0.06103985011577606,
0.09144850075244904,
-0.059319667518138885,
0.029836803674697876,
0.03289255127310753,
0.07434683293104172,
0.005700880195945501,
0.03577127307653427,
-0.2962503433227539,
0.28073421120643616,
0.005631123203784227,
0.07630942016839981,
-0.059538017958402634,
0.012447638437151909,
0.02244623191654682,
0.021201057359576225,
0.0854242816567421,
-0.025091901421546936,
-0.12549014389514923,
-0.16572368144989014,
-0.09539511799812317,
0.015275818295776844,
0.12291479855775833,
0.03043687902390957,
0.11055338382720947,
-0.008221535012125969,
-0.016779381781816483,
0.04930062219500542,
-0.10247119516134262,
-0.0565626323223114,
-0.09930874407291412,
0.013917908072471619,
0.06958311051130295,
0.017841244116425514,
-0.07698749750852585,
-0.10803275555372238,
-0.07963237911462784,
0.161455899477005,
-0.04690762236714363,
-0.049646005034446716,
-0.12043671309947968,
0.009213562123477459,
0.10760517418384552,
-0.08037063479423523,
0.0627606213092804,
0.007560367230325937,
0.1034381240606308,
0.003693344769999385,
-0.06942233443260193,
0.11578889191150665,
-0.06958215683698654,
-0.16740162670612335,
-0.023777656257152557,
0.14403222501277924,
0.029652034863829613,
0.06261475384235382,
-0.010333992540836334,
0.03588103502988815,
-0.02198963798582554,
-0.0782666876912117,
0.03668055683374405,
0.0313185378909111,
0.04941844940185547,
-0.018752507865428925,
-0.014451628550887108,
-0.005778694525361061,
-0.0897565484046936,
-0.01813792996108532,
0.20751960575580597,
0.24517950415611267,
-0.09391327947378159,
0.095774345099926,
0.06509755551815033,
-0.03955508768558502,
-0.17117023468017578,
-0.009669424965977669,
0.07201457023620605,
-0.00040477776201441884,
-0.03234190493822098,
-0.1950286626815796,
0.02182387374341488,
0.06428606063127518,
-0.02105681411921978,
0.07620948553085327,
-0.3114224076271057,
-0.1389889419078827,
0.14483876526355743,
0.11684533208608627,
0.057372041046619415,
-0.14682094752788544,
-0.05427340418100357,
-0.009698581881821156,
-0.08959914743900299,
0.09872198104858398,
-0.07368794083595276,
0.13339248299598694,
-0.02151283621788025,
0.0900125801563263,
0.011481883004307747,
-0.05909395590424538,
0.10904435813426971,
0.006878409069031477,
0.05564282089471817,
-0.04371855780482292,
0.02109719254076481,
0.04945603385567665,
-0.06575894355773926,
0.05426900461316109,
-0.07870833575725555,
0.0321306437253952,
-0.08992088586091995,
-0.030698301270604134,
-0.08440285176038742,
0.012920956127345562,
-0.012694328092038631,
-0.027571629732847214,
-0.038240376859903336,
0.00040720109245739877,
0.06439678370952606,
-0.012324657291173935,
0.15859998762607574,
-0.0258988868445158,
0.1213768869638443,
0.16440238058567047,
0.10472052544355392,
-0.10338187217712402,
-0.06646968424320221,
0.006159121636301279,
-0.03442716598510742,
0.05600771680474281,
-0.12481767684221268,
0.0331452377140522,
0.13678844273090363,
0.02906477451324463,
0.11560565233230591,
0.0657036304473877,
-0.07196593284606934,
0.029690509662032127,
0.03940979763865471,
-0.14030630886554718,
-0.1259399950504303,
0.012432526797056198,
0.04283227026462555,
-0.07060881704092026,
0.07352157682180405,
0.11225481331348419,
-0.05890776589512825,
-0.019077425822615623,
-0.0010647890157997608,
0.014384094625711441,
-0.039235200732946396,
0.19945017993450165,
0.04253912717103958,
0.06556674838066101,
-0.12472614645957947,
0.07962489128112793,
0.04067164659500122,
-0.13785240054130554,
0.06680858135223389,
0.11523443460464478,
-0.09564115107059479,
-0.029312387108802795,
0.03305184841156006,
0.1058652251958847,
-0.027327246963977814,
-0.07625725865364075,
-0.14180098474025726,
-0.14805257320404053,
0.11542604118585587,
0.20982274413108826,
0.05477139726281166,
0.011962365359067917,
-0.05966893583536148,
0.016742343083024025,
-0.12094023823738098,
0.07404458522796631,
0.040687933564186096,
0.06161949783563614,
-0.12236526608467102,
0.15302594006061554,
0.01823774166405201,
0.04901929199695587,
-0.014212665148079395,
-0.008479558862745762,
-0.11560764163732529,
0.04105975478887558,
-0.1377730667591095,
0.007889210246503353,
-0.06813781708478928,
0.002953618997707963,
0.002498693997040391,
-0.04447924718260765,
-0.062049854546785355,
0.03951378911733627,
-0.12002760171890259,
-0.02218621037900448,
-0.004193393047899008,
0.029725441709160805,
-0.12637798488140106,
-0.009144372306764126,
0.007749427575618029,
-0.09551648050546646,
0.09743473678827286,
0.08704204112291336,
-0.02983301691710949,
0.050036896020174026,
-0.04546830430626869,
-0.03167468309402466,
0.08094117045402527,
-0.003110236721113324,
0.055044252425432205,
-0.13397149741649628,
-0.019748948514461517,
0.014943324960768223,
0.03051268868148327,
0.02191765606403351,
0.11163926869630814,
-0.11216187477111816,
0.002342303516343236,
-0.02661878988146782,
-0.052631352096796036,
-0.0695110633969307,
0.0566021203994751,
0.10603443533182144,
0.028557132929563522,
0.16374637186527252,
-0.09526465833187103,
0.030032064765691757,
-0.16133320331573486,
0.004723858553916216,
-0.02056591957807541,
-0.12526042759418488,
-0.043614841997623444,
-0.031058959662914276,
0.08091603964567184,
-0.06501792371273041,
0.12357719242572784,
-0.027396967634558678,
0.03133884072303772,
0.039567429572343826,
-0.08330715447664261,
-0.04500983655452728,
0.04368012025952339,
0.19865919649600983,
0.037938669323921204,
-0.04089481383562088,
0.07326071709394455,
0.017733758315443993,
0.07938048988580704,
0.12459861487150192,
0.1737319976091385,
0.15788210928440094,
0.060173243284225464,
0.11847540736198425,
0.05435815453529358,
-0.058412231504917145,
-0.16708436608314514,
0.08628037571907043,
-0.06032026931643486,
0.13355810940265656,
-0.011683795601129532,
0.23349842429161072,
0.126515194773674,
-0.15185151994228363,
0.06547676026821136,
-0.01775580458343029,
-0.08892745524644852,
-0.11879414319992065,
-0.059978779405355453,
-0.08449370414018631,
-0.17035658657550812,
0.007223862688988447,
-0.10407434403896332,
0.060791682451963425,
0.04036923497915268,
0.0406450591981411,
0.017503537237644196,
0.13356520235538483,
0.025533415377140045,
0.0011981537099927664,
0.0938468649983406,
-0.0034534884616732597,
-0.05139409005641937,
-0.0654342845082283,
-0.08168738335371017,
0.03930104151368141,
-0.011124776676297188,
0.05700472742319107,
-0.0044067357666790485,
-0.06600939482450485,
0.05390038341283798,
-0.035257499665021896,
-0.09521207958459854,
0.02477937377989292,
0.02138591930270195,
0.07421143352985382,
0.053345803171396255,
0.0343724749982357,
-0.03974883630871773,
-0.0016492705326527357,
0.19061097502708435,
-0.0947212427854538,
-0.09959877282381058,
-0.10897103697061539,
0.2683177888393402,
0.03826966509222984,
-0.01721738465130329,
0.022094130516052246,
-0.058050334453582764,
-0.03629877790808678,
0.2044251561164856,
0.17119856178760529,
-0.010132716968655586,
0.004274469800293446,
-0.01581609807908535,
-0.005809308495372534,
-0.043228887021541595,
0.08381844311952591,
0.15583012998104095,
0.06372498720884323,
-0.06269604712724686,
-0.06358547508716583,
-0.05333370715379715,
-0.034645576030015945,
-0.06843351572751999,
0.07628190517425537,
0.014270270243287086,
-0.02650071680545807,
-0.03774745762348175,
0.0622498095035553,
-0.09407172352075577,
-0.08780978620052338,
0.01707332581281662,
-0.1899011880159378,
-0.1541675627231598,
0.007431644015014172,
0.06914526224136353,
0.013699430041015148,
0.03485763445496559,
0.0046659428626298904,
-0.013051481917500496,
0.08807174861431122,
0.0005368085112422705,
-0.08228840678930283,
-0.060809750109910965,
0.092787005007267,
-0.14782628417015076,
0.15854524075984955,
-0.03908930718898773,
0.04669244587421417,
0.12287257611751556,
0.08951910585165024,
-0.08050762861967087,
0.08849873393774033,
0.04622596129775047,
-0.10895267128944397,
0.02583940513432026,
0.15606917440891266,
-0.03488616645336151,
0.0890420526266098,
0.02996581420302391,
-0.11539477854967117,
0.010171609930694103,
-0.10265477001667023,
-0.03983833268284798,
-0.03537425026297569,
-0.04617121443152428,
-0.04696659743785858,
0.10657443851232529,
0.1665657013654709,
-0.045781467109918594,
0.004395944532006979,
-0.053576916456222534,
0.008421660400927067,
0.046719495207071304,
0.003148264018818736,
-0.05753806233406067,
-0.2782512605190277,
0.011577482335269451,
0.027842320501804352,
0.00722676794975996,
-0.2543206810951233,
-0.08786150068044662,
0.010264093987643719,
-0.04437977075576782,
-0.08825569599866867,
0.08789321780204773,
0.07012148946523666,
0.04342355951666832,
-0.058009400963783264,
-0.04866177216172218,
-0.03920764848589897,
0.18731571733951569,
-0.17453256249427795,
-0.0540112666785717
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4526
- Wer: 0.3411
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.7503 | 4.0 | 500 | 2.4125 | 1.0006 |
| 0.9595 | 8.0 | 1000 | 0.4833 | 0.4776 |
| 0.3018 | 12.0 | 1500 | 0.4333 | 0.4062 |
| 0.1751 | 16.0 | 2000 | 0.4474 | 0.3697 |
| 0.1288 | 20.0 | 2500 | 0.4445 | 0.3558 |
| 0.1073 | 24.0 | 3000 | 0.4695 | 0.3464 |
| 0.0816 | 28.0 | 3500 | 0.4526 | 0.3411 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
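
The card lacks a usage example, so a minimal sketch with the high-level `pipeline` API follows. It assumes the repository id from this row's metadata resolves on the Hub; the audio file name is a placeholder.

```python
# Minimal sketch via the ASR pipeline (repo id from this row's metadata; audio path is a placeholder).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Raintree/wav2vec2-base-timit-demo-colab",
)
print(asr("sample.flac")["text"])
```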
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-base-timit-demo-colab", "results": []}]} | automatic-speech-recognition | Raintree/wav2vec2-base-timit-demo-colab | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-base-timit-demo-colab
==============================
This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4526
* Wer: 0.3411
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 32
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 30
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.9.0+cu111
* Datasets 1.13.3
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
56,
130,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
-0.10640588402748108,
0.09771128743886948,
-0.0034335532691329718,
0.057541269809007645,
0.1105281189084053,
-0.021963700652122498,
0.12820348143577576,
0.1463197022676468,
-0.11004442721605301,
0.06768353283405304,
0.1260748654603958,
0.15099292993545532,
0.040930747985839844,
0.14701002836227417,
-0.05065707117319107,
-0.2882441282272339,
0.04405807703733444,
0.038903746753931046,
-0.021033817902207375,
0.1266198456287384,
0.0861131101846695,
-0.12706004083156586,
0.05149313062429428,
0.03845204785466194,
-0.16357487440109253,
-0.0018441422143951058,
-0.006707940250635147,
-0.10699240863323212,
0.12129245698451996,
0.008919361047446728,
0.07451750338077545,
0.047646522521972656,
0.06402324885129929,
-0.21334382891654968,
0.008781387470662594,
0.04449189081788063,
0.03004845231771469,
0.07366485148668289,
0.05680985748767853,
-0.025479624047875404,
0.11165855824947357,
-0.07480572909116745,
0.08517567813396454,
0.032661713659763336,
-0.10087691992521286,
-0.3011643886566162,
-0.08567805588245392,
0.043587576597929,
0.07214099913835526,
0.08923671394586563,
-0.008419583551585674,
0.1485128551721573,
-0.05827116221189499,
0.11257786303758621,
0.2787940800189972,
-0.31155991554260254,
-0.04564674198627472,
-0.04588843137025833,
0.05812520161271095,
0.061157334595918655,
-0.09401797503232956,
-0.022297414019703865,
0.009937435388565063,
0.04667022451758385,
0.1355445235967636,
-0.015669137239456177,
-0.06422367691993713,
-0.009230914525687695,
-0.15015335381031036,
-0.06244037300348282,
0.11709445714950562,
0.016277402639389038,
-0.04013063758611679,
-0.09751667082309723,
-0.05472748726606369,
-0.20284095406532288,
-0.06831064075231552,
-0.016451053321361542,
0.04098217189311981,
-0.047472886741161346,
-0.1058269739151001,
-0.01422939170151949,
-0.06743406504392624,
-0.07247733324766159,
-0.040944766253232956,
0.19562356173992157,
0.061776719987392426,
0.0013483073562383652,
-0.04288426414132118,
0.07316536456346512,
-0.019338475540280342,
-0.14017395675182343,
-0.021631380543112755,
0.03525954484939575,
-0.018385576084256172,
-0.014312688261270523,
-0.04624345153570175,
-0.07036095857620239,
0.018238574266433716,
0.1578802615404129,
-0.1090763732790947,
0.09871631860733032,
-0.014398466795682907,
0.0389573760330677,
-0.10216960310935974,
0.20683316886425018,
-0.03815002739429474,
0.028429314494132996,
-0.007571348454803228,
0.05297616496682167,
0.029055306687951088,
-0.027179136872291565,
-0.09340192377567291,
0.035700999200344086,
0.1197645366191864,
0.045321281999349594,
-0.0488336980342865,
0.0640021413564682,
-0.027382684871554375,
-0.010363583452999592,
0.0034343143925070763,
-0.11505693942308426,
0.03481103479862213,
0.018726415932178497,
-0.06466829031705856,
0.0017118332907557487,
0.01944267936050892,
0.005134872160851955,
-0.0597236230969429,
0.0853591188788414,
-0.05684848129749298,
0.03206822648644447,
-0.060851484537124634,
-0.127508282661438,
0.023264024406671524,
-0.10862229764461517,
0.000018451528376317583,
-0.10122515261173248,
-0.09540826827287674,
-0.011269135400652885,
0.04148660972714424,
-0.03353361040353775,
-0.029293352738022804,
-0.0715332180261612,
-0.09367574006319046,
0.04343336075544357,
-0.037043359130620956,
0.07542571425437927,
-0.07133171707391739,
0.0951678454875946,
0.03330225870013237,
0.08396153897047043,
-0.016232365742325783,
0.06149870902299881,
-0.06770718842744827,
0.026836765930056572,
-0.20918627083301544,
0.07701179385185242,
-0.0917908102273941,
0.05700808763504028,
-0.12207861989736557,
-0.11878596246242523,
0.031026426702737808,
-0.006307382136583328,
0.1000051349401474,
0.08978221565485,
-0.17136774957180023,
-0.08744776248931885,
0.20677068829536438,
-0.08283834904432297,
-0.0808715969324112,
0.12201759964227676,
-0.02802157960832119,
-0.005424817558377981,
0.05528951808810234,
0.25901105999946594,
0.058696892112493515,
-0.12057512253522873,
0.016039883717894554,
-0.040200766175985336,
0.04574538394808769,
-0.04015674814581871,
0.05632656812667847,
-0.02192801982164383,
0.07210502028465271,
0.01774965599179268,
-0.006019867490977049,
0.04009164124727249,
-0.08762607723474503,
-0.07627437263727188,
-0.04357533156871796,
-0.07746562361717224,
0.021547146141529083,
0.03492262214422226,
0.0661718025803566,
-0.11783131957054138,
-0.10885369777679443,
0.04991709813475609,
0.08153003454208374,
-0.10595923662185669,
0.07270356267690659,
-0.1201249361038208,
0.08163260668516159,
-0.00867320504039526,
-0.004810233134776354,
-0.19425341486930847,
0.03621065244078636,
0.035835057497024536,
-0.028249740600585938,
0.040676966309547424,
-0.06176110357046127,
0.07649286836385727,
0.04797812178730965,
-0.02389243245124817,
-0.04567285254597664,
-0.007896256633102894,
0.012307683937251568,
-0.08763503283262253,
-0.203860342502594,
-0.03678692877292633,
-0.03940552473068237,
0.07178790122270584,
-0.13286733627319336,
0.03484516963362694,
0.07057645171880722,
0.09325028210878372,
0.02888607792556286,
-0.0298792514950037,
-0.002233546692878008,
0.09070941060781479,
-0.01697530597448349,
-0.06449723988771439,
0.05813680961728096,
0.02202554978430271,
-0.08340513706207275,
0.04000019282102585,
-0.15013442933559418,
0.1301785707473755,
0.14514361321926117,
-0.0107007110491395,
-0.06625023484230042,
0.0001875104062492028,
-0.04801234230399132,
-0.032888833433389664,
-0.001175106968730688,
0.03853281959891319,
0.22919690608978271,
0.012960275635123253,
0.14455290138721466,
-0.09023592621088028,
-0.04139864072203636,
0.04819865897297859,
-0.022212866693735123,
-0.0024588708765804768,
0.11922735720872879,
0.04443280026316643,
-0.05828293785452843,
0.113211490213871,
0.08805679529905319,
-0.08464011549949646,
0.12084656208753586,
-0.06769972294569016,
-0.07189048826694489,
-0.01732848770916462,
0.006361868232488632,
0.024256374686956406,
0.09589849412441254,
-0.1547175496816635,
-0.03819519653916359,
0.02667940780520439,
0.022897880524396896,
0.022295698523521423,
-0.2106976956129074,
0.0136019391939044,
0.033869419246912,
-0.08043181896209717,
-0.04615025967359543,
-0.00021831842605024576,
0.014255208894610405,
0.0938047543168068,
0.01039956510066986,
-0.09635376185178757,
0.009430361911654472,
0.003474973142147064,
-0.0722615122795105,
0.18064726889133453,
-0.12151099741458893,
-0.178731307387352,
-0.10134101659059525,
-0.09060904383659363,
-0.03378106281161308,
-0.0058166468515992165,
0.08953768759965897,
-0.09286920726299286,
-0.041621431708335876,
-0.081912562251091,
-0.019546931609511375,
-0.023570802062749863,
0.04165470972657204,
0.0324203297495842,
-0.013369633816182613,
0.06506458669900894,
-0.11818241328001022,
-0.018049893900752068,
-0.04207919165492058,
0.00041315355338156223,
0.057757116854190826,
0.03970518708229065,
0.105654276907444,
0.15844613313674927,
-0.01657518744468689,
0.04869111627340317,
-0.043518856167793274,
0.18891584873199463,
-0.07464897632598877,
-0.04123559594154358,
0.1161477193236351,
-0.0068075694143772125,
0.06927806884050369,
0.11264190822839737,
0.048625648021698,
-0.09347929805517197,
-0.01276784110814333,
0.000046683162508998066,
-0.04793641343712807,
-0.21658442914485931,
-0.03590233996510506,
-0.045677196234464645,
-0.01133507676422596,
0.10377474874258041,
0.041035450994968414,
0.027647756040096283,
0.020697973668575287,
0.03380267694592476,
0.0017526031006127596,
0.005862936843186617,
0.09560928493738174,
0.13224172592163086,
0.03919536992907524,
0.13437016308307648,
-0.03982815518975258,
-0.03893489018082619,
0.029791386798024178,
-0.0001800684694899246,
0.23521730303764343,
0.013407692313194275,
0.17888042330741882,
0.05867830663919449,
0.1679839938879013,
0.043027833104133606,
0.06906022131443024,
-0.0062215132638812065,
-0.010554403066635132,
0.010916383937001228,
-0.050435975193977356,
-0.043181661516427994,
0.019345011562108994,
0.022811638191342354,
0.008107840083539486,
-0.11129123717546463,
-0.007947501726448536,
0.043302785605192184,
0.34869182109832764,
0.022401053458452225,
-0.33552753925323486,
-0.09278880804777145,
-0.01251181960105896,
-0.09067121148109436,
-0.03115379624068737,
0.04532421752810478,
0.088435597717762,
-0.07880674302577972,
0.06529868394136429,
-0.06190440058708191,
0.09082548320293427,
-0.05823845416307449,
0.03233180567622185,
0.03736070543527603,
0.07563943415880203,
0.007575155235826969,
0.03392147272825241,
-0.2950969338417053,
0.27588948607444763,
0.0028785683680325747,
0.07531854510307312,
-0.060946762561798096,
0.010985375382006168,
0.02178710326552391,
0.019592003896832466,
0.07998023182153702,
-0.025404002517461777,
-0.12191168963909149,
-0.1675414890050888,
-0.09168579429388046,
0.016866102814674377,
0.12645313143730164,
0.02612100914120674,
0.10946983844041824,
-0.008734775707125664,
-0.014122897759079933,
0.05109573155641556,
-0.10221074521541595,
-0.06128668040037155,
-0.09769386053085327,
0.012325284071266651,
0.07038436830043793,
0.023432813584804535,
-0.07543168216943741,
-0.10609906166791916,
-0.08295411616563797,
0.15618997812271118,
-0.042347196489572525,
-0.048327092081308365,
-0.11996669322252274,
0.004912687931209803,
0.11011766642332077,
-0.07749336957931519,
0.06498318165540695,
0.010059035383164883,
0.1028977707028389,
0.008652159944176674,
-0.07126186043024063,
0.11814344674348831,
-0.0646694153547287,
-0.16545817255973816,
-0.026054196059703827,
0.1427847146987915,
0.031616292893886566,
0.061293985694646835,
-0.009213453158736229,
0.03712616115808487,
-0.02372128702700138,
-0.08020585775375366,
0.03510110080242157,
0.0256874468177557,
0.0485478974878788,
-0.017406124621629715,
-0.016933687031269073,
-0.004238099791109562,
-0.08727806061506271,
-0.015175157226622105,
0.20685283839702606,
0.2379787117242813,
-0.0953323170542717,
0.0957922488451004,
0.06328006088733673,
-0.0384887158870697,
-0.17185711860656738,
-0.009605915285646915,
0.07351277768611908,
0.00037806396721862257,
-0.027554845437407494,
-0.19317594170570374,
0.025285327807068825,
0.0660187304019928,
-0.020979929715394974,
0.0784062072634697,
-0.31843462586402893,
-0.14132806658744812,
0.14413057267665863,
0.11953861266374588,
0.05501333251595497,
-0.14563286304473877,
-0.05179552361369133,
-0.011198657564818859,
-0.09895560145378113,
0.10181966423988342,
-0.07804198563098907,
0.13386157155036926,
-0.022614730522036552,
0.09509599953889847,
0.010499450378119946,
-0.06038369610905647,
0.10584000498056412,
0.0060570440255105495,
0.05835656821727753,
-0.04650159925222397,
0.022400571033358574,
0.050065796822309494,
-0.06280914694070816,
0.05315794423222542,
-0.07800884544849396,
0.03235083818435669,
-0.08522062748670578,
-0.030850041657686234,
-0.08659930527210236,
0.012173490598797798,
-0.010321114212274551,
-0.0300687737762928,
-0.03952309116721153,
0.0002494970103725791,
0.06482293456792831,
-0.013244900852441788,
0.15721343457698822,
-0.023917188867926598,
0.12576250731945038,
0.1658579409122467,
0.10229071229696274,
-0.10426083207130432,
-0.07592219859361649,
0.00642378581687808,
-0.031161248683929443,
0.055758751928806305,
-0.12279412895441055,
0.03744971752166748,
0.13685686886310577,
0.03014572523534298,
0.11683246493339539,
0.06862305104732513,
-0.0672721117734909,
0.030456043779850006,
0.039927106350660324,
-0.1400262713432312,
-0.12703654170036316,
0.015178284607827663,
0.03806147351861,
-0.06858588010072708,
0.07535747438669205,
0.11128807067871094,
-0.05709114670753479,
-0.01643543131649494,
-0.001228052075020969,
0.01279000099748373,
-0.03955138474702835,
0.2031201422214508,
0.039545685052871704,
0.06405656039714813,
-0.1256726086139679,
0.07743068784475327,
0.04033442586660385,
-0.13092999160289764,
0.06164563074707985,
0.10930138826370239,
-0.09659130871295929,
-0.02714560367166996,
0.037367235869169235,
0.10454843938350677,
-0.026293322443962097,
-0.0731661468744278,
-0.14125937223434448,
-0.1491105556488037,
0.11141172051429749,
0.205085888504982,
0.05567866563796997,
0.010295015759766102,
-0.06159357354044914,
0.016225507482886314,
-0.11900831013917923,
0.07173449546098709,
0.04133453220129013,
0.06150741130113602,
-0.12358707934617996,
0.15171471238136292,
0.020250022411346436,
0.04270557686686516,
-0.015975553542375565,
-0.009086045436561108,
-0.11366970837116241,
0.041813526302576065,
-0.13140425086021423,
0.004951505456119776,
-0.06299158185720444,
0.0037018402945250273,
0.0011602211743593216,
-0.04741319641470909,
-0.06240374967455864,
0.03812229260802269,
-0.12044806778430939,
-0.020884212106466293,
0.00018444182933308184,
0.02935892343521118,
-0.12602365016937256,
-0.011495986022055149,
0.012894248589873314,
-0.09291935712099075,
0.09455393999814987,
0.08733458071947098,
-0.03368100896477699,
0.052464794367551804,
-0.05295100435614586,
-0.03293954208493233,
0.08180329948663712,
-0.0019604337867349386,
0.0526864118874073,
-0.13419924676418304,
-0.019534463062882423,
0.015528376214206219,
0.03214772790670395,
0.021061887964606285,
0.11334100365638733,
-0.1140793040394783,
0.0009906282648444176,
-0.030028102919459343,
-0.051941994577646255,
-0.07041387259960175,
0.05105661228299141,
0.1064593493938446,
0.031913094222545624,
0.16713285446166992,
-0.09319768100976944,
0.027813319116830826,
-0.1639295369386673,
0.0031311260536313057,
-0.01597680151462555,
-0.1275714784860611,
-0.04541504755616188,
-0.02791511081159115,
0.0787658765912056,
-0.06478916853666306,
0.1318107694387436,
-0.02907685935497284,
0.02602233551442623,
0.03713091462850571,
-0.07810395210981369,
-0.048124101012945175,
0.040964193642139435,
0.20325621962547302,
0.03787298873066902,
-0.03905526176095009,
0.07691480964422226,
0.020969225093722343,
0.07848162949085236,
0.12166932225227356,
0.17433315515518188,
0.16070599853992462,
0.060576241463422775,
0.11233191192150116,
0.054408494383096695,
-0.054204944521188736,
-0.1696946769952774,
0.08760861307382584,
-0.059450868517160416,
0.13082678616046906,
-0.014051316305994987,
0.23533952236175537,
0.12472600489854813,
-0.15240222215652466,
0.06893642991781235,
-0.018605897203087807,
-0.0919085443019867,
-0.11656323820352554,
-0.061820678412914276,
-0.08443712443113327,
-0.1736930012702942,
0.0066584637388587,
-0.10433949530124664,
0.06072872877120972,
0.04600616171956062,
0.037953607738018036,
0.01771189086139202,
0.137831911444664,
0.019372500479221344,
0.0005868753651157022,
0.08849018067121506,
-0.003388514742255211,
-0.05332401394844055,
-0.06731586158275604,
-0.0815291628241539,
0.03592732176184654,
-0.009441417641937733,
0.05801645293831825,
-0.00553080765530467,
-0.06847043335437775,
0.049337439239025116,
-0.03670436888933182,
-0.09336602687835693,
0.023424459621310234,
0.022909704595804214,
0.07216547429561615,
0.04732304438948631,
0.033614613115787506,
-0.04061725735664368,
-0.003745136084035039,
0.19170570373535156,
-0.09393376857042313,
-0.10174193978309631,
-0.10633710771799088,
0.25861942768096924,
0.04082251340150833,
-0.01715751737356186,
0.023206403478980064,
-0.058455392718315125,
-0.036404017359018326,
0.20730118453502655,
0.1769707053899765,
-0.011753653176128864,
0.005337652284651995,
-0.015791291370987892,
-0.0063401791267097,
-0.03948437422513962,
0.0863228514790535,
0.1559089869260788,
0.06519223749637604,
-0.06195035204291344,
-0.057215407490730286,
-0.05186375603079796,
-0.03217488154768944,
-0.06983182579278946,
0.07437513768672943,
0.011950604617595673,
-0.026249436661601067,
-0.04062502458691597,
0.06199711933732033,
-0.09421223402023315,
-0.08527499437332153,
0.018194066360592842,
-0.1911003440618515,
-0.15332934260368347,
0.00648321071639657,
0.07374507933855057,
0.007925632409751415,
0.03417455777525902,
0.004493322689086199,
-0.009067763574421406,
0.08154499530792236,
0.0004704966559074819,
-0.07994125783443451,
-0.06237576901912689,
0.09134040027856827,
-0.14275211095809937,
0.16057169437408447,
-0.03871241584420204,
0.05013453587889671,
0.121788389980793,
0.0890921950340271,
-0.0756404921412468,
0.08962827920913696,
0.044020019471645355,
-0.11242163181304932,
0.02239614725112915,
0.1542891263961792,
-0.03450491279363632,
0.09236074984073639,
0.030090294778347015,
-0.11805007606744766,
0.01199942734092474,
-0.09854710847139359,
-0.037966288626194,
-0.03843193128705025,
-0.0487128384411335,
-0.04689279571175575,
0.10637468099594116,
0.16688768565654755,
-0.04306791350245476,
0.006232090760022402,
-0.056046146899461746,
0.011638638563454151,
0.044987235218286514,
0.000837084196973592,
-0.06141325831413269,
-0.2794519364833832,
0.010977210476994514,
0.03251880779862404,
0.0033882218413054943,
-0.2511526346206665,
-0.09287230670452118,
0.011513251811265945,
-0.04673747718334198,
-0.08506464213132858,
0.08696034550666809,
0.07247745245695114,
0.04534013196825981,
-0.05637137591838837,
-0.05226455628871918,
-0.040988482534885406,
0.18743151426315308,
-0.17597970366477966,
-0.05593551695346832
] |
null | null | transformers |
# pegasus-sports-titles
This model is a Pegasus model fine-tuned on **sports news articles scraped from the internet (for educational purposes only)**. It can generate titles for sports articles. Try it out using the inference API.
## Model description
A Pegasus model tuned for generating scientific titles has been further fine-tuned to generate titles for sports articles. During training, articles on **Tennis, Football (Soccer), Cricket, Athletics and Rugby** were used to generate titles. I experimented with training the tokenizer from scratch, but it did not give good results compared to the pre-trained tokenizer.
## Usage
```python
from transformers import pipeline
#Feel free to play around with the generation parameters.
#Reduce the beam width for faster inference
#Note that the maximum length for the generated titles is 64
gen_kwargs = {"length_penalty": 0.6, "num_beams":4, "num_return_sequences": 4,"num_beam_groups":4,"diversity_penalty":2.0}
pipe = pipeline("summarization", model="RajSang/pegasus-sports-titles")
#Change the article according to your wish
article="""
Coutinho was just about to be introduced by Villa boss Gerrard midway through the second half when Bruno Fernandes slammed home
his second goal of the game off the underside of the bar. But the Brazilian proved the catalyst for a memorable response.
First he drove at the United defence, helping to create the space which Jacob Ramsey exploited to halve the deficit. Then Ramsey slid over an excellent
cross from the left which Raphael Varane was unable to intercept as he slid back, leaving Coutinho to finish into an empty net.
The goal brought celebrations at both ends of the pitch as Emiliano Martinez also went into the crowd in relief - it was the Argentine's horrible sixth-minute error that had gifted Fernandes the visitors' opener.
Given his background - with Liverpool, Barcelona and Bayern Munich - Coutinho is a bold loan signing by Villa, and underlines the pedigree of the man they appointed as manager in November.
Gerrard is not at Villa to learn how to avoid relegation.
His demands remain as high as they were as a player and Coutinho's arrival is an example of that.
Villa are a better team since Gerrard's arrival and, after a sluggish start against opponents they dominated but lost to in the FA Cup five days ago, they grew into the game.
The club's other newboy, Lucas Digne, was among those denied by United keeper David de Gea at the end of the first half - in unorthodox fashion, with his knees.
Ollie Watkins did not really test the Spain keeper when Villa broke after Edinson Cavani lost possession in his own half. However, Emi Buendia certainly did with a near-post header. Rooted to his line, De Gea's reactions were up to the job as he beat Buendia's effort away.
When De Gea produced more saves after half-time to deny Ramsey and Digne again, it appeared the image of the night for Villa would be midfielder Morgan Sanson kicking a drinks bottle in fury after his error in gifting Fred possession to set up Fernandes for the visitors' second had been followed immediately by his substitution.
However, as it was the prelude to Coutinho's arrival, it was the moment that changed the course of the game - and the acclaim for the Brazilian at the final whistle indicated Villa's fans are already firmly behind him.
"""
result=pipe(article, **gen_kwargs)[0]["summary_text"]
print(result)
''' Output
Title 1 :
Coutinho's arrival sparks Villa comeback
Title 2 :
Philippe Coutinho marked his debut for Aston Villa with a goal and an assist as Steven Gerrard's side came from two goals down to draw with Manchester United.
Title 3 :
Steven Gerrard's first game in charge of Aston Villa ended in a dramatic draw against Manchester United - but it was the arrival of Philippe Coutinho that marked the night.
Title 4 :
Liverpool loanee Philippe Coutinho marked his first appearance for Aston Villa with two goals as Steven Gerrard's side came from two goals down to draw 2-2.'''
```
## Training procedure
During training, **short titles were combined with the article subtitles to improve the quality of the generated titles, and the subtitles were removed from the main body of the articles.**
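A minimal sketch of what that preprocessing could look like is shown below; the field names (`title`, `subtitle`, `body`) and the joining format are illustrative assumptions, not the original scraping pipeline.

```python
# Illustrative preprocessing sketch -- field names and formatting are assumptions,
# not the actual pipeline used to build the training set.
def build_training_pair(article):
    # Combine the short title with the subtitle to form a richer target title.
    target_title = f"{article['title']} - {article['subtitle']}"
    # Remove the subtitle from the main body so the model cannot simply copy it.
    body = article["body"].replace(article["subtitle"], "").strip()
    return {"text": body, "summary": target_title}

example = {
    "title": "Coutinho sparks Villa comeback",
    "subtitle": "Brazilian inspires 2-2 draw with Manchester United",
    "body": "Brazilian inspires 2-2 draw with Manchester United. Coutinho was just about to be introduced ...",
}
print(build_training_pair(example))
```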
## Limitations
In rare cases, if the opening few lines of a passage/article are descriptive enough, the model simply copies those lines instead of drawing on information further down the article, which can produce less informative titles.
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 2
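As a rough sketch, the hyperparameters above could be expressed as `Seq2SeqTrainingArguments` along the following lines; the output directory is illustrative, and the dataset, tokenizer and `Seq2SeqTrainer` setup are omitted.

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above; output_dir is illustrative.
training_args = Seq2SeqTrainingArguments(
    output_dir="pegasus-sports-titles",
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,  # effective train batch size of 16
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=2,
)
```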
### Training results
**Rouge1: 38.2315**
**Rouge2: 18.6598**
**RougeL: 31.7393**
**RougeLsum: 31.7086**
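For reference, ROUGE F1 scores like these can be computed with the `datasets` ROUGE metric (a sketch, assuming the `rouge_score` package is installed; the predictions and references below are placeholders, not the actual test set).

```python
from datasets import load_metric

# Placeholder predictions/references -- in practice these come from the model
# and the held-out test set.
predictions = ["Coutinho's arrival sparks Villa comeback"]
references = ["Philippe Coutinho inspires Aston Villa comeback against Manchester United"]

rouge = load_metric("rouge")  # requires the `rouge_score` pip package
scores = rouge.compute(predictions=predictions, references=references)
print({name: round(score.mid.fmeasure * 100, 4) for name, score in scores.items()})
```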
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
| {"language": "en", "tags": ["generated_from_trainer"], "widget": [{"text": "Coutinho was just about to be introduced by Villa boss Gerrard midway through the second half when Bruno Fernandes slammed home his second goal of the game off the underside of the bar. But the Brazilian proved the catalyst for a memorable response. First he drove at the United defence, helping to create the space which Jacob Ramsey exploited to halve the deficit. Then Ramsey slid over an excellent cross from the left which Raphael Varane was unable to intercept as he slid back, leaving Coutinho to finish into an empty net. The goal brought celebrations at both ends of the pitch as Emiliano Martinez also went into the crowd in relief - it was the Argentine's horrible sixth-minute error that had gifted Fernandes the visitors' opener. Given his background - with Liverpool, Barcelona and Bayern Munich - Coutinho is a bold loan signing by Villa, and underlines the pedigree of the man they appointed as manager in November. Gerrard is not at Villa to learn how to avoid relegation. His demands remain as high as they were as a player and Coutinho's arrival is an example of that. Villa are a better team since Gerrard's arrival and, after a sluggish start against opponents they dominated but lost to in the FA Cup five days ago, they grew into the game. The club's other newboy, Lucas Digne, was among those denied by United keeper David de Gea at the end of the first half - in unorthodox fashion, with his knees. Ollie Watkins did not really test the Spain keeper when Villa broke after Edinson Cavani lost possession in his own half. However, Emi Buendia certainly did with a near-post header. Rooted to his line, De Gea's reactions were up to the job as he beat Buendia's effort away. When De Gea produced more saves after half-time to deny Ramsey and Digne again, it appeared the image of the night for Villa would be midfielder Morgan Sanson kicking a drinks bottle in fury after his error in gifting Fred possession to set up Fernandes for the visitors' second had been followed immediately by his substitution. However, as it was the prelude to Coutinho's arrival, it was the moment that changed the course of the game - and the acclaim for the Brazilian at the final whistle indicated Villa's fans are already firmly behind him."}]} | text2text-generation | RajSang/pegasus-sports-titles | [
"transformers",
"pytorch",
"tensorboard",
"pegasus",
"text2text-generation",
"generated_from_trainer",
"en",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #tensorboard #pegasus #text2text-generation #generated_from_trainer #en #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# pegasus-sports-titles
This model is a fine-tuned pegasus on some sports news articles scraped from the internet. (For educational purposes only). The model can generate titles for sports articles. Try it out using the inference API.
## Model description
A Pegasus model tuned on generating scientific titles has been further fine-tuned to generate titles for sports articles. While training articles on Tennis, Football (Soccer), Cricket , Athletics and Rugby were used to generate titles. I experimented training the Tokenizer from scratch but it did not give good results compared to the pre-trained tokenizer.
## Usage
## Training procedure
While training, short titles were combined with the subtitles for the articles to improve the quality of the generated titles and the subtitles were removed from the main body of the articles.
##Limitations
In rare cases, if the opening few lines of a passage/article are descriptive enough, the model often just copies these lines instead of looking for information further down the articles, which may not be conducive in some cases.
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 2
### Training results
Rouge1:38.2315
Rouge2: 18.6598
RougueL: 31.7393
RougeLsum: 31.7086
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
| [
"# pegasus-sports-titles\n\nThis model is a fine-tuned pegasus on some sports news articles scraped from the internet. (For educational purposes only). The model can generate titles for sports articles. Try it out using the inference API.",
"## Model description\n\nA Pegasus model tuned on generating scientific titles has been further fine-tuned to generate titles for sports articles. While training articles on Tennis, Football (Soccer), Cricket , Athletics and Rugby were used to generate titles. I experimented training the Tokenizer from scratch but it did not give good results compared to the pre-trained tokenizer.",
"## Usage",
"## Training procedure\nWhile training, short titles were combined with the subtitles for the articles to improve the quality of the generated titles and the subtitles were removed from the main body of the articles.",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 100\n- num_epochs: 2",
"### Training results\n\nRouge1:38.2315\n\nRouge2: 18.6598\n\nRougueL: 31.7393\n\nRougeLsum: 31.7086",
"### Framework versions\n\n- Transformers 4.15.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #pegasus #text2text-generation #generated_from_trainer #en #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# pegasus-sports-titles\n\nThis model is a fine-tuned pegasus on some sports news articles scraped from the internet. (For educational purposes only). The model can generate titles for sports articles. Try it out using the inference API.",
"## Model description\n\nA Pegasus model tuned on generating scientific titles has been further fine-tuned to generate titles for sports articles. While training articles on Tennis, Football (Soccer), Cricket , Athletics and Rugby were used to generate titles. I experimented training the Tokenizer from scratch but it did not give good results compared to the pre-trained tokenizer.",
"## Usage",
"## Training procedure\nWhile training, short titles were combined with the subtitles for the articles to improve the quality of the generated titles and the subtitles were removed from the main body of the articles.",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 100\n- num_epochs: 2",
"### Training results\n\nRouge1:38.2315\n\nRouge2: 18.6598\n\nRougueL: 31.7393\n\nRougeLsum: 31.7086",
"### Framework versions\n\n- Transformers 4.15.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
57,
57,
83,
3,
45,
128,
30,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #pegasus #text2text-generation #generated_from_trainer #en #autotrain_compatible #endpoints_compatible #has_space #region-us \n# pegasus-sports-titles\n\nThis model is a fine-tuned pegasus on some sports news articles scraped from the internet. (For educational purposes only). The model can generate titles for sports articles. Try it out using the inference API.## Model description\n\nA Pegasus model tuned on generating scientific titles has been further fine-tuned to generate titles for sports articles. While training articles on Tennis, Football (Soccer), Cricket , Athletics and Rugby were used to generate titles. I experimented training the Tokenizer from scratch but it did not give good results compared to the pre-trained tokenizer.## Usage## Training procedure\nWhile training, short titles were combined with the subtitles for the articles to improve the quality of the generated titles and the subtitles were removed from the main body of the articles.### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 100\n- num_epochs: 2### Training results\n\nRouge1:38.2315\n\nRouge2: 18.6598\n\nRougueL: 31.7393\n\nRougeLsum: 31.7086### Framework versions\n\n- Transformers 4.15.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] | [
-0.051204804331064224,
0.1533658802509308,
-0.003965212032198906,
0.10351960361003876,
0.12636764347553253,
0.010245691984891891,
0.06375756114721298,
0.14067721366882324,
-0.0801045149564743,
0.09147023409605026,
0.05975127965211868,
0.02341228350996971,
0.08634038269519806,
0.08342236280441284,
0.052331455051898956,
-0.28506967425346375,
0.025497136637568474,
-0.028933102265000343,
-0.09699571877717972,
0.09790021926164627,
0.14678363502025604,
-0.08974453061819077,
0.03733355179429054,
0.017084771767258644,
-0.09620308130979538,
0.07141076773405075,
0.022099420428276062,
-0.0429505854845047,
0.04876390099525452,
0.0567534863948822,
0.09228964149951935,
0.008414268493652344,
0.03896806761622429,
-0.3011874556541443,
0.016173074021935463,
0.08482731878757477,
0.020193468779325485,
0.04259411245584488,
0.11360904574394226,
0.08105073124170303,
0.17033833265304565,
-0.12390332669019699,
0.11086191982030869,
0.05006958171725273,
-0.10599063336849213,
-0.1762869954109192,
-0.132168248295784,
0.03803267329931259,
0.04873809963464737,
0.10910464823246002,
-0.058461327105760574,
0.09496629983186722,
-0.07602719217538834,
0.006690299138426781,
0.24415910243988037,
-0.29717913269996643,
-0.050928495824337006,
0.06830663979053497,
0.01721721701323986,
0.1037709042429924,
-0.14090798795223236,
-0.03889011591672897,
0.008492100983858109,
0.01929929479956627,
0.03835154324769974,
0.01870102807879448,
-0.0023703824263066053,
-0.003651183331385255,
-0.07315206527709961,
-0.07298579066991806,
0.040615588426589966,
0.06140352413058281,
-0.07026313990354538,
-0.16933095455169678,
0.020516412332654,
-0.06384347379207611,
-0.07631530612707138,
-0.06811977177858353,
0.0690809115767479,
-0.0019911592826247215,
0.03022916615009308,
-0.06497714668512344,
-0.08912385255098343,
-0.011678607203066349,
-0.005695635452866554,
0.07198204100131989,
0.04264694079756737,
0.005697836633771658,
-0.04548503831028938,
0.08231522142887115,
0.051265865564346313,
-0.10732749849557877,
-0.010328610427677631,
-0.05507670342922211,
-0.11468394845724106,
-0.03272511437535286,
-0.0814034715294838,
-0.12073300778865814,
-0.04307179898023605,
0.15852969884872437,
-0.05428345128893852,
0.06686630100011826,
0.05881551280617714,
-0.010991747491061687,
-0.04236113652586937,
0.12612609565258026,
-0.09321285784244537,
-0.0696774274110794,
0.009132607840001583,
0.07937709987163544,
-0.0006285727140493691,
-0.027891485020518303,
-0.026003938168287277,
0.03514189273118973,
0.005314412526786327,
0.009542889893054962,
0.00894242525100708,
0.03290160000324249,
-0.04564634710550308,
-0.05337221920490265,
0.10680519789457321,
-0.11047002673149109,
0.03646594658493996,
-0.027467329055070877,
-0.030047429725527763,
0.018346412107348442,
-0.007718958426266909,
0.029890334233641624,
-0.04822685942053795,
0.078607939183712,
-0.1329057514667511,
-0.013072596862912178,
-0.06541264057159424,
-0.05262697488069534,
0.005097031593322754,
0.008228226564824581,
-0.05247759073972702,
-0.028179384768009186,
-0.07956086844205856,
-0.10317204147577286,
0.02599547617137432,
-0.02575874887406826,
-0.09109053760766983,
-0.05085977539420128,
-0.03253442421555519,
0.0063898004591465,
0.05531087517738342,
0.009748842567205429,
-0.012302568182349205,
0.049836643040180206,
-0.12997345626354218,
0.048484839498996735,
-0.056583281606435776,
0.07154254615306854,
-0.11447302997112274,
-0.0028582930099219084,
-0.1856631189584732,
0.09072580933570862,
-0.10762651264667511,
-0.03693348541855812,
-0.11655531823635101,
-0.056898508220911026,
-0.04970043525099754,
0.032828155905008316,
0.035178523510694504,
0.08784875273704529,
-0.11179734766483307,
-0.09841187298297882,
0.20130622386932373,
-0.09686654061079025,
-0.04551083967089653,
0.10371405631303787,
-0.04862196370959282,
-0.007843118160963058,
0.09775650501251221,
0.10429754853248596,
0.13142433762550354,
-0.05323605611920357,
-0.004331636242568493,
-0.054873276501894,
0.061232972890138626,
0.11091543734073639,
0.06770476698875427,
-0.06528297811746597,
0.06827761232852936,
0.0244520865380764,
-0.07605323940515518,
-0.05124102532863617,
-0.028619172051548958,
-0.08357549458742142,
-0.003309510415419936,
-0.02582351118326187,
0.05124179646372795,
0.052634768187999725,
0.059049103409051895,
-0.022777395322918892,
-0.1269734650850296,
0.022913403809070587,
0.0785132497549057,
-0.04626181721687317,
0.09382534772157669,
-0.06291010230779648,
0.04004950448870659,
0.034430764615535736,
-0.023958241567015648,
-0.20820888876914978,
-0.1348661333322525,
0.06635401397943497,
-0.1500244140625,
0.08621900528669357,
0.021421989426016808,
0.0021016590762883425,
0.027880366891622543,
-0.060689207166433334,
0.019591964781284332,
-0.15839743614196777,
-0.04132993891835213,
-0.055217478424310684,
-0.14808852970600128,
-0.03224921599030495,
-0.008412521332502365,
0.17272529006004333,
-0.2160094678401947,
0.012442126870155334,
-0.009749148041009903,
0.10856939107179642,
0.07626868039369583,
-0.10331573337316513,
-0.02161966823041439,
0.056482166051864624,
0.02278764545917511,
-0.05736399069428444,
0.032253485172986984,
0.004343345761299133,
-0.07292017340660095,
0.028841672465205193,
-0.043228629976511,
-0.06592631340026855,
0.08249339461326599,
0.006956347730010748,
-0.08955700695514679,
0.004887670744210482,
-0.05651862546801567,
-0.0014991917414590716,
-0.08338578790426254,
-0.031538646668195724,
0.15422332286834717,
0.02750575728714466,
0.09871044009923935,
-0.08589907735586166,
-0.06310992687940598,
0.016121692955493927,
-0.005072044674307108,
0.0023115428630262613,
0.00043378688860684633,
0.13619521260261536,
-0.09113079309463501,
0.10426317900419235,
0.053298983722925186,
0.01725613884627819,
0.16841156780719757,
-0.03965502232313156,
-0.09504089504480362,
0.041855670511722565,
0.0029532541520893574,
-0.03641638159751892,
0.04286612197756767,
-0.015082012861967087,
0.04344436526298523,
0.023640528321266174,
0.03437057137489319,
0.03270166739821434,
-0.09595546871423721,
-0.005227640736848116,
0.07330586016178131,
-0.011528626084327698,
-0.08936966955661774,
0.002087045693770051,
0.03900856524705887,
0.09634404629468918,
0.02557647041976452,
0.08631176501512527,
-0.013274475000798702,
-0.03180934488773346,
-0.07417672127485275,
0.16191984713077545,
-0.10215970128774643,
-0.13449813425540924,
-0.20071497559547424,
0.013277885504066944,
-0.0855243131518364,
0.0041022528894245625,
0.04253444820642471,
-0.08539910614490509,
-0.06139787286520004,
-0.0725535973906517,
0.028615258634090424,
0.041662923991680145,
-0.012301001697778702,
-0.002675158903002739,
0.02619285322725773,
0.021788155660033226,
-0.1067189946770668,
0.03417328745126724,
-0.040012016892433167,
-0.07144229859113693,
-0.055627621710300446,
-0.01871177740395069,
0.12004198879003525,
0.12584643065929413,
-0.034971751272678375,
-0.03746558353304863,
-0.06760764867067337,
0.21864032745361328,
-0.12711338698863983,
0.06916166841983795,
0.1013905256986618,
-0.005809422116726637,
0.10394515842199326,
0.09181145578622818,
0.021889474242925644,
-0.057598188519477844,
0.034140560775995255,
0.10550587624311447,
0.003922896459698677,
-0.2780395746231079,
-0.05371535196900368,
-0.06506774574518204,
-0.06888657808303833,
0.08966155350208282,
0.041437458246946335,
-0.024142269045114517,
0.05847477540373802,
-0.04814961180090904,
0.05725441128015518,
0.042990557849407196,
0.0630234032869339,
0.0525512658059597,
0.06792508810758591,
0.10313937813043594,
-0.05775786191225052,
-0.010403137654066086,
0.10829371213912964,
-0.02291114814579487,
0.22532053291797638,
-0.004564223345369101,
0.17947907745838165,
0.0764961764216423,
0.026329131796956062,
0.002967983949929476,
0.07237748801708221,
0.013002642430365086,
-0.002121082041412592,
-0.05339543893933296,
-0.01540912315249443,
-0.03424067422747612,
0.010441400110721588,
0.003913356456905603,
-0.028573011979460716,
-0.08796732127666473,
-0.023412063717842102,
0.008252676576375961,
0.24663938581943512,
0.0150910010561347,
-0.130192831158638,
-0.04736635088920593,
0.016177687793970108,
-0.09242845326662064,
-0.07427598536014557,
0.014809192158281803,
0.07767031341791153,
-0.174552783370018,
0.06218494847416878,
-0.020419634878635406,
0.06902030855417252,
-0.012127154506742954,
0.0003705417620949447,
0.07856491208076477,
0.027327872812747955,
-0.024848343804478645,
0.07378362119197845,
-0.11197482794523239,
0.1429140567779541,
-0.009467639029026031,
0.026208583265542984,
-0.053847070783376694,
0.054641325026750565,
0.0252385251224041,
0.006010363344103098,
0.10284507274627686,
0.023322682827711105,
-0.1507502794265747,
-0.13180868327617645,
-0.001943129813298583,
0.02014204114675522,
0.1386905312538147,
-0.07152332365512848,
0.14275839924812317,
-0.07296422868967056,
0.053865253925323486,
-0.01596144773066044,
-0.019907638430595398,
-0.08303531259298325,
-0.24204538762569427,
0.029266398400068283,
-0.06513593345880508,
0.007353520952165127,
-0.0693465992808342,
-0.051435552537441254,
-0.048289671540260315,
0.10127738863229752,
-0.009848274290561676,
-0.06657974421977997,
-0.1468980610370636,
0.10931013524532318,
0.07506498694419861,
-0.08814094215631485,
0.0999036356806755,
0.010901771485805511,
0.18329164385795593,
-0.045208241790533066,
-0.009023620747029781,
0.03153834864497185,
-0.00818030908703804,
-0.19318248331546783,
-0.029537351801991463,
0.10975777357816696,
0.060649365186691284,
0.03345923125743866,
0.014323100447654724,
0.04750313609838486,
0.017701586708426476,
-0.12397992610931396,
-0.002030199859291315,
0.019578978419303894,
-0.007227619178593159,
0.012109166011214256,
-0.03643237426877022,
-0.014688480645418167,
-0.04836555942893028,
-0.003467393573373556,
0.13438786566257477,
0.24220241606235504,
-0.09372127056121826,
0.1547299474477768,
0.0442669652402401,
-0.09267912805080414,
-0.2206849902868271,
-0.03174987807869911,
0.10963693261146545,
-0.027961067855358124,
0.10795612633228302,
-0.1806597262620926,
0.14966034889221191,
0.05510576814413071,
-0.016001086682081223,
0.012804369442164898,
-0.14832627773284912,
-0.13131923973560333,
0.02389783039689064,
0.06606566160917282,
0.014400566928088665,
-0.1079910472035408,
-0.03062654845416546,
-0.0396624319255352,
-0.17463606595993042,
0.11683578789234161,
0.040074754506349564,
0.09137159585952759,
-0.012210126034915447,
0.007623705081641674,
0.021990563720464706,
0.0017248477088287473,
0.14352619647979736,
0.04730425029993057,
0.06491297483444214,
-0.02156568504869938,
0.023929674178361893,
0.18817050755023956,
-0.0009801097912713885,
0.031480822712183,
0.0376552976667881,
0.02546304278075695,
-0.20142196118831635,
-0.013543561100959778,
-0.08133640140295029,
-0.039599258452653885,
-0.0754779651761055,
-0.007010200992226601,
-0.09839333593845367,
0.0725373774766922,
0.10914256423711777,
0.009212030097842216,
0.10489621758460999,
-0.04740959405899048,
0.11270678788423538,
0.030000800266861916,
0.1155293807387352,
0.060700733214616776,
-0.1731625348329544,
-0.016401905566453934,
-0.01928783766925335,
0.08135022222995758,
-0.11935562640428543,
0.05897032096982002,
0.0654325932264328,
0.02749624289572239,
0.03582306578755379,
0.029953761026263237,
-0.07357720285654068,
0.019617795944213867,
0.06516566127538681,
-0.06506134569644928,
-0.08942293375730515,
-0.016987081617116928,
0.02865784987807274,
-0.14325858652591705,
-0.10527423769235611,
0.07003414630889893,
0.0075419251807034016,
-0.09489153325557709,
0.02679496258497238,
0.006055835634469986,
-0.0009226284455507994,
0.08348463475704193,
0.06298147886991501,
0.029687462374567986,
-0.07692944258451462,
0.1125626340508461,
0.08391325175762177,
-0.09151708334684372,
-0.02419486828148365,
0.1094101294875145,
-0.10404380410909653,
-0.002623539650812745,
0.0033102959860116243,
0.05945529416203499,
-0.012283370830118656,
-0.03401586040854454,
-0.05898505076766014,
-0.11568471789360046,
0.0764750987291336,
-0.003470713272690773,
0.0352705754339695,
0.013262217864394188,
-0.025945791974663734,
0.01570167765021324,
-0.08620268851518631,
0.06707151979207993,
0.07231781631708145,
0.05666184797883034,
-0.05392002686858177,
-0.04812147840857506,
0.006686978042125702,
-0.009139235131442547,
0.001366285141557455,
0.056723177433013916,
-0.07196204364299774,
-0.05472370609641075,
-0.0008426932035945356,
-0.019687717780470848,
-0.006827933248132467,
-0.02337816171348095,
-0.04424778372049332,
-0.014213284477591515,
0.012694001197814941,
0.026773864403367043,
-0.1019534021615982,
-0.07119546085596085,
-0.0481136292219162,
0.09312596172094345,
-0.1463170349597931,
0.06771976500749588,
0.07532178610563278,
-0.09566763788461685,
0.08283454179763794,
-0.047143518924713135,
0.04386283829808235,
0.05071195960044861,
-0.07737237960100174,
0.030659569427371025,
-0.05610254779458046,
0.003945459611713886,
-0.006477429065853357,
-0.09419821202754974,
0.010680438950657845,
-0.04261601343750954,
-0.006282733753323555,
0.0036110393702983856,
0.004611757583916187,
-0.11941079050302505,
0.048339132219552994,
-0.013878975063562393,
-0.0588967427611351,
-0.024747854098677635,
0.009562010876834393,
0.07881630957126617,
0.037122126668691635,
0.20169192552566528,
-0.039487484842538834,
0.020489221438765526,
-0.24610914289951324,
-0.023687653243541718,
0.006216591224074364,
0.02964053861796856,
-0.09243979305028915,
-0.08631965517997742,
0.04394599422812462,
0.023453325033187866,
0.12511508166790009,
0.010894232429564,
0.006616045720875263,
0.06584767252206802,
0.03583400323987007,
0.05180731415748596,
0.042590901255607605,
0.10196385532617569,
0.05738038942217827,
-0.023495934903621674,
0.08797796070575714,
-0.06440477073192596,
-0.02647523581981659,
0.04556814581155777,
0.20900313556194305,
0.16925525665283203,
0.05112192779779434,
0.04875912144780159,
-0.037931762635707855,
0.05003206431865692,
-0.14804165065288544,
0.15092460811138153,
0.004681417252868414,
-0.009951149113476276,
-0.060660067945718765,
0.10387475788593292,
0.16530218720436096,
-0.23100586235523224,
0.13090820610523224,
0.008581374771893024,
-0.026688843965530396,
-0.13033942878246307,
-0.0821392759680748,
-0.060228340327739716,
-0.16200891137123108,
0.014399646781384945,
-0.13305167853832245,
0.07324101775884628,
0.10471300780773163,
-0.01494764443486929,
0.021417781710624695,
0.13233788311481476,
-0.09424669295549393,
-0.08866886049509048,
0.10523512214422226,
0.000543076079338789,
0.010417154058814049,
0.0022072347346693277,
-0.019942142069339752,
0.02213577926158905,
0.033552177250385284,
0.055777184665203094,
0.001628119614906609,
-0.003934922628104687,
0.024522477760910988,
-0.07053408026695251,
-0.0639776960015297,
0.04265347123146057,
0.01753455586731434,
0.034608036279678345,
0.1117829829454422,
-0.0052814120426774025,
0.026976993307471275,
-0.0483265295624733,
0.24042093753814697,
-0.016259856522083282,
-0.06813944876194,
-0.16337597370147705,
0.17941568791866302,
0.09576726704835892,
0.0331493578851223,
0.02591993287205696,
-0.14323543012142181,
0.04306663200259209,
0.15421557426452637,
0.07719483226537704,
-0.014807130210101604,
-0.020937977358698845,
0.009259821847081184,
0.003094469429925084,
0.01614144630730152,
0.15331733226776123,
0.034658532589673996,
0.08622867614030838,
-0.07905925065279007,
-0.00856984406709671,
-0.047072526067495346,
-0.024943577125668526,
-0.017627345398068428,
0.2395598441362381,
0.07703756541013718,
-0.013132895343005657,
-0.07251139730215073,
0.09384942799806595,
-0.02581721730530262,
-0.20662187039852142,
0.027765434235334396,
-0.11768858134746552,
-0.1329224854707718,
-0.005925533827394247,
-0.007924379780888557,
-0.009469589218497276,
0.08248991519212723,
0.07724139839410782,
-0.0092273885384202,
0.18079760670661926,
0.024267448112368584,
-0.038959044963121414,
-0.09451963752508163,
0.11610668152570724,
-0.15205222368240356,
0.2459089159965515,
0.008830167353153229,
0.023069359362125397,
0.0492839552462101,
0.028665408492088318,
-0.07381969690322876,
-0.0012809612089768052,
0.03276650980114937,
0.04200900346040726,
-0.024221127852797508,
0.14116178452968597,
0.006237372290343046,
0.06843998283147812,
0.05333347246050835,
-0.10475920140743256,
0.07490263879299164,
-0.1065886914730072,
-0.13046345114707947,
-0.06073246896266937,
0.090157151222229,
-0.10175168514251709,
0.11747384816408157,
0.2034623622894287,
-0.03620896488428116,
0.05163038522005081,
-0.0426299087703228,
0.00926101766526699,
0.01784079149365425,
0.12238649278879166,
-0.01762167178094387,
-0.13068412244319916,
0.03278927505016327,
0.03187327831983566,
0.018930554389953613,
-0.21756528317928314,
-0.038558345288038254,
0.04128814861178398,
-0.021634938195347786,
-0.0388740636408329,
0.11996010690927505,
-0.003944301512092352,
-0.04189774766564369,
-0.0355246439576149,
-0.16695831716060638,
-0.01834658533334732,
0.1274738758802414,
-0.08000975847244263,
-0.008833781816065311
] |
null | null | transformers |
# NepaliBERT(Phase 1)
NepaliBERT is a state-of-the-art language model for Nepali based on the BERT model. The model is trained using masked language modeling (MLM).
# Loading the model and tokenizer
1. Clone the model repo
```
git lfs install
git clone https://huggingface.co/Rajan/NepaliBERT
```
2. Loading the Tokenizer
```
from transformers import BertTokenizer
vocab_file_dir = './NepaliBERT/'
tokenizer = BertTokenizer.from_pretrained(vocab_file_dir,
                                          strip_accents=False,
                                          clean_text=False)
```
3. Loading the model:
```
from transformers import BertForMaskedLM
model = BertForMaskedLM.from_pretrained('./NepaliBERT')
```
The easiest way to check whether our language model is learning anything interesting is via the ```FillMaskPipeline```.
Pipelines are simple wrappers around tokenizers and models, and the 'fill-mask' one lets you input a sequence containing a masked token (here, [MASK]) and returns a list of the most probable filled sequences, with their probabilities.
```
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model=model,
tokenizer=tokenizer
)
```
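A hypothetical usage sketch is shown below; the example sentence is only a placeholder, so substitute any Nepali sentence and place the tokenizer's mask token where the missing word should go.
```
# Hypothetical usage sketch: the sentence is a placeholder -- use any Nepali
# sentence and insert tokenizer.mask_token where the missing word should go.
masked_sentence = f"नेपाल एउटा {tokenizer.mask_token} देश हो ।"
for prediction in fill_mask(masked_sentence):
    print(prediction["token_str"], round(prediction["score"], 4))
```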
For more info visit the [GITHUB🤗](https://github.com/R4j4n/NepaliBERT) | {} | fill-mask | Rajan/NepaliBERT | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
# NepaliBERT(Phase 1)
NEPALIBERT is a state-of-the-art language model for Nepali based on the BERT model. The model is trained using a masked language modeling (MLM).
# Loading the model and tokenizer
1. clone the model repo
2. Loading the Tokenizer
3. Loading the model:
The easiest way to check whether our language model is learning anything interesting is via the .
Pipelines are simple wrappers around tokenizers and models, and the 'fill-mask' one will let you input a sequence containing a masked token (here, [mask]) and return a list of the most probable filled sequences, with their probabilities.
For more info visit the GITHUB | [
"# NepaliBERT(Phase 1) \nNEPALIBERT is a state-of-the-art language model for Nepali based on the BERT model. The model is trained using a masked language modeling (MLM).",
"# Loading the model and tokenizer \n1. clone the model repo \n\n2. Loading the Tokenizer \n\n3. Loading the model:\n\n\nThe easiest way to check whether our language model is learning anything interesting is via the .\n\nPipelines are simple wrappers around tokenizers and models, and the 'fill-mask' one will let you input a sequence containing a masked token (here, [mask]) and return a list of the most probable filled sequences, with their probabilities.\n\n\nFor more info visit the GITHUB"
] | [
"TAGS\n#transformers #pytorch #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n",
"# NepaliBERT(Phase 1) \nNEPALIBERT is a state-of-the-art language model for Nepali based on the BERT model. The model is trained using a masked language modeling (MLM).",
"# Loading the model and tokenizer \n1. clone the model repo \n\n2. Loading the Tokenizer \n\n3. Loading the model:\n\n\nThe easiest way to check whether our language model is learning anything interesting is via the .\n\nPipelines are simple wrappers around tokenizers and models, and the 'fill-mask' one will let you input a sequence containing a masked token (here, [mask]) and return a list of the most probable filled sequences, with their probabilities.\n\n\nFor more info visit the GITHUB"
] | [
36,
48,
118
] | [
"passage: TAGS\n#transformers #pytorch #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n# NepaliBERT(Phase 1) \nNEPALIBERT is a state-of-the-art language model for Nepali based on the BERT model. The model is trained using a masked language modeling (MLM).# Loading the model and tokenizer \n1. clone the model repo \n\n2. Loading the Tokenizer \n\n3. Loading the model:\n\n\nThe easiest way to check whether our language model is learning anything interesting is via the .\n\nPipelines are simple wrappers around tokenizers and models, and the 'fill-mask' one will let you input a sequence containing a masked token (here, [mask]) and return a list of the most probable filled sequences, with their probabilities.\n\n\nFor more info visit the GITHUB"
] | [
-0.03967975452542305,
-0.02448505163192749,
-0.0026915003545582294,
0.11307047307491302,
0.08404086530208588,
-0.08446796238422394,
-0.041580889374017715,
0.08282642066478729,
0.07172533869743347,
0.056314773857593536,
0.21299290657043457,
0.10229816287755966,
0.008481394499540329,
0.2187144011259079,
0.046890512108802795,
-0.34163758158683777,
0.07813414931297302,
-0.0015810384647920728,
0.11857917159795761,
0.1124204769730568,
0.06465132534503937,
-0.08708413690328598,
0.08683200180530548,
0.05238307639956474,
-0.01486945804208517,
0.04153371602296829,
-0.048078250139951706,
-0.06837843358516693,
0.0525895357131958,
-0.05476333573460579,
0.07539565861225128,
0.04694969579577446,
-0.009568281471729279,
-0.006282031070441008,
0.03712854161858559,
0.00537295313552022,
0.032524943351745605,
-0.00995166227221489,
-0.04641294106841087,
-0.03663112223148346,
0.045423828065395355,
0.02036571130156517,
-0.009905676357448101,
0.008778516203165054,
-0.09018000215291977,
-0.01648557558655739,
0.04984834045171738,
0.027651170268654823,
0.0012568716192618012,
0.011653917841613293,
-0.042395997792482376,
0.2572624683380127,
-0.057188570499420166,
0.08666173368692398,
0.13109131157398224,
-0.20996958017349243,
-0.003521444508805871,
0.15056760609149933,
0.07457233220338821,
0.09001590311527252,
0.040026091039180756,
0.056263651698827744,
0.09473541378974915,
0.0402853786945343,
0.10532408207654953,
-0.10885807126760483,
-0.09217541664838791,
-0.03895273059606552,
-0.16206111013889313,
0.007725446484982967,
0.08968503028154373,
-0.06680743396282196,
-0.08334766328334808,
0.0412399098277092,
-0.06561286002397537,
0.11393006145954132,
0.011229781433939934,
-0.02466648817062378,
-0.012360426597297192,
0.03397351875901222,
0.044034652411937714,
-0.05505075305700302,
-0.06493532657623291,
-0.014279969967901707,
-0.06171898543834686,
0.06750357151031494,
0.02983107790350914,
0.066006138920784,
-0.14562630653381348,
-0.015454128384590149,
-0.17546124756336212,
-0.08319145441055298,
-0.05627613142132759,
-0.08407863974571228,
-0.030852435156702995,
-0.024784211069345474,
-0.0018585604848340154,
-0.06044059991836548,
0.04191265255212784,
0.02845754474401474,
-0.04447281360626221,
0.08817580342292786,
-0.04698037728667259,
0.05519434064626694,
0.10320030897855759,
0.06611909717321396,
-0.0655839592218399,
0.04335794225335121,
0.027461174875497818,
-0.058133259415626526,
0.04765407368540764,
-0.019522977992892265,
-0.08346046507358551,
-0.031023574993014336,
0.027012547478079796,
0.04720897972583771,
-0.08865994960069656,
0.12826448678970337,
0.03707083687186241,
-0.017847388982772827,
0.05709784850478172,
-0.08157174289226532,
-0.04794842377305031,
-0.05503353849053383,
-0.04140220955014229,
0.025324923917651176,
-0.00513832550495863,
-0.008083995431661606,
-0.06637431681156158,
-0.11446604877710342,
-0.07407066971063614,
0.008021143265068531,
-0.06622923910617828,
-0.14007745683193207,
-0.052778951823711395,
-0.27374663949012756,
-0.01700160838663578,
-0.16503697633743286,
-0.2490924745798111,
0.038962140679359436,
0.07737927883863449,
-0.07405099272727966,
0.007288448046892881,
-0.024363653734326363,
0.04081294313073158,
-0.008006023243069649,
-0.03452814370393753,
0.03317686542868614,
-0.02667936123907566,
-0.03328763693571091,
0.012963657267391682,
0.20761695504188538,
0.010332220233976841,
-0.004167730454355478,
-0.13220421969890594,
0.10213694721460342,
-0.293246865272522,
0.08068999648094177,
-0.06871794909238815,
0.06439681351184845,
-0.13730351626873016,
-0.04979270324110985,
0.1422833800315857,
0.024847451597452164,
0.030834758654236794,
0.17004670202732086,
-0.16413599252700806,
-0.03648212552070618,
0.19230085611343384,
-0.1698538213968277,
-0.13488535583019257,
0.1307895928621292,
-0.042054031044244766,
0.15717877447605133,
0.06898045539855957,
0.09451176226139069,
0.05943995714187622,
-0.10570415109395981,
0.13007524609565735,
0.03429858759045601,
-0.07972148060798645,
-0.015049495734274387,
0.06741082668304443,
0.05556771904230118,
-0.17318999767303467,
0.04493051767349243,
-0.04707876965403557,
0.007605669554322958,
-0.02766432613134384,
0.001460349652916193,
-0.02199077233672142,
-0.08107980340719223,
0.033627115190029144,
0.024027159437537193,
0.10564342886209488,
-0.06988954544067383,
0.028339607641100883,
0.06651508063077927,
0.048870109021663666,
-0.04387525096535683,
0.026780584827065468,
-0.10596942156553268,
0.147221177816391,
-0.12425397336483002,
-0.015024603344500065,
-0.14870041608810425,
0.0937962606549263,
0.0899859294295311,
-0.04735641926527023,
0.012878982350230217,
0.020455023273825645,
0.0633743405342102,
0.057131677865982056,
0.017586203292012215,
0.026422863826155663,
0.17904585599899292,
0.013639234937727451,
-0.029755719006061554,
-0.1017693355679512,
0.02368251048028469,
-0.11485114693641663,
-0.07654990255832672,
-0.04714156687259674,
-0.04457911103963852,
-0.14756177365779877,
0.08403488248586655,
0.012165499851107597,
0.05409358814358711,
0.08869817107915878,
0.024391746148467064,
-0.017924129962921143,
-0.027586447075009346,
0.07799045741558075,
0.01645304076373577,
-0.057277608662843704,
0.21580912172794342,
0.02841436117887497,
-0.1293361932039261,
0.0707729235291481,
-0.0887238010764122,
-0.04996396601200104,
-0.06993254274129868,
-0.003282328834757209,
-0.005753176286816597,
0.0024732020683586597,
0.094906285405159,
0.2036689668893814,
-0.01025276631116867,
0.14344465732574463,
-0.08742726594209671,
0.049794942140579224,
0.05794260650873184,
-0.07322342693805695,
-0.07509420812129974,
0.1002441793680191,
0.11256182938814163,
-0.22895823419094086,
0.10132678598165512,
-0.03367630019783974,
-0.05543489009141922,
0.1770469695329666,
0.011655043810606003,
-0.0691906288266182,
-0.015233229845762253,
0.01753491908311844,
-0.03085962124168873,
-0.023866163566708565,
-0.2219059020280838,
-0.1593698263168335,
0.04641585052013397,
-0.016582611948251724,
-0.0036968791391700506,
-0.019201086834073067,
0.017292294651269913,
-0.003440587781369686,
0.015240349806845188,
-0.07631349563598633,
0.0588281974196434,
-0.10036373138427734,
0.002565413946285844,
0.026016347110271454,
-0.054643236100673676,
0.03360960632562637,
0.0769878625869751,
-0.0877782329916954,
0.20800305902957916,
-0.09049510955810547,
-0.3148511052131653,
-0.07576663047075272,
-0.19863146543502808,
-0.01906208135187626,
0.06283991038799286,
0.002178770024329424,
-0.17163392901420593,
-0.08395611494779587,
-0.07952189445495605,
0.046579983085393906,
-0.055835723876953125,
0.0652557983994484,
-0.05402868613600731,
-0.03129148110747337,
-0.04636583477258682,
-0.059056494385004044,
-0.029134761542081833,
-0.0233224555850029,
-0.0023483792319893837,
0.16419915854930878,
-0.18407636880874634,
0.02032465673983097,
0.08628398925065994,
-0.0019091255962848663,
0.06674899160861969,
-0.011370575986802578,
0.17969131469726562,
-0.011629843153059483,
0.11798883974552155,
0.12975072860717773,
-0.015493138693273067,
0.053250618278980255,
0.17566914856433868,
-0.017131002619862556,
-0.019284727051854134,
0.1189107820391655,
-0.015604148618876934,
-0.10441035777330399,
-0.13715489208698273,
-0.04572904482483864,
-0.027715137228369713,
0.027120783925056458,
0.08836659789085388,
0.033086325973272324,
-0.028552865609526634,
0.1625661998987198,
0.0819854810833931,
0.02771313488483429,
-0.073428675532341,
0.06191471591591835,
-0.09449059516191483,
-0.052994512021541595,
0.11444679647684097,
0.0025770594365894794,
-0.034387193620204926,
0.06084189936518669,
0.05467889830470085,
0.188627690076828,
0.007958213798701763,
0.035769276320934296,
0.12056773900985718,
0.039063163101673126,
0.1553996056318283,
0.08609235286712646,
-0.13309910893440247,
-0.041567422449588776,
-0.0320722833275795,
-0.04614248871803284,
0.033349037170410156,
0.058558184653520584,
0.00666617089882493,
0.007992231287062168,
0.0024846033193171024,
0.1183185949921608,
0.02524980530142784,
0.20598207414150238,
0.1423741579055786,
-0.2144734114408493,
-0.06899996101856232,
0.05873025581240654,
-0.016125528141856194,
-0.02507278323173523,
0.0578489676117897,
0.01912192441523075,
-0.06652030348777771,
0.059552740305662155,
0.010213415138423443,
0.07873781770467758,
-0.0029137791134417057,
0.006582243368029594,
-0.13650697469711304,
0.04801083356142044,
-0.03972182050347328,
0.06135542318224907,
-0.2639557123184204,
0.2649858593940735,
0.006727127358317375,
-0.02784646302461624,
-0.09849093854427338,
-0.012808151543140411,
0.009571236558258533,
0.1186295673251152,
0.1608874499797821,
0.023347657173871994,
0.0066151549108326435,
-0.10120171308517456,
0.0005635943380184472,
0.046617291867733,
0.017940528690814972,
0.0698385238647461,
0.04104591906070709,
0.0363154262304306,
-0.011980152688920498,
-0.0475785993039608,
0.07713902741670609,
-0.1573946624994278,
0.022173356264829636,
-0.009962255135178566,
-0.017729956656694412,
-0.12962763011455536,
-0.030170930549502373,
-0.09498768299818039,
-0.04970899969339371,
0.16790808737277985,
0.10024310648441315,
-0.14044564962387085,
-0.06262195855379105,
-0.04112626984715462,
0.11237125098705292,
-0.10805176198482513,
0.06817654520273209,
-0.10511620342731476,
0.11613131314516068,
-0.051459651440382004,
-0.0767500177025795,
0.08491455018520355,
-0.11702686548233032,
0.018941517919301987,
-0.03255578503012657,
0.027458431199193,
0.13864600658416748,
-0.007116088178008795,
0.049293842166662216,
0.031350042670965195,
-0.021916141733527184,
-0.08826310187578201,
0.015744425356388092,
-0.011444193311035633,
0.09334942698478699,
0.04062405973672867,
-0.04965338483452797,
-0.0312348660081625,
-0.014699247665703297,
-0.03805898502469063,
0.15230992436408997,
0.06341822445392609,
-0.03613518178462982,
0.18205998837947845,
0.3690935969352722,
-0.021695146337151527,
-0.21979400515556335,
-0.021335246041417122,
0.0412721112370491,
0.027324317023158073,
-0.1226217970252037,
-0.1694483906030655,
0.11624410003423691,
-0.05837151035666466,
-0.00005320499258232303,
-0.0711623802781105,
-0.08079689741134644,
-0.10592830181121826,
0.21294577419757843,
0.1071702316403389,
0.3212128281593323,
-0.073526531457901,
-0.04877696931362152,
-0.019534196704626083,
-0.06226665899157524,
0.07782161980867386,
-0.1675974577665329,
0.03463068604469299,
-0.04660927504301071,
0.1027592197060585,
0.002606214489787817,
-0.07561823725700378,
0.09302035719156265,
-0.02792898193001747,
-0.06970807909965515,
-0.09979142248630524,
-0.03840411826968193,
0.06620560586452484,
-0.0406838059425354,
0.11912571638822556,
0.017229527235031128,
0.0629410594701767,
-0.06265061348676682,
-0.08544421195983887,
-0.07267148047685623,
0.02443963848054409,
0.018505733460187912,
-0.14214178919792175,
-0.019608376547694206,
0.13731370866298676,
0.02137300744652748,
-0.005580119322985411,
0.05835732817649841,
-0.05026469752192497,
0.034711770713329315,
0.18967387080192566,
0.08677813410758972,
-0.1827232837677002,
0.07965747267007828,
0.053755566477775574,
-0.04595618695020676,
0.0833788588643074,
-0.11709143966436386,
0.014627521857619286,
0.032701924443244934,
0.028127629309892654,
0.13641244173049927,
0.024602385237812996,
-0.06398072093725204,
0.08056493103504181,
0.025372304022312164,
-0.04197275638580322,
-0.0033327650744467974,
0.029040317982435226,
-0.07146427780389786,
0.031061992049217224,
0.0045691016130149364,
0.08035674691200256,
-0.09144825488328934,
-0.021009819582104683,
-0.022967500612139702,
-0.011532525531947613,
-0.02558938041329384,
0.06636656820774078,
0.07769698649644852,
0.04884323850274086,
-0.0737203061580658,
0.07627228647470474,
0.042045969516038895,
-0.028439125046133995,
0.02196214534342289,
0.07356151193380356,
-0.19447104632854462,
-0.09561175853013992,
0.022525615990161896,
0.09571606665849686,
-0.06266289204359055,
-0.06434985995292664,
-0.006303020287305117,
-0.02272876538336277,
0.06588909775018692,
0.14856161177158356,
0.0609995536506176,
-0.10378818958997726,
-0.07365194708108902,
-0.009225345216691494,
0.0010984288528561592,
0.03574907034635544,
0.0926547721028328,
0.02392074652016163,
-0.08096031844615936,
0.06256640702486038,
0.046028025448322296,
0.14645612239837646,
-0.10524626821279526,
-0.08495418727397919,
-0.1663556694984436,
0.07115889340639114,
-0.2201714813709259,
0.12082947790622711,
-0.14942672848701477,
-0.04601413384079933,
0.0013267653994262218,
-0.08815692365169525,
-0.05516156926751137,
0.0243546012789011,
-0.06715507805347443,
0.08753661066293716,
0.038650352507829666,
0.058911338448524475,
-0.02813306450843811,
-0.0717940628528595,
0.10866566747426987,
-0.0146041763946414,
0.03748749941587448,
-0.002692095236852765,
-0.10616759210824966,
0.07244569063186646,
-0.07425566762685776,
-0.01223954651504755,
0.0528763048350811,
0.01960756629705429,
0.09866707026958466,
-0.08695927262306213,
0.011324295774102211,
0.0008097986574284732,
0.014779086224734783,
-0.023629235103726387,
0.14611788094043732,
0.008114004507660866,
0.042280178517103195,
0.02807089127600193,
-0.07548106461763382,
-0.06292714178562164,
0.059076979756355286,
0.039700526744127274,
0.06980656832456589,
0.04205252230167389,
-0.09825290739536285,
0.004043994937092066,
0.0077333166263997555,
-0.0037224935367703438,
-0.01824699528515339,
-0.050085507333278656,
0.02536870166659355,
-0.02320491522550583,
-0.012409895658493042,
-0.11451492458581924,
0.19763940572738647,
0.014550126157701015,
0.06824896484613419,
-0.006640654522925615,
-0.029079321771860123,
0.05190326273441315,
0.00045506178867071867,
0.12583677470684052,
0.06864640861749649,
0.0152995390817523,
-0.06522965431213379,
0.09024516493082047,
0.05097545310854912,
0.12028919160366058,
0.047241244465112686,
0.017946355044841766,
0.039912570267915726,
0.07486464083194733,
0.004236950539052486,
0.078310027718544,
-0.07649630308151245,
-0.13424447178840637,
-0.07371736317873001,
0.07394152879714966,
0.05684865266084671,
0.10894496738910675,
0.22409489750862122,
-0.05556828901171684,
0.03156398981809616,
0.0451282300055027,
-0.09151089936494827,
-0.1058063954114914,
-0.2654191851615906,
-0.08959496766328812,
-0.032092925161123276,
0.034759726375341415,
-0.06718776375055313,
-0.04221027344465256,
-0.03686521202325821,
0.12925252318382263,
0.012464222498238087,
0.06174963340163231,
0.08240392059087753,
-0.0625629872083664,
0.02000425197184086,
-0.023029783740639687,
0.016602981835603714,
0.07047843188047409,
0.0058798459358513355,
-0.04694637656211853,
-0.013651133514940739,
-0.06966212391853333,
-0.01403428427875042,
0.03540591895580292,
0.029123803600668907,
-0.039561960846185684,
-0.022805441170930862,
-0.07036539912223816,
-0.01635861024260521,
-0.0483396090567112,
0.08646103739738464,
0.03436121344566345,
-0.05234025791287422,
-0.01014427188783884,
0.032891009002923965,
0.019213082268834114,
-0.058010976761579514,
-0.14807146787643433,
0.3984776437282562,
-0.06489350646734238,
0.039277344942092896,
0.04075208678841591,
-0.020826835185289383,
-0.19115786254405975,
0.30082792043685913,
0.18759840726852417,
0.027375193312764168,
0.0015931951347738504,
-0.00039297761395573616,
0.008264685980975628,
0.008943557739257812,
0.17378677427768707,
0.03133842349052429,
0.12862668931484222,
-0.05297448858618736,
-0.007686087861657143,
-0.1088455468416214,
-0.06059392914175987,
-0.12785036861896515,
-0.007517295423895121,
0.08653829991817474,
0.0030674408189952374,
-0.04945392534136772,
0.06755032390356064,
-0.21790555119514465,
-0.04876801744103432,
-0.09391694515943527,
-0.017571618780493736,
-0.09763239324092865,
-0.07828448712825775,
-0.13029852509498596,
0.046729013323783875,
0.04029003903269768,
-0.0037518811877816916,
0.07988179475069046,
-0.09250439703464508,
0.06372227519750595,
-0.08961094170808792,
-0.09722449630498886,
0.18231581151485443,
0.034901347011327744,
0.11501850187778473,
-0.020699692890048027,
0.03704164922237396,
0.09793443232774734,
0.03554658964276314,
-0.06261676549911499,
0.0956987515091896,
0.03142955154180527,
-0.005887443199753761,
-0.018788015469908714,
0.08754745125770569,
-0.020020820200443268,
-0.10224927216768265,
0.012498934753239155,
-0.00555089395493269,
0.016341906040906906,
-0.03195813670754433,
0.05905025079846382,
-0.14980322122573853,
0.12744268774986267,
-0.09917401522397995,
0.0721500813961029,
0.07449636608362198,
-0.004170621745288372,
-0.0242705550044775,
-0.1292683184146881,
0.08515264093875885,
-0.057434916496276855,
-0.1735266149044037,
-0.1270393431186676,
-0.12619617581367493,
-0.0022018204908818007,
-0.017304586246609688,
0.013875378295779228,
-0.23906786739826202,
-0.005531894508749247,
-0.049042873084545135,
0.02761259116232395,
0.004683065228164196,
0.03466913476586342,
0.015944087877869606,
0.11419875919818878,
-0.023726098239421844,
-0.04917312040925026,
0.027399368584156036,
0.011147296987473965,
-0.14490385353565216,
-0.1101047545671463
] |
null | null | null |
https://github.com/R4j4n/Nepali-Word2Vec-from-scratch

How to clone :
```
git lfs install
git clone https://huggingface.co/Rajan/Nepali_Word2Vec
```
 | {"license": "mit"} | null | Rajan/Nepali_Word2Vec | [
"license:mit",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#license-mit #region-us
|
URL
How to clone :
| [] | [
"TAGS\n#license-mit #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-mit #region-us \n"
] | [
0.026221778243780136,
-0.033018264919519424,
-0.008281232789158821,
-0.05295303836464882,
0.052470896393060684,
0.06768012046813965,
0.1598525494337082,
0.04655371606349945,
0.23683255910873413,
-0.05407243221998215,
0.11752297729253769,
0.08923697471618652,
0.004284696187824011,
-0.0009730930323712528,
0.014216204173862934,
-0.17134642601013184,
0.04864625632762909,
-0.02878100797533989,
0.08764812350273132,
0.032233644276857376,
-0.006205103360116482,
-0.03845774009823799,
-0.0022142508532851934,
-0.03178790956735611,
-0.057939812541007996,
0.03869890421628952,
0.045729056000709534,
-0.02754949778318405,
0.14189864695072174,
-0.021783310920000076,
0.13335508108139038,
0.046146418899297714,
-0.011738095432519913,
-0.2486042082309723,
0.008575023151934147,
-0.07252951711416245,
-0.11333522200584412,
0.016201216727495193,
0.035761721432209015,
-0.010069100186228752,
0.032174937427043915,
0.11049123108386993,
-0.011680051684379578,
0.06288356333971024,
-0.2015703022480011,
-0.20486389100551605,
-0.07508610188961029,
-0.07555478066205978,
0.0589042492210865,
0.030872387811541557,
0.05628744140267372,
0.1426718831062317,
-0.18022038042545319,
-0.0018841808196157217,
0.04129622131586075,
-0.3510737717151642,
0.09011197835206985,
0.19666501879692078,
0.06407395005226135,
0.07872317731380463,
-0.04774639382958412,
0.06726468354463577,
0.07745297998189926,
-0.02402484230697155,
-0.10679105669260025,
-0.06142130121588707,
0.040939174592494965,
0.15604156255722046,
-0.03852643445134163,
-0.10356393456459045,
0.2591084837913513,
-0.023262828588485718,
-0.04234466329216957,
0.08201269060373306,
-0.02980397455394268,
-0.040379155427217484,
0.04404358193278313,
0.044016025960445404,
0.036236923187971115,
0.182089164853096,
0.1260262131690979,
-0.03375067934393883,
-0.16269677877426147,
-0.030629513785243034,
-0.2528207004070282,
0.07418664544820786,
-0.003647059667855501,
0.10666298121213913,
-0.20037521421909332,
0.03286786004900932,
-0.15483668446540833,
-0.009493621066212654,
-0.02952384203672409,
-0.059835705906152725,
0.05229754373431206,
-0.0237403754144907,
-0.04600388556718826,
0.07238677144050598,
0.08390641957521439,
0.2046167105436325,
0.023024363443255424,
0.016697337850928307,
-0.10405295342206955,
0.15052515268325806,
0.019140364602208138,
0.024860305711627007,
0.179348424077034,
0.07677878439426422,
-0.04891882464289665,
-0.2251969277858734,
0.027894439175724983,
-0.03671982139348984,
-0.1441805064678192,
0.015881337225437164,
-0.1542915552854538,
0.1736440360546112,
-0.04078168794512749,
-0.06919530034065247,
-0.08578147739171982,
0.09790384024381638,
0.07768166810274124,
-0.021921472623944283,
-0.023105677217245102,
-0.01381723117083311,
0.03522264584898949,
-0.048196230083703995,
-0.11687057465314865,
0.018241960555315018,
0.11869648098945618,
0.12573401629924774,
-0.1483907401561737,
-0.008189842104911804,
-0.017200417816638947,
0.019065292552113533,
0.09696817398071289,
-0.112403005361557,
0.028845038264989853,
-0.09672309458255768,
-0.13033071160316467,
0.036653537303209305,
0.017736904323101044,
-0.019008556380867958,
0.1340927630662918,
0.061849117279052734,
0.056560322642326355,
-0.011025321669876575,
-0.07250872999429703,
-0.14035539329051971,
-0.08679798245429993,
0.1058693379163742,
-0.046787332743406296,
0.010320915840566158,
-0.24556252360343933,
-0.014234079979360104,
-0.14995723962783813,
0.059662189334630966,
-0.0037668521981686354,
-0.08819212019443512,
-0.07740068435668945,
0.21408265829086304,
0.0018596589798107743,
0.04301392287015915,
-0.1078512966632843,
0.054903753101825714,
-0.06764797121286392,
0.10065380483865738,
-0.12895582616329193,
-0.06441528350114822,
0.1613781899213791,
-0.13135331869125366,
-0.14002031087875366,
0.0033312994055449963,
-0.009472889825701714,
0.12053907662630081,
0.0802001804113388,
0.44566696882247925,
-0.058881040662527084,
-0.16201181709766388,
0.1270403116941452,
0.17969723045825958,
-0.13685379922389984,
-0.25928929448127747,
0.12393020838499069,
-0.1636963188648224,
-0.16647985577583313,
0.0040023741312325,
-0.006962866988033056,
0.08049977570772171,
-0.03446655720472336,
-0.056274134665727615,
0.042339932173490524,
0.024350708350539207,
0.029094615951180458,
0.01740112341940403,
0.07037191838026047,
-0.1023021712899208,
0.08444856107234955,
0.058610700070858,
-0.014111426658928394,
0.15077349543571472,
0.011494536884129047,
-0.05393160134553909,
0.014761670492589474,
0.044013332575559616,
-0.015627963468432426,
-0.05899091437458992,
-0.09661509096622467,
0.019826244562864304,
-0.031149597838521004,
0.08229395002126694,
0.1699674129486084,
0.023824702948331833,
-0.02797185815870762,
0.028922779485583305,
0.028606392443180084,
0.1009954959154129,
0.06960704177618027,
0.03099375218153,
-0.04839283227920532,
0.04952205345034599,
-0.0417071171104908,
-0.11430390179157257,
-0.004862460307776928,
-0.011735930107533932,
0.11975742131471634,
-0.08906009048223495,
-0.01223952230066061,
0.05951591953635216,
-0.04513183981180191,
0.0019881438929587603,
0.0428374819457531,
0.0035966038703918457,
0.1388600617647171,
0.004440935328602791,
-0.04352007433772087,
0.17440910637378693,
-0.05288633331656456,
0.15533447265625,
0.1715822070837021,
-0.07049662619829178,
0.015605369582772255,
-0.1273636519908905,
0.003230511210858822,
-0.014480113983154297,
0.05292887985706329,
-0.05400136485695839,
-0.05201306566596031,
-0.01274962443858385,
0.014292534440755844,
-0.03134604170918465,
0.01711403578519821,
-0.06057267636060715,
-0.08167021721601486,
-0.10849859565496445,
0.018649224191904068,
0.20683221518993378,
-0.22544461488723755,
0.1609548032283783,
0.40251004695892334,
0.15190774202346802,
0.21155193448066711,
-0.12478897720575333,
-0.002471078187227249,
-0.06630261242389679,
0.026115071028470993,
-0.024814706295728683,
0.13782677054405212,
-0.13174867630004883,
-0.01413064356893301,
0.03880728408694267,
0.0454997681081295,
0.0661163181066513,
-0.17195898294448853,
-0.15260353684425354,
-0.0034879595041275024,
-0.020591814070940018,
-0.1749730259180069,
0.04874620959162712,
-0.07595308125019073,
0.02181261032819748,
0.018216799944639206,
-0.10832522064447403,
0.16837291419506073,
-0.033566512167453766,
-0.06695768237113953,
0.052613962441682816,
-0.20581911504268646,
-0.07900715619325638,
-0.17772749066352844,
-0.18375012278556824,
0.06050071492791176,
0.05760138854384422,
0.07903145253658295,
-0.05951719731092453,
-0.01922747679054737,
0.061719246208667755,
-0.009363299235701561,
-0.13802112638950348,
-0.04235544428229332,
-0.06993678212165833,
0.08744155615568161,
-0.09474305808544159,
-0.07518411427736282,
-0.07833878695964813,
-0.046996138989925385,
-0.020961694419384003,
0.08125963062047958,
-0.1039251759648323,
0.08903530240058899,
0.1493726521730423,
0.03651920333504677,
0.05440247058868408,
-0.08271230012178421,
0.12693379819393158,
-0.037743739783763885,
-0.09459595382213593,
0.07307634502649307,
0.004350725095719099,
0.04920351505279541,
0.24039287865161896,
0.08962162584066391,
-0.10578162968158722,
-0.01780811697244644,
-0.0968487411737442,
-0.16405464708805084,
-0.2553846538066864,
-0.06823288649320602,
-0.08744750916957855,
0.14417944848537445,
0.014636521227657795,
0.10712126642465591,
0.14313316345214844,
0.01343101728707552,
0.10255914181470871,
-0.08983208239078522,
-0.018939344212412834,
0.031209396198391914,
0.2135104089975357,
-0.05208220332860947,
0.00838248711079359,
-0.13684824109077454,
-0.0256142970174551,
0.14601100981235504,
0.13798639178276062,
0.14503207802772522,
0.31421369314193726,
0.15292863547801971,
0.13410434126853943,
0.13474710285663605,
0.12333164364099503,
0.07403261214494705,
0.03444362059235573,
-0.015304201282560825,
-0.06035377085208893,
-0.003846159903332591,
0.02816268615424633,
0.05421729013323784,
0.06724072247743607,
-0.22906480729579926,
0.041139665991067886,
-0.2661744952201843,
0.03544611483812332,
-0.0854712724685669,
0.1161833181977272,
-0.028890252113342285,
0.11051984131336212,
0.11386284977197647,
0.05553818494081497,
-0.023278791457414627,
0.16036942601203918,
0.032686375081539154,
-0.07703183591365814,
0.020292721688747406,
0.024695809930562973,
0.06633034348487854,
0.08606193959712982,
0.09550496190786362,
-0.020778406411409378,
-0.1831783503293991,
0.025963841006159782,
0.12212833017110825,
-0.20747940242290497,
0.289523184299469,
0.013651901856064796,
-0.0743619054555893,
-0.01690039224922657,
-0.06958060711622238,
0.008433517068624496,
0.12829731404781342,
0.10406835377216339,
0.05508929491043091,
-0.2613787055015564,
-0.13299626111984253,
0.046764206141233444,
-0.00873907096683979,
0.11356569826602936,
-0.0052223424427211285,
-0.14201195538043976,
-0.06640999764204025,
0.05814211815595627,
-0.006591420155018568,
0.13023322820663452,
-0.018290361389517784,
-0.08173255622386932,
-0.010230090469121933,
0.055564697831869125,
-0.001312803477048874,
-0.04580084979534149,
0.07523149996995926,
0.009008137509226799,
0.02259289287030697,
-0.08178020268678665,
0.03887253627181053,
-0.08071476966142654,
-0.25375792384147644,
0.019298138096928596,
-0.04987313598394394,
0.004092312417924404,
-0.04684043675661087,
-0.15448936820030212,
-0.1129264086484909,
-0.15445278584957123,
0.13100723922252655,
-0.03675999864935875,
0.091565802693367,
-0.0817658007144928,
0.13736046850681305,
-0.08521489799022675,
0.05375019088387489,
0.00614814180880785,
0.03918716683983803,
-0.017955513671040535,
-0.1031481996178627,
0.09334362298250198,
-0.1874227225780487,
0.023863423615694046,
0.010427716188132763,
-0.056847453117370605,
-0.01354232057929039,
0.03918023407459259,
-0.08763083070516586,
0.21879427134990692,
0.3331502079963684,
-0.011948764324188232,
0.22546616196632385,
0.35863226652145386,
-0.13763751089572906,
-0.23258967697620392,
-0.1205512136220932,
-0.3263251483440399,
-0.09005610644817352,
0.17321562767028809,
-0.18057219684123993,
0.04850830137729645,
0.16150830686092377,
-0.10868281871080399,
0.22499866783618927,
-0.22723928093910217,
-0.04793389141559601,
0.1823979914188385,
-0.038322996348142624,
0.4527989625930786,
-0.1144307404756546,
-0.1784561723470688,
-0.03637253865599632,
-0.16285361349582672,
0.12426037341356277,
-0.026553882285952568,
0.06700495630502701,
0.02416347898542881,
-0.011372359469532967,
-0.009014161303639412,
-0.04529716446995735,
0.2216065675020218,
0.0522729866206646,
0.10468899458646774,
-0.09159468114376068,
-0.17199653387069702,
0.1907423883676529,
-0.0004908236442133784,
-0.003372655250132084,
-0.05411549657583237,
-0.04850282520055771,
-0.06871756166219711,
0.033092137426137924,
-0.0334564633667469,
0.06195882335305214,
0.03364093229174614,
-0.11903523653745651,
-0.10248823463916779,
0.034111104905605316,
-0.13155671954154968,
-0.054850947111845016,
0.26421889662742615,
-0.02080743946135044,
0.09609334170818329,
0.04959092289209366,
-0.05474294349551201,
-0.13538943231105804,
0.005736751481890678,
-0.07534020394086838,
-0.05711410939693451,
0.06573604047298431,
-0.11453206837177277,
-0.024341827258467674,
0.1293732225894928,
-0.029497180134058,
0.09674722701311111,
0.08061115443706512,
-0.07585363835096359,
0.02032829262316227,
0.15617427229881287,
-0.07247176766395569,
-0.10849180817604065,
0.04999847710132599,
0.04640531167387962,
0.17256882786750793,
0.004101871978491545,
0.02018604800105095,
0.08726977556943893,
0.045959215611219406,
-0.007486662827432156,
0.007311292923986912,
-0.11321697384119034,
-0.04241771996021271,
0.0387241393327713,
-0.005273692775517702,
-0.10946331918239594,
0.16008898615837097,
0.056837860494852066,
0.004653505515307188,
-0.06027700752019882,
0.09720424562692642,
-0.06709636747837067,
-0.07046061009168625,
-0.1753035932779312,
0.018511172384023666,
-0.12734080851078033,
-0.09874535351991653,
0.06846235692501068,
-0.09371624886989594,
-0.04084605351090431,
0.08152704685926437,
0.046927981078624725,
0.14401860535144806,
-0.006597559433430433,
-0.023080874234437943,
0.149825319647789,
-0.0884878933429718,
-0.2241756170988083,
0.01969664730131626,
-0.04083063453435898,
-0.07065816223621368,
-0.0007070365245454013,
0.06069544702768326,
-0.0663156732916832,
-0.11958606541156769,
-0.20477768778800964,
0.10412076860666275,
-0.12043121457099915,
-0.03954985365271568,
-0.1041841059923172,
-0.053260523825883865,
0.07891252636909485,
-0.02613759972155094,
-0.04122013971209526,
-0.047595683485269547,
-0.16630595922470093,
0.054254453629255295,
0.07140932232141495,
0.11125344783067703,
-0.0759999230504036,
-0.018354382365942,
0.1398727148771286,
0.048581548035144806,
0.08479110151529312,
0.07578440010547638,
0.026255371049046516,
0.16728560626506805,
-0.1708206981420517,
-0.0542997270822525,
0.1068294569849968,
-0.026716172695159912,
0.01994573324918747,
0.10631280392408371,
-0.04839588701725006,
0.07042654603719711,
-0.05095988139510155,
0.05859163776040077,
-0.15704534947872162,
-0.13073866069316864,
-0.04184387996792793,
0.023728877305984497,
-0.2260182797908783,
0.015071595087647438,
-0.1769561767578125,
0.19692228734493256,
-0.024228032678365707,
0.11490963399410248,
0.08052190393209457,
0.02052290178835392,
0.03539382666349411,
-0.006019921973347664,
0.00946811307221651,
-0.10524865239858627,
-0.05784677714109421,
-0.07560300827026367,
-0.1168874129652977,
-0.009665017947554588,
0.36614301800727844,
0.02430291846394539,
-0.19682736694812775,
0.051222387701272964,
0.18285293877124786,
0.023639049381017685,
-0.0073763905093073845,
0.26180747151374817,
0.08150359988212585,
-0.023175053298473358,
-0.1782374382019043,
0.0396091528236866,
-0.08699734508991241,
-0.15269799530506134,
0.11385007947683334,
0.09347525984048843,
0.05813581123948097,
0.022930078208446503,
0.10404518246650696,
-0.035940010100603104,
-0.05509711429476738,
-0.13301853835582733,
0.13368983566761017,
-0.001790675800293684,
0.0193882267922163,
0.0897885113954544,
0.19249756634235382,
-0.045275162905454636,
0.05437124893069267,
-0.07336640357971191,
-0.001598604372702539,
-0.15740543603897095,
-0.13358698785305023,
0.06194563955068588,
-0.08269550651311874,
0.06342913210391998,
0.050261519849300385,
0.04341990500688553,
0.31786394119262695,
0.039095040410757065,
-0.046439893543720245,
0.003166865324601531,
-0.14845187962055206,
-0.08075450360774994,
-0.06024569645524025,
-0.03110554814338684,
0.028620192781090736,
-0.13928957283496857,
-0.09898591786623001,
-0.06917677819728851,
-0.130235955119133,
-0.06539803743362427,
0.025270747020840645,
0.014251931570470333,
-0.053083837032318115,
-0.17625881731510162,
-0.04808593541383743,
-0.06644169986248016,
0.10105955600738525,
-0.08462738990783691,
0.1516820639371872,
0.0022449472453445196,
0.030281953513622284,
0.07627002149820328,
0.09585131704807281,
0.018900424242019653,
-0.06975197046995163,
0.05599058046936989,
0.12436293810606003,
0.01323844213038683,
0.1259988248348236,
-0.06034265458583832,
-0.019420607015490532,
-0.014145253226161003,
0.14038437604904175,
0.304447740316391,
-0.01856905221939087,
-0.013814439997076988,
-0.022110093384981155,
0.021388787776231766,
0.10893569141626358,
0.19800719618797302,
-0.03437356278300285,
0.2551359534263611,
-0.058974795043468475,
0.0756678432226181,
-0.013180435635149479,
-0.005362013820558786,
-0.053146667778491974,
0.06074550002813339,
0.06268858164548874,
-0.06877048313617706,
-0.10191375762224197,
0.15178529918193817,
-0.14985080063343048,
0.13306055963039398,
0.14678068459033966,
-0.06057753041386604,
0.03797250986099243,
0.0007459368789568543,
0.19896264374256134,
-0.03570213168859482,
0.0984780564904213,
-0.10653308779001236,
-0.10261140763759613,
-0.14764924347400665,
0.037690844386816025,
-0.36797797679901123,
-0.1756322830915451,
0.11731542646884918,
0.14115898311138153,
0.1759258657693863,
-0.012341637164354324,
0.056479312479496,
0.0033020609989762306,
0.08296097069978714,
-0.04232487455010414,
0.1519634872674942,
0.0612073615193367,
-0.017103128135204315,
-0.15296664834022522,
-0.20328094065189362,
-0.0012039330322295427,
-0.058561209589242935,
0.055583830922842026,
-0.02269243635237217,
0.025347469374537468,
0.07746459543704987,
-0.06768939644098282,
-0.029180381447076797,
-0.02352982573211193,
-0.13262848556041718,
0.052229251712560654,
-0.04354005306959152,
0.0320255309343338,
-0.03958037868142128,
-0.022394726052880287,
-0.039987675845623016,
0.10721533745527267,
-0.22402705252170563,
-0.08517231047153473,
0.1422796994447708,
-0.03421911224722862,
0.1542559564113617,
-0.02848726324737072,
-0.12159585952758789,
-0.024955326691269875,
-0.06977712363004684,
0.10887379199266434,
-0.1419300138950348,
0.038592495024204254,
0.13747453689575195,
0.008710617199540138,
0.031119761988520622,
-0.2533661723136902,
0.050644006580114365,
-0.03556957095861435,
-0.016733208671212196,
-0.057031940668821335
] |
null | null | transformers |
# metrics:
# - accuracy
# model-index:
# - name: FacialEmoRecog
# results:
# - task:
# name: Image Classification
# type: image-classification
# - metrics:
# name: Accuracy
# type: accuracy
# value: 0.9189583659172058
# FacialEmoRecog
Create your own image classifier for **anything** by running this repo
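A minimal inference sketch (an illustration, not part of the original card — it assumes the `transformers` pipeline API and uses a hypothetical local file `face.jpg`):

```
from transformers import pipeline

# load the fine-tuned ViT checkpoint from the Hub
classifier = pipeline("image-classification", model="Rajaram1996/FacialEmoRecog")

# "face.jpg" stands in for any local face image; the call returns label/score pairs
print(classifier("face.jpg"))
```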
## Example Images | {"language": ["en"], "license": "mit", "tags": ["image CLassification", "pytorch"], "datasets": ["Jeneral/fer2013"], "metrics": ["accuracy"], "inference": true, "pipeline_tag": "image-classification"} | image-classification | Rajaram1996/FacialEmoRecog | [
"transformers",
"pytorch",
"vit",
"image-classification",
"image CLassification",
"en",
"dataset:Jeneral/fer2013",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #vit #image-classification #image CLassification #en #dataset-Jeneral/fer2013 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# metrics:
# - accuracy
# model-index:
# - name: FacialEmoRecog
# results:
# - task:
# name: Image Classification
# type: image-classification
# - metrics:
# name: Accuracy
# type: accuracy
# value: 0.9189583659172058
# FacialEmoRecog
Create your own image classifier for anything by running this repo
## Example Images | [
"# metrics:",
"# - accuracy",
"# model-index:",
"# - name: FacialEmoRecog",
"# results:\n # - task:\n # name: Image Classification\n # type: image-classification\n # - metrics:\n # name: Accuracy\n # type: accuracy\n # value: 0.9189583659172058",
"# FacialEmoRecog \nCreate your own image classifier for anything by running this repo \n\n ## Example Images"
] | [
"TAGS\n#transformers #pytorch #vit #image-classification #image CLassification #en #dataset-Jeneral/fer2013 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# metrics:",
"# - accuracy",
"# model-index:",
"# - name: FacialEmoRecog",
"# results:\n # - task:\n # name: Image Classification\n # type: image-classification\n # - metrics:\n # name: Accuracy\n # type: accuracy\n # value: 0.9189583659172058",
"# FacialEmoRecog \nCreate your own image classifier for anything by running this repo \n\n ## Example Images"
] | [
63,
4,
5,
5,
10,
48,
23
] | [
"passage: TAGS\n#transformers #pytorch #vit #image-classification #image CLassification #en #dataset-Jeneral/fer2013 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# metrics:# - accuracy# model-index:# - name: FacialEmoRecog# results:\n # - task:\n # name: Image Classification\n # type: image-classification\n # - metrics:\n # name: Accuracy\n # type: accuracy\n # value: 0.9189583659172058# FacialEmoRecog \nCreate your own image classifier for anything by running this repo \n\n ## Example Images"
] | [
-0.13722386956214905,
0.1353887915611267,
-0.0037381837610155344,
0.07260707020759583,
0.18626387417316437,
0.007657169364392757,
0.03903733938932419,
0.08268575370311737,
0.07102369517087936,
0.03084646910429001,
0.09749290347099304,
0.19278205931186676,
0.051571473479270935,
0.17586013674736023,
-0.0596994124352932,
-0.26616498827934265,
-0.0010126325068995357,
0.06445051729679108,
0.1660148948431015,
0.08534199744462967,
0.09674110263586044,
-0.14232349395751953,
0.1728510558605194,
0.04658208042383194,
-0.31794095039367676,
0.025636542588472366,
0.01410242635756731,
-0.09237509965896606,
0.06538921594619751,
0.022930772975087166,
0.06495533138513565,
0.08352310955524445,
0.05496290698647499,
-0.004273685626685619,
0.021329665556550026,
-0.019539956003427505,
-0.09209853410720825,
0.047920551151037216,
0.16138014197349548,
-0.06936806440353394,
0.12198121100664139,
-0.030346600338816643,
0.0051416195929050446,
0.010138719342648983,
-0.08440326154232025,
-0.04800663888454437,
0.0023393898736685514,
0.034613076597452164,
0.1896730661392212,
0.07011457532644272,
-0.015543500892817974,
0.09277410805225372,
-0.09591083973646164,
0.12643250823020935,
0.08579779416322708,
-0.12110407650470734,
-0.08417731523513794,
0.11678598076105118,
-0.10043905675411224,
-0.08027482032775879,
-0.07818131893873215,
0.04733213782310486,
-0.011672448366880417,
0.015698732808232307,
0.10199137032032013,
-0.05659902095794678,
-0.13690586388111115,
-0.056881967931985855,
-0.10208182781934738,
-0.030544858425855637,
0.07766767591238022,
0.035044748336076736,
-0.002735558431595564,
-0.08194972574710846,
-0.043132197111845016,
-0.05201782286167145,
-0.11922065168619156,
0.0346677303314209,
0.00577011751011014,
-0.05296540632843971,
-0.10866193473339081,
0.07804640382528305,
-0.11116297543048859,
-0.06758946180343628,
-0.1157568022608757,
-0.02014349400997162,
0.03569332882761955,
0.05613890290260315,
-0.06379339098930359,
0.06079072132706642,
-0.15943822264671326,
-0.0830986276268959,
-0.04879431426525116,
-0.05163152143359184,
-0.06437183171510696,
-0.0667969286441803,
-0.03198499232530594,
-0.07277484983205795,
0.04550442472100258,
-0.012003703974187374,
0.024851417168974876,
0.032232966274023056,
0.006016463972628117,
0.07961095869541168,
0.018471257761120796,
0.16828641295433044,
-0.07810036838054657,
-0.022923190146684647,
-0.010929363779723644,
-0.07947710156440735,
-0.024650895968079567,
-0.0124489301815629,
-0.09579059481620789,
-0.11144940555095673,
0.06468841433525085,
0.022943323478102684,
-0.03634975850582123,
0.06480873376131058,
-0.025876525789499283,
-0.061912424862384796,
0.19019606709480286,
-0.054295461624860764,
-0.0021515870466828346,
-0.02691645175218582,
-0.08000127971172333,
0.006998468190431595,
0.08871110528707504,
-0.022068142890930176,
-0.09098539501428604,
0.04558641463518143,
-0.04496384039521217,
0.08624004572629929,
-0.057664904743433,
-0.04904528334736824,
-0.006751164328306913,
-0.10922612249851227,
-0.018778624013066292,
-0.1752849966287613,
-0.05220892280340195,
-0.025499703362584114,
0.041132524609565735,
-0.059354010969400406,
-0.025736549869179726,
-0.03887307271361351,
0.009943639859557152,
-0.044772952795028687,
-0.014055239968001842,
-0.01834004372358322,
-0.03313864767551422,
0.04974733293056488,
-0.011160224676132202,
0.14298750460147858,
-0.10119431465864182,
0.08854959905147552,
-0.10216591507196426,
-0.05101919174194336,
-0.1996336132287979,
0.09038893133401871,
-0.0001564425037940964,
0.10875079035758972,
-0.07371152192354202,
-0.10576426237821579,
-0.016057075932621956,
-0.03732718899846077,
0.0025553926825523376,
0.14844875037670135,
-0.09048432856798172,
-0.07228673249483109,
0.13242651522159576,
-0.1104571521282196,
-0.12740451097488403,
0.07979318499565125,
-0.006417101714760065,
-0.016659481450915337,
0.11307661980390549,
0.14294248819351196,
0.10674868524074554,
-0.06910954415798187,
0.03065025992691517,
-0.03989645838737488,
-0.11491651833057404,
-0.14323534071445465,
0.017441287636756897,
0.07081194221973419,
-0.038594335317611694,
0.0638255625963211,
-0.11383732408285141,
0.13973769545555115,
-0.08321899175643921,
-0.05375361815094948,
0.02991149201989174,
-0.062140293419361115,
0.011875705793499947,
0.09053733944892883,
0.07369780540466309,
0.02100599929690361,
0.019427943974733353,
-0.04182541370391846,
0.014730422757565975,
-0.0706380307674408,
-0.029956981539726257,
-0.12060297280550003,
0.15974512696266174,
-0.04633033275604248,
-0.023300563916563988,
-0.17672127485275269,
-0.050953127443790436,
0.00739240413531661,
0.035271450877189636,
0.07532699406147003,
-0.10223347693681717,
0.02137666940689087,
0.04774777591228485,
-0.01766195334494114,
-0.029938317835330963,
0.08241993188858032,
-0.043820593506097794,
-0.04556885361671448,
-0.03271494805812836,
-0.024395301938056946,
-0.01678609475493431,
0.16618052124977112,
-0.10178066790103912,
-0.010504152625799179,
0.07545049488544464,
0.051214367151260376,
0.06522192806005478,
-0.022647175937891006,
0.015313881449401379,
-0.002767650643363595,
0.01696239784359932,
-0.016870249062776566,
0.09486223757266998,
-0.03583776578307152,
0.01981635019183159,
0.08934858441352844,
-0.023225605487823486,
0.13337182998657227,
0.16658270359039307,
-0.247871994972229,
-0.08386854082345963,
-0.1181676909327507,
0.004605270456522703,
0.034914880990982056,
-0.06702100485563278,
0.07917717099189758,
0.03710048273205757,
-0.0030058654956519604,
0.09169038385152817,
-0.0535358190536499,
0.006788052152842283,
0.04205614700913429,
0.0019441000185906887,
-0.04875301569700241,
0.04814537987112999,
0.17187349498271942,
-0.23018144071102142,
0.05672144144773483,
0.13826468586921692,
-0.08358485251665115,
0.05637440085411072,
0.11126124113798141,
-0.047003123909235,
0.05511735379695892,
0.006377258338034153,
0.02248254604637623,
0.05634547397494316,
-0.21659056842327118,
-0.09088683128356934,
0.07352995872497559,
-0.14562593400478363,
0.013470461592078209,
-0.10433737188577652,
0.024032508954405785,
-0.01713964343070984,
-0.004836807493120432,
-0.004694553557783365,
0.07029716670513153,
0.043529871851205826,
0.10841471701860428,
-0.020233582705259323,
-0.11189557611942291,
-0.010085086338222027,
-0.007505140732973814,
-0.04387057200074196,
0.19280047714710236,
0.002286080038174987,
-0.2887410819530487,
-0.08514466881752014,
-0.0991506576538086,
-0.13951390981674194,
0.04666026309132576,
0.03642677888274193,
-0.07222728431224823,
-0.08955813944339752,
-0.02397448755800724,
-0.017012016847729683,
0.05926309898495674,
-0.014463929459452629,
-0.14427092671394348,
-0.00838544126600027,
-0.016598964110016823,
-0.03737476468086243,
-0.06552188098430634,
-0.05169204995036125,
-0.08850078284740448,
0.22858679294586182,
-0.07042711973190308,
0.10486894845962524,
0.07321709394454956,
-0.08454437553882599,
0.08288109302520752,
-0.03628149256110191,
0.2161061018705368,
-0.10454914718866348,
0.005804419983178377,
0.21740788221359253,
0.09654668718576431,
0.045240774750709534,
0.1359618604183197,
-0.018263882026076317,
-0.13770824670791626,
-0.001212686998769641,
0.02254904806613922,
-0.08117693662643433,
-0.006113003008067608,
-0.14905869960784912,
-0.08118331432342529,
-0.012068672105669975,
0.20484931766986847,
0.06372558325529099,
0.07530899345874786,
0.16925381124019623,
-0.010354199446737766,
-0.02202945575118065,
0.03499918803572655,
0.09276777505874634,
0.14766691625118256,
0.024065222591161728,
0.12312400341033936,
-0.007157638669013977,
-0.05796343460679054,
0.0273203756660223,
-0.036718789488077164,
0.22529828548431396,
-0.003566307481378317,
-0.079743891954422,
0.03214999660849571,
0.16459469497203827,
0.0916159376502037,
0.07970280945301056,
0.011629125103354454,
-0.01983409747481346,
0.04023734852671623,
-0.04616812616586685,
-0.06715051829814911,
-0.02232307568192482,
0.14451564848423004,
-0.13901738822460175,
-0.0611601360142231,
-0.046278759837150574,
0.021953072398900986,
0.13353213667869568,
0.04615018144249916,
-0.4732579290866852,
0.007414094638079405,
-0.06430485099554062,
0.017956089228391647,
-0.15169236063957214,
-0.0121229849755764,
0.005554873961955309,
-0.08232210576534271,
0.0955892875790596,
-0.06448161602020264,
0.09867854416370392,
-0.002438304014503956,
-0.02773861587047577,
0.044821128249168396,
-0.07064907997846603,
0.05311331897974014,
0.0698707103729248,
-0.03913312032818794,
0.20449933409690857,
-0.01274905540049076,
-0.018458649516105652,
-0.08233042061328888,
-0.018172627314925194,
0.08230104297399521,
0.2534055709838867,
0.2074536234140396,
0.012994702905416489,
0.057664599269628525,
-0.1843785047531128,
0.019489796832203865,
-0.004018326289951801,
0.02781517431139946,
-0.08043759316205978,
0.00003401594585739076,
0.04613473266363144,
-0.061660025268793106,
-0.0581134632229805,
0.03578067570924759,
-0.09186450392007828,
-0.09437042474746704,
-0.0007401974871754646,
0.002519384492188692,
0.09639942646026611,
0.02145850844681263,
-0.021242158487439156,
-0.15330788493156433,
0.079828180372715,
0.10160208493471146,
-0.06358814239501953,
-0.12080087512731552,
0.10782773047685623,
0.022944532334804535,
-0.13304241001605988,
0.09119973331689835,
-0.054083164781332016,
0.1419801414012909,
0.03556660935282707,
-0.1525905877351761,
0.061458487063646317,
-0.019441349431872368,
0.06838053464889526,
0.022003253921866417,
-0.0033655983861535788,
-0.02377183362841606,
-0.02860283851623535,
0.093214251101017,
0.08502501249313354,
-0.032095372676849365,
-0.05375280976295471,
0.020722979679703712,
-0.04514624550938606,
0.08197380602359772,
0.07936795800924301,
-0.009898732416331768,
-0.26106172800064087,
-0.09826585650444031,
0.11004175990819931,
0.17662018537521362,
0.10888318717479706,
-0.08452831953763962,
-0.023407749831676483,
0.10384701192378998,
0.03371642157435417,
-0.3453267514705658,
0.028855865821242332,
0.04982062801718712,
0.03031008690595627,
-0.006537959910929203,
-0.1310896873474121,
0.16221246123313904,
0.10224287211894989,
-0.03790626302361488,
0.0037228951696306467,
-0.1838979423046112,
-0.11735986918210983,
0.16818740963935852,
0.18434195220470428,
-0.017970746383070946,
-0.12075692415237427,
-0.018602045252919197,
-0.056879956275224686,
-0.09632366895675659,
0.17588068544864655,
0.027781374752521515,
0.07125206291675568,
-0.05147242173552513,
0.0624457411468029,
0.021348604932427406,
-0.026044636964797974,
0.08133889734745026,
0.04796658456325531,
0.08298052102327347,
-0.059110477566719055,
-0.10382094234228134,
-0.01965141110122204,
-0.05005963519215584,
0.14497314393520355,
0.09500003606081009,
0.024364948272705078,
-0.07707587629556656,
-0.01583150401711464,
-0.10770734399557114,
0.11076056957244873,
0.06538792699575424,
0.004438550211489201,
-0.05722717568278313,
0.05431108549237251,
-0.0035909966100007296,
0.024664293974637985,
0.11951308697462082,
-0.11928781867027283,
0.04447988048195839,
0.09252399951219559,
0.03243473917245865,
-0.15561538934707642,
-0.009029501117765903,
-0.08385671675205231,
-0.01756652258336544,
0.12239119410514832,
-0.10534030944108963,
0.10056374222040176,
0.08330236375331879,
0.01679845154285431,
0.08061287552118301,
0.040388692170381546,
0.05176179111003876,
0.032238222658634186,
0.1593712419271469,
-0.09855332970619202,
0.015188060700893402,
-0.044654667377471924,
0.03701687976717949,
0.0430513471364975,
0.03270482271909714,
0.030865350738167763,
-0.01877700909972191,
-0.051218658685684204,
-0.015486164949834347,
0.022902095690369606,
-0.0275074765086174,
0.07759438455104828,
0.021172182634472847,
-0.007507219910621643,
-0.167337104678154,
0.02590205892920494,
0.08386926352977753,
-0.19192220270633698,
-0.1127096563577652,
0.07414489984512329,
-0.11012889444828033,
-0.15950867533683777,
-0.004007634706795216,
0.11323589831590652,
-0.19068075716495514,
0.00587874511256814,
0.03638209402561188,
-0.0888640359044075,
0.031940143555402756,
0.167147696018219,
0.10889722406864166,
0.018116822466254234,
0.028450295329093933,
-0.03746706619858742,
-0.09121967852115631,
-0.02108120545744896,
0.04057450219988823,
0.08941640704870224,
-0.1296519637107849,
0.05931463837623596,
0.007244685664772987,
0.11771515011787415,
-0.07625721395015717,
-0.06661512702703476,
-0.06327646225690842,
0.025425037369132042,
0.0734168067574501,
0.09039422869682312,
-0.09868791699409485,
0.024277620017528534,
-0.0050271968357264996,
0.031234806403517723,
-0.04486452415585518,
-0.011677698232233524,
-0.14105439186096191,
-0.0240186620503664,
-0.009627873077988625,
0.07624849677085876,
-0.08016607165336609,
-0.04939454421401024,
0.020765533670783043,
-0.01707942597568035,
0.08872497081756592,
0.06119685620069504,
-0.04496794566512108,
0.02102752774953842,
-0.16147996485233307,
-0.18097908794879913,
0.12430553883314133,
0.03343778848648071,
0.06415767222642899,
0.005038547795265913,
0.11109083890914917,
0.06030617654323578,
-0.02067594602704048,
0.01915178820490837,
0.08508017659187317,
-0.10080771893262863,
-0.029121510684490204,
-0.12758347392082214,
-0.10628900676965714,
-0.07108557224273682,
0.09984654188156128,
0.08639056235551834,
0.05419429764151573,
0.08345309644937515,
-0.08356122672557831,
0.03134533390402794,
-0.11545822024345398,
0.018119681626558304,
-0.029338551685214043,
-0.10752485692501068,
-0.07916232198476791,
-0.10587015002965927,
0.08011418581008911,
0.008880932815372944,
0.0664476826786995,
0.13743574917316437,
0.017303427681326866,
0.008729793131351471,
0.15362904965877533,
0.016290007159113884,
0.005314951296895742,
0.0668129026889801,
0.003946561831980944,
0.015777556225657463,
0.11316962540149689,
0.10829154402017593,
0.03038981556892395,
0.07603896409273148,
-0.0059785498306155205,
0.11755192279815674,
-0.011972326785326004,
0.05997456982731819,
0.08450835198163986,
0.01203559897840023,
-0.01050653401762247,
-0.027868561446666718,
-0.08045342564582825,
0.08728153258562088,
-0.03285107761621475,
0.01690053939819336,
0.13708947598934174,
-0.1337077021598816,
0.042672231793403625,
0.026985332369804382,
-0.0551849827170372,
-0.024957900866866112,
-0.21471183001995087,
-0.11392342299222946,
-0.08052785694599152,
0.03884538635611534,
-0.04130658507347107,
-0.060995861887931824,
0.11091354489326477,
0.02844683825969696,
-0.030322344973683357,
0.17285692691802979,
-0.17024332284927368,
-0.09135377407073975,
0.1290961354970932,
-0.029221560806035995,
-0.03268488124012947,
0.011722485534846783,
0.03857826814055443,
0.05097362771630287,
0.08401205390691757,
0.014766629785299301,
-0.011109510436654091,
0.011635434813797474,
0.056350648403167725,
-0.09981859475374222,
-0.09167876839637756,
-0.005859402474015951,
-0.0005883154808543622,
-0.034298304468393326,
0.10945543646812439,
-0.03383595123887062,
0.041354984045028687,
-0.008936284109950066,
0.1294357031583786,
-0.057654257863759995,
0.03526449203491211,
-0.12671145796775818,
0.16770467162132263,
0.08822590112686157,
0.027671687304973602,
-0.007538457866758108,
-0.012322531081736088,
0.005310431122779846,
0.19850799441337585,
0.146517813205719,
-0.01926291733980179,
0.005686479154974222,
0.040676988661289215,
0.0016017778543755412,
-0.008518110029399395,
0.07472474873065948,
0.035468824207782745,
0.16594862937927246,
-0.05107942223548889,
0.07951857149600983,
-0.06667924672365189,
-0.019605541601777077,
-0.07997292280197144,
-0.09883762896060944,
0.06619518250226974,
-0.03586249426007271,
-0.0979185625910759,
0.1923186182975769,
-0.05407030135393143,
0.10333787649869919,
0.2250918596982956,
-0.10357964783906937,
-0.08996095508337021,
-0.0014895714120939374,
-0.04041577875614166,
0.03664551302790642,
0.05644625797867775,
-0.07983709126710892,
0.07205080986022949,
-0.0059592751786112785,
0.039045847952365875,
-0.14667513966560364,
-0.09750763326883316,
0.024079235270619392,
-0.0446268767118454,
0.22763580083847046,
-0.049285244196653366,
0.07241731882095337,
0.04815707355737686,
0.011172636412084103,
-0.06473305821418762,
0.07198309153318405,
-0.07125826925039291,
-0.06556419283151627,
0.09966393560171127,
0.160853311419487,
-0.007315691560506821,
0.02472493052482605,
-0.0224288422614336,
-0.08046544343233109,
0.01020015962421894,
-0.14879466593265533,
0.04379639774560928,
-0.01786303147673607,
0.10010140389204025,
-0.07500310242176056,
0.04667023569345474,
0.12089522182941437,
0.04245147481560707,
-0.06013680249452591,
-0.06642884761095047,
0.024313827976584435,
0.07844068855047226,
-0.15867295861244202,
-0.11108723282814026,
-0.04335769638419151,
-0.021058091893792152,
-0.14100347459316254,
-0.008080553263425827,
-0.08364854007959366,
-0.006403855513781309,
-0.08275862783193588,
-0.016136936843395233,
-0.0890834778547287,
0.1195945143699646,
0.04703992232680321,
-0.030564283952116966,
0.010745042935013771,
-0.1504765897989273,
0.06285188347101212,
0.09078629314899445,
-0.11877089738845825,
-0.11403242498636246
] |
null | null | transformers |
Working example of using the pretrained model to predict the emotion in a local audio file:
```
def predict_emotion_hubert(audio_file):
    """ inspired by an example from https://github.com/m3hrdadfi/soxan """
    from audio_models import HubertForSpeechClassification  # custom classification head, see the soxan repo above
    from transformers import Wav2Vec2FeatureExtractor, AutoConfig
    import torch.nn.functional as F
    import torch
    import numpy as np
    from pydub import AudioSegment

    model = HubertForSpeechClassification.from_pretrained("Rajaram1996/Hubert_emotion")  # Downloading: 362M
    feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/hubert-base-ls960")
    sampling_rate = 16000  # defined by the model; must convert mp3 to this rate.
    config = AutoConfig.from_pretrained("Rajaram1996/Hubert_emotion")

    def speech_file_to_array(path, sampling_rate):
        # using torchaudio...
        # speech_array, _sampling_rate = torchaudio.load(path)
        # resampler = torchaudio.transforms.Resample(_sampling_rate, sampling_rate)
        # speech = resampler(speech_array).squeeze().numpy()
        sound = AudioSegment.from_file(path)
        sound = sound.set_frame_rate(sampling_rate)  # resample to the model's expected rate
        sound_array = np.array(sound.get_array_of_samples())  # assumes a mono recording
        return sound_array

    sound_array = speech_file_to_array(audio_file, sampling_rate)
    inputs = feature_extractor(sound_array, sampling_rate=sampling_rate, return_tensors="pt", padding=True)
    inputs = {key: inputs[key].to("cpu").float() for key in inputs}

    with torch.no_grad():
        logits = model(**inputs).logits

    scores = F.softmax(logits, dim=1).detach().cpu().numpy()[0]
    outputs = [
        {"emo": config.id2label[i], "score": round(score * 100, 1)}
        for i, score in enumerate(scores)
    ]
    # drop zero-probability classes and return the two most likely emotions
    return [row for row in sorted(outputs, key=lambda x: x["score"], reverse=True) if row["score"] > 0.0][:2]
```
```
result = predict_emotion_hubert("male-crying.mp3")
>>> result
[{'emo': 'male_sad', 'score': 91.0}, {'emo': 'male_fear', 'score': 4.8}]
```
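A note on the environment (an assumption, not part of the original card): the snippet needs `transformers`, `torch`, `numpy` and `pydub`, and `pydub` relies on a system `ffmpeg` install to decode mp3 files. Something like the following should be enough:

```
pip install transformers torch numpy pydub
# plus ffmpeg from your system package manager, e.g. apt-get install ffmpeg
```

The `HubertForSpeechClassification` head itself is not part of `transformers`; it comes from the soxan repository referenced in the function's docstring.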
| {"tags": ["speech", "audio", "HUBert"], "inference": true, "pipeline_tag": "audio-classification"} | audio-classification | Rajaram1996/Hubert_emotion | [
"transformers",
"pytorch",
"hubert",
"speech",
"audio",
"HUBert",
"audio-classification",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #hubert #speech #audio #HUBert #audio-classification #endpoints_compatible #has_space #region-us
|
Working example of using pretrained model to predict emotion in local audio file
| [] | [
"TAGS\n#transformers #pytorch #hubert #speech #audio #HUBert #audio-classification #endpoints_compatible #has_space #region-us \n"
] | [
43
] | [
"passage: TAGS\n#transformers #pytorch #hubert #speech #audio #HUBert #audio-classification #endpoints_compatible #has_space #region-us \n"
] | [
-0.06485438346862793,
0.06310027837753296,
-0.006011627614498138,
-0.016472643241286278,
0.07540974020957947,
-0.02820802479982376,
0.04543856903910637,
0.07195255905389786,
0.05011708661913872,
0.04251401126384735,
0.025530805811285973,
0.14635074138641357,
-0.06260419636964798,
-0.03694445267319679,
-0.09574999660253525,
-0.2898493707180023,
0.058627087622880936,
0.04658294841647148,
-0.008682872168719769,
0.07891124486923218,
0.07428378611803055,
-0.1114702820777893,
-0.004263517912477255,
0.013729080557823181,
-0.11099054664373398,
0.0579029843211174,
0.0440526008605957,
-0.10003286600112915,
0.08788865804672241,
0.00040266994619742036,
0.13593466579914093,
0.049099475145339966,
-0.04618571698665619,
-0.12044037878513336,
0.03468606248497963,
-0.008483641780912876,
-0.023606346920132637,
0.023248933255672455,
0.0491839163005352,
-0.10463672131299973,
0.018911348655819893,
0.049587324261665344,
-0.03697937726974487,
0.042363252490758896,
-0.12417878210544586,
-0.17007359862327576,
-0.01763727329671383,
0.07081194221973419,
-0.020306173712015152,
0.05943150073289871,
-0.040564242750406265,
0.07333429157733917,
-0.14690905809402466,
0.07508178800344467,
0.18865489959716797,
-0.24716417491436005,
-0.006710052490234375,
0.10789954662322998,
0.18070343136787415,
0.09987533092498779,
-0.11441893130540848,
0.08077850937843323,
0.05385294184088707,
0.01488281786441803,
0.02284049242734909,
-0.10802880674600601,
-0.12024607509374619,
0.014308670535683632,
-0.11663680523633957,
-0.03170722350478172,
0.22694167494773865,
-0.02310028299689293,
0.03796302154660225,
-0.05836322903633118,
-0.04378452152013779,
-0.10780948400497437,
-0.0008542338036932051,
0.046018753200769424,
0.0053827171213924885,
0.034057676792144775,
-0.0007912250002846122,
0.0005410859012044966,
-0.10273443162441254,
0.04390720650553703,
-0.13662205636501312,
0.19205015897750854,
-0.013597878627479076,
0.03125513717532158,
-0.1040923222899437,
0.0054841660894453526,
-0.010022194124758244,
-0.10058893263339996,
0.06206909194588661,
-0.08024395257234573,
-0.0009174461010843515,
-0.00887864176183939,
-0.05883539468050003,
0.06184503808617592,
0.06568673253059387,
0.06698127835988998,
-0.09196503460407257,
0.04361548647284508,
-0.050234489142894745,
0.13824462890625,
0.11680014431476593,
0.04112499952316284,
0.0017021371750161052,
-0.08931273967027664,
-0.010631231591105461,
-0.05399682745337486,
0.0034530048724263906,
-0.05713491886854172,
-0.14611805975437164,
-0.001472190604545176,
0.02467956952750683,
0.02337876334786415,
0.014443422667682171,
-0.008444436825811863,
-0.08679813146591187,
-0.007780646439641714,
0.005831445567309856,
-0.03769385442137718,
0.03772274777293205,
0.008318254724144936,
0.12005876004695892,
0.13472799956798553,
-0.05560484901070595,
0.054801005870103836,
-0.002918198937550187,
0.10923896729946136,
-0.04267803207039833,
0.029779141768813133,
0.012710652314126492,
-0.07257136702537537,
0.059761062264442444,
-0.18907693028450012,
0.10117873549461365,
-0.11661036312580109,
0.08446493744850159,
-0.015598191879689693,
0.017327310517430305,
0.032838691025972366,
-0.09526184946298599,
0.028324982151389122,
-0.03574669733643532,
0.05974023789167404,
-0.1053900420665741,
-0.07785232365131378,
-0.10601102560758591,
0.11474555730819702,
-0.06865816563367844,
0.10984266549348831,
-0.1318351924419403,
0.11141592264175415,
-0.05822025611996651,
0.04746633768081665,
-0.1725684255361557,
0.012059242464601994,
-0.08568844944238663,
0.03587156906723976,
0.018049074336886406,
-0.10699804127216339,
-0.14663758873939514,
0.12266403436660767,
-0.10448022931814194,
0.054656192660331726,
-0.11306819319725037,
-0.14346638321876526,
0.10425449907779694,
-0.06247157230973244,
-0.07646161317825317,
0.1211826428771019,
0.01103049237281084,
-0.03217208757996559,
0.05124767869710922,
0.3410411477088928,
0.06801056116819382,
-0.14626672863960266,
-0.016692418605089188,
0.11222999542951584,
-0.0023620682768523693,
-0.011770831421017647,
0.051044072955846786,
0.001236302894540131,
0.01234222948551178,
-0.03026749938726425,
0.13033846020698547,
0.044798221439123154,
-0.03486763313412666,
-0.018061775714159012,
0.006539320573210716,
-0.018548745661973953,
0.059385091066360474,
0.026059765368700027,
0.05937368422746658,
-0.06813299655914307,
-0.03568446263670921,
0.14168702065944672,
0.010848994366824627,
0.03148595988750458,
0.08018366992473602,
-0.04199855774641037,
0.16837765276432037,
-0.044714607298374176,
-0.06244083121418953,
-0.21672680974006653,
0.11574747413396835,
-0.05663499981164932,
0.07260072976350784,
0.10117477923631668,
0.22748200595378876,
0.03595897555351257,
-0.12055604159832001,
-0.05894579738378525,
-0.053194765001535416,
0.09455935657024384,
0.06489178538322449,
-0.007239296566694975,
-0.14725112915039062,
0.04937676340341568,
-0.08402691036462784,
-0.038255833089351654,
-0.029361838474869728,
-0.052136268466711044,
0.14198535680770874,
0.13610412180423737,
0.030672647058963776,
0.01820346713066101,
0.03276493772864342,
0.07204591482877731,
-0.04069812223315239,
0.04272099584341049,
0.05851922556757927,
0.0059593068435788155,
-0.03271773084998131,
0.19385403394699097,
-0.15300917625427246,
0.32083144783973694,
0.21304017305374146,
-0.2825070321559906,
0.039241112768650055,
0.018762364983558655,
-0.00802435353398323,
0.02770216390490532,
0.03043276257812977,
-0.060107383877038956,
0.10809705406427383,
-0.06573116779327393,
0.09174119681119919,
-0.052647609263658524,
-0.02651624195277691,
0.014246741309762001,
-0.037572357803583145,
-0.048836566507816315,
0.0642976388335228,
-0.044362690299749374,
-0.10131067037582397,
0.195746049284935,
0.29942724108695984,
-0.040275853127241135,
0.31457027792930603,
-0.04709557443857193,
-0.0007053816807456315,
0.05244600027799606,
-0.09270686656236649,
-0.10430189222097397,
0.11870544403791428,
-0.285683274269104,
-0.06668603420257568,
0.047098878771066666,
0.032512106001377106,
0.06171693652868271,
-0.1354716420173645,
-0.028721759095788002,
-0.01812688261270523,
0.05566229671239853,
-0.13420189917087555,
0.1073315218091011,
0.04944784194231033,
0.06099307909607887,
-0.041865263134241104,
-0.1038864403963089,
0.07038073986768723,
-0.013417037203907967,
-0.024369841441512108,
0.03279412165284157,
-0.1505306363105774,
-0.2669747769832611,
-0.07784590125083923,
-0.15868961811065674,
0.007354926783591509,
0.009742597118020058,
0.12414292246103287,
-0.051736216992139816,
-0.0009531835094094276,
0.04955863952636719,
0.0807722806930542,
-0.15923181176185608,
0.01821262761950493,
-0.03055385872721672,
0.04289674758911133,
-0.03294239193201065,
-0.07385914027690887,
-0.0459660068154335,
-0.028054049238562584,
0.042318832129240036,
0.08011918514966965,
0.009115234017372131,
0.0267496258020401,
0.20526643097400665,
0.11285748332738876,
0.03176119923591614,
-0.025121668353676796,
0.2322942018508911,
-0.1675635576248169,
0.011437872424721718,
0.11630961298942566,
-0.0971401184797287,
0.015065078623592854,
0.22037456929683685,
0.0872512236237526,
0.00565347820520401,
-0.029688091948628426,
0.011509818024933338,
-0.08087384700775146,
-0.17333726584911346,
-0.13028307259082794,
-0.1959596425294876,
0.004432546440511942,
-0.050619155168533325,
0.051771488040685654,
0.09879347681999207,
-0.039117977023124695,
-0.008111024275422096,
-0.06788612902164459,
0.0015903711318969727,
-0.010432631708681583,
0.2173795849084854,
-0.06805156916379929,
0.12393911182880402,
-0.02715209499001503,
-0.10000871866941452,
0.0706111416220665,
0.10577382147312164,
0.08092054724693298,
0.16787008941173553,
-0.011380897834897041,
0.07079683989286423,
0.04669375717639923,
0.1647946983575821,
0.037722766399383545,
0.022935882210731506,
-0.01724160648882389,
-0.027365418151021004,
-0.05047257989645004,
0.03406987339258194,
0.03589002788066864,
0.2817109227180481,
-0.1412629336118698,
-0.00325812422670424,
-0.23024983704090118,
0.0354694165289402,
0.02930883690714836,
0.14154675602912903,
-0.07822196930646896,
0.021464647725224495,
0.1413809359073639,
-0.04600336775183678,
-0.0898599848151207,
0.13581550121307373,
0.10054411739110947,
-0.06704799085855484,
0.0695895105600357,
0.07991775870323181,
0.1036602109670639,
-0.04640304669737816,
0.0736784115433693,
-0.06718819588422775,
-0.17568084597587585,
0.016949687153100967,
-0.015268146060407162,
-0.1286592334508896,
0.2291414737701416,
-0.0018712447490543127,
-0.08770384639501572,
0.01430006604641676,
-0.02959073893725872,
0.05150821432471275,
0.12085463851690292,
0.13780094683170319,
0.05591527000069618,
-0.1825239360332489,
-0.12142274528741837,
0.029737263917922974,
-0.05113309621810913,
0.13686741888523102,
0.05232448875904083,
-0.06498587131500244,
-0.017276281490921974,
-0.02239105850458145,
0.014835277572274208,
0.0005702991620637476,
-0.053791917860507965,
-0.10640180855989456,
0.05894869938492775,
0.1543409824371338,
0.05081693083047867,
-0.0008851506281644106,
-0.09091192483901978,
-0.1937551349401474,
0.0507378987967968,
-0.07535452395677567,
-0.011285842396318913,
-0.08646532893180847,
-0.15986190736293793,
0.12254264205694199,
0.008646177127957344,
0.10741270333528519,
-0.02564225345849991,
-0.012785273604094982,
-0.09967795014381409,
-0.09206248074769974,
0.1668618619441986,
-0.09630102664232254,
0.016602396965026855,
-0.06504745036363602,
0.28995174169540405,
-0.03111616149544716,
0.11065316945314407,
-0.010911324061453342,
0.0874638482928276,
-0.04998619109392166,
-0.052599094808101654,
0.0548761822283268,
-0.0865771546959877,
0.005417057778686285,
0.08460969477891922,
-0.03557521104812622,
-0.08666931092739105,
0.05300794169306755,
-0.01511333603411913,
0.19882576167583466,
0.1876368373632431,
-0.03929329663515091,
0.1905522644519806,
0.08720091730356216,
-0.014637494459748268,
-0.34765398502349854,
0.0022663515992462635,
-0.0628109872341156,
0.008236020803451538,
0.002179363975301385,
-0.124427430331707,
0.12951143085956573,
-0.0727984830737114,
-0.07336017489433289,
0.10913407802581787,
-0.18332156538963318,
-0.08536972105503082,
0.19774673879146576,
-0.15591810643672943,
0.368017315864563,
-0.10104658454656601,
-0.06342840939760208,
-0.003295320086181164,
-0.1503959447145462,
0.10250414162874222,
-0.0728577971458435,
0.10179721564054489,
0.05199399217963219,
0.029429679736495018,
0.03085421584546566,
-0.030072353780269623,
0.09583587944507599,
0.05348385125398636,
-0.020793912932276726,
-0.021153708919882774,
-0.10577314347028732,
0.06146250292658806,
-0.004178876988589764,
-0.11077533662319183,
-0.0353902205824852,
-0.028082994744181633,
-0.17568935453891754,
-0.01647290773689747,
-0.12122905999422073,
0.09988170117139816,
0.010062247514724731,
-0.03660483658313751,
-0.024977844208478928,
0.038465552031993866,
-0.00806906633079052,
0.0020470239687711,
0.3439427614212036,
-0.13321876525878906,
0.12156783044338226,
0.11957485228776932,
0.10780996829271317,
-0.1439903974533081,
-0.16381029784679413,
-0.018345003947615623,
-0.06681237369775772,
0.10657650232315063,
-0.07892832905054092,
0.07258976995944977,
0.10846423357725143,
0.016092684119939804,
0.0445534810423851,
0.09151175618171692,
-0.007943244650959969,
-0.006793295964598656,
0.1442205309867859,
-0.14755842089653015,
-0.01987779326736927,
-0.06282015889883041,
-0.02705080434679985,
0.12515169382095337,
0.0012597517343237996,
0.12990008294582367,
0.048188306391239166,
-0.031104879453778267,
0.013329838402569294,
-0.06295043975114822,
-0.13922102749347687,
0.06233343854546547,
0.052963804453611374,
0.054935943335294724,
-0.1255093663930893,
0.03150055184960365,
0.008125894702970982,
-0.24300552904605865,
-0.04017112776637077,
0.060329075902700424,
-0.05456326901912689,
-0.10374639183282852,
-0.10157039016485214,
-0.05580826848745346,
-0.02498939260840416,
-0.006806416902691126,
0.04569530859589577,
-0.15750718116760254,
0.025350935757160187,
0.20153383910655975,
0.07277969270944595,
0.09294905513525009,
-0.0702364519238472,
0.0003580574702937156,
0.0776173546910286,
0.025768179446458817,
0.0052865417674183846,
0.056414004415273666,
-0.1882336139678955,
-0.023410500958561897,
-0.001243227394297719,
0.11521903425455093,
-0.11349374800920486,
-0.048065997660160065,
-0.1804572343826294,
0.058363817632198334,
-0.1286621242761612,
-0.0649411529302597,
-0.06333371251821518,
-0.05909988656640053,
0.003144416958093643,
-0.08199901133775711,
-0.07220254838466644,
-0.01177721843123436,
-0.1338530331850052,
0.0510777123272419,
0.028625017032027245,
0.08791911602020264,
-0.03849775716662407,
-0.038142867386341095,
0.11178560554981232,
-0.07571985572576523,
0.09262572228908539,
0.16310173273086548,
-0.08891810476779938,
0.05443030223250389,
-0.058386947959661484,
-0.2064988911151886,
0.1760973185300827,
0.016733579337596893,
0.04103467985987663,
-0.015445028431713581,
-0.008977696299552917,
0.06581141799688339,
0.06933048367500305,
0.03324434161186218,
-0.037740372121334076,
-0.053171414881944656,
-0.008458352647721767,
-0.0669516772031784,
-0.15626609325408936,
-0.0009515213896520436,
-0.06573576480150223,
0.22490188479423523,
0.00891116913408041,
0.10705571621656418,
-0.007966302335262299,
0.022672029212117195,
-0.03830188512802124,
0.03987783193588257,
-0.03513777628540993,
-0.2129068821668625,
-0.05669693276286125,
-0.06021644547581673,
0.006878499872982502,
-0.03974529355764389,
0.2735234200954437,
0.04835684224963188,
-0.12079864740371704,
0.05703216791152954,
0.08858887851238251,
-0.06274659931659698,
0.04619098827242851,
0.22158955037593842,
0.08059773594141006,
-0.08346699923276901,
-0.11268822848796844,
-0.012153889052569866,
0.05754154175519943,
0.1152266412973404,
0.02864498272538185,
0.17646819353103638,
0.09809654206037521,
0.03136296197772026,
0.08224749565124512,
0.01479814387857914,
-0.18614770472049713,
-0.04054386168718338,
0.027947284281253815,
0.09727834910154343,
-0.04835683852434158,
0.09504576027393341,
0.08740868419408798,
-0.03161780536174774,
0.12113301455974579,
-0.09322146326303482,
0.015708500519394875,
-0.1631215512752533,
-0.049490269273519516,
-0.039759933948516846,
-0.1273152232170105,
-0.001955525716766715,
-0.0658869743347168,
0.08369748294353485,
0.060085974633693695,
0.03917904198169708,
0.0113754877820611,
0.14839895069599152,
-0.05042184516787529,
-0.06863240152597427,
0.0905027687549591,
0.0025019380263984203,
0.01999526098370552,
-0.048437830060720444,
0.014351814053952694,
0.020099708810448647,
-0.05034225434064865,
-0.051078006625175476,
-0.039217762649059296,
-0.10598710179328918,
-0.0468495637178421,
-0.13886447250843048,
-0.10771526396274567,
-0.03719073906540871,
0.012088313698768616,
0.00430901488289237,
0.1862594187259674,
0.019973335787653923,
0.007565367501229048,
0.01744845137000084,
0.19816860556602478,
-0.11948774009943008,
-0.04685385152697563,
-0.006839736830443144,
0.11933312565088272,
-0.037233494222164154,
0.12485168129205704,
-0.04239792376756668,
-0.010565068572759628,
-0.09441714733839035,
0.15990935266017914,
0.3259252905845642,
-0.07869022339582443,
0.06326229870319366,
0.03929533809423447,
0.03543388098478317,
0.013305889442563057,
-0.005912333261221647,
0.13115137815475464,
0.26489341259002686,
-0.01949378103017807,
-0.08428585529327393,
-0.07211730629205704,
-0.02265067584812641,
-0.026437077671289444,
0.06959885358810425,
0.06132680922746658,
-0.1238870620727539,
-0.025561099871993065,
0.09812842309474945,
-0.2528482675552368,
0.005224013701081276,
-0.03279967978596687,
-0.2530728876590729,
-0.06944651156663895,
-0.0370752178132534,
0.18033766746520996,
0.06073867902159691,
0.030922265723347664,
-0.05081801861524582,
-0.16002294421195984,
-0.010103944689035416,
0.0031323267612606287,
-0.14369964599609375,
0.0005619734874926507,
0.05154742673039436,
-0.06145237013697624,
-0.02137927897274494,
-0.03884367644786835,
0.06072257459163666,
0.02094440534710884,
0.11926103383302689,
0.073050856590271,
0.01908569224178791,
0.05641980841755867,
-0.22096078097820282,
-0.13358832895755768,
0.20222216844558716,
-0.04379977658390999,
0.046087395399808884,
0.09208078682422638,
-0.1430196613073349,
0.04238198697566986,
-0.032799940556287766,
-0.09061050415039062,
-0.028630150482058525,
-0.0033927815966308117,
-0.08290427923202515,
0.03976738825440407,
-0.02582532726228237,
0.01808219775557518,
-0.0668724924325943,
-0.02826952002942562,
-0.04240410774946213,
0.10699323564767838,
-0.006238402333110571,
-0.10822867602109909,
-0.11511778086423874,
-0.04440544918179512,
-0.03074493259191513,
-0.04049113392829895,
-0.11112888902425766,
-0.0424722395837307,
-0.04225846379995346,
0.10719500482082367,
-0.12966057658195496,
0.01721741072833538,
0.02744700014591217,
-0.005054377485066652,
0.020700670778751373,
-0.026745745912194252,
0.07448926568031311,
0.04137676954269409,
-0.11847785860300064,
-0.03300420194864273
] |
null | null | transformers |
# Wav2Vec2-Large-XLSR-53-tamil
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Tamil using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ta", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("Rajaram1996/wav2vec2-large-xlsr-53-tamil")
model = Wav2Vec2ForCTC.from_pretrained("Rajaram1996/wav2vec2-large-xlsr-53-tamil")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
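The snippet above builds a fixed 48 kHz → 16 kHz resampler because Common Voice clips are typically recorded at 48 kHz. For audio from other sources, the resampler should be derived from the file's actual sampling rate. A minimal sketch of that pattern, assuming a local file at the placeholder path `audio.wav` (not part of the original card):
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("Rajaram1996/wav2vec2-large-xlsr-53-tamil")
model = Wav2Vec2ForCTC.from_pretrained("Rajaram1996/wav2vec2-large-xlsr-53-tamil")

# "audio.wav" is a placeholder path; torchaudio reports the file's native sampling rate.
speech_array, sampling_rate = torchaudio.load("audio.wav")
if sampling_rate != 16_000:
    # Resample from whatever rate the file uses to the 16 kHz the model expects.
    speech_array = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array)

inputs = processor(speech_array.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
print("Prediction:", processor.batch_decode(torch.argmax(logits, dim=-1)))
```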
## Evaluation
The model can be evaluated as follows on the Tamil test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "ta", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("Rajaram1996/wav2vec2-large-xlsr-53-tamil")
model = Wav2Vec2ForCTC.from_pretrained("Rajaram1996/wav2vec2-large-xlsr-53-tamil")
model.to("cuda")
chars_to_ignore_regex = '[,?.!;:"“-]'  # punctuation to strip from reference transcripts
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference and decode the predicted transcriptions
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:.2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 69.76 % | {"language": ["ta"], "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week", "hf-asr-leaderboard"], "datasets": ["common_voice"], "model-index": [{"name": "Rajaram1996/wav2vec2-large-xlsr-53-tamil", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ta", "type": "common_voice", "args": "ta"}, "metrics": [{"type": "wer", "value": 69.76, "name": "Test WER"}]}]}]} | automatic-speech-recognition | Rajaram1996/wav2vec2-large-xlsr-53-tamil | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"hf-asr-leaderboard",
"ta",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"ta"
] | TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hf-asr-leaderboard #ta #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-tamil
Fine-tuned facebook/wav2vec2-large-xlsr-53 in Tamil using the Common Voice
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the {language} test data of Common Voice.
Test Result: 69.76 % | [
"# Wav2Vec2-Large-XLSR-53-tamil\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 in Tamil using the Common Voice\n\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the {language} test data of Common Voice.\n\n\n\nTest Result: 69.76 %"
] | [
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hf-asr-leaderboard #ta #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-tamil\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 in Tamil using the Common Voice\n\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the {language} test data of Common Voice.\n\n\n\nTest Result: 69.76 %"
] | [
90,
60,
20,
30
] | [
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hf-asr-leaderboard #ta #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-tamil\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 in Tamil using the Common Voice\n\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the {language} test data of Common Voice.\n\n\n\nTest Result: 69.76 %"
] | [
-0.16092170774936676,
-0.0070624761283397675,
-0.0024830391630530357,
-0.044864755123853683,
0.05153346806764603,
-0.09359299391508102,
0.10853168368339539,
0.09462780505418777,
0.06810605525970459,
0.01093937549740076,
0.010181811638176441,
0.016100402921438217,
0.06837795674800873,
0.10123593360185623,
-0.05412781238555908,
-0.15779733657836914,
0.015163024887442589,
0.007862400263547897,
0.09644084423780441,
0.09497284889221191,
0.09384439885616302,
-0.05966728553175926,
0.02875632233917713,
0.10398513078689575,
-0.0511317104101181,
0.06158937141299248,
0.056684710085392,
-0.10868123173713684,
0.11542747169733047,
0.05767963081598282,
-0.015355504117906094,
0.0692066103219986,
0.06910718232393265,
-0.15305590629577637,
0.031893398612737656,
-0.0028111031278967857,
0.06437256932258606,
0.0005524384323507547,
0.04944485053420067,
0.047994162887334824,
0.12384246289730072,
0.10756508260965347,
-0.04615407437086105,
0.10096420347690582,
-0.02884049341082573,
-0.17849314212799072,
-0.015421721152961254,
0.014078076928853989,
0.09993598610162735,
0.10875297337770462,
-0.08824187517166138,
0.1608724296092987,
-0.09471878409385681,
0.11066064238548279,
0.10001341998577118,
-0.16386424005031586,
0.019977254793047905,
0.033735476434230804,
0.059392161667346954,
0.09157734364271164,
-0.03844573348760605,
0.01763894222676754,
0.0334569476544857,
0.0360727496445179,
-0.028799019753932953,
-0.06860291212797165,
-0.1682649701833725,
-0.029040925204753876,
-0.14093250036239624,
0.04846372455358505,
0.18734648823738098,
0.003511796472594142,
-0.04866805672645569,
-0.0993277058005333,
-0.06307590007781982,
0.0017060734098777175,
-0.03917858749628067,
-0.0636419802904129,
-0.021655837073922157,
0.03407486155629158,
0.07810501754283905,
0.0020927307195961475,
-0.09853944182395935,
-0.11495286226272583,
0.020403176546096802,
0.044472888112068176,
0.014447282999753952,
0.021789224818348885,
-0.15847443044185638,
-0.005064217373728752,
-0.13834865391254425,
-0.07364998757839203,
-0.03346693143248558,
0.03241918236017227,
-0.08062130212783813,
0.030064325779676437,
-0.05088251829147339,
-0.18415102362632751,
0.09070614725351334,
-0.09624139219522476,
-0.011776815168559551,
0.03876413404941559,
-0.042182162404060364,
0.023284457623958588,
0.028664449229836464,
0.11707428842782974,
-0.11811785399913788,
-0.015384736470878124,
0.053000062704086304,
0.032412659376859665,
0.023640528321266174,
-0.023712243884801865,
-0.02176452800631523,
-0.06705964356660843,
0.03078625351190567,
0.03138246387243271,
-0.052303653210401535,
-0.004910687915980816,
-0.023867079988121986,
-0.03348008543252945,
0.06925549358129501,
-0.11582329869270325,
-0.044331055134534836,
0.057697758078575134,
0.05033577233552933,
0.16488809883594513,
0.02813149243593216,
0.050147246569395065,
-0.0740995779633522,
-0.07198888063430786,
-0.004596138838678598,
0.055206600576639175,
0.013734682463109493,
-0.06769447773694992,
0.003438180312514305,
-0.02182621695101261,
-0.031520236283540726,
-0.07339702546596527,
-0.05394793301820755,
-0.05181897431612015,
-0.08264733850955963,
0.0031280401162803173,
-0.05492355301976204,
-0.06934478878974915,
-0.0033908975310623646,
-0.01444502267986536,
-0.08183349668979645,
0.010594426654279232,
-0.057006411254405975,
0.06242328882217407,
0.12547144293785095,
0.10994372516870499,
0.010785195045173168,
0.07844749093055725,
-0.05931302532553673,
-0.014779781922698021,
0.015883099287748337,
0.12491245567798615,
-0.0787225216627121,
-0.028992576524615288,
-0.09625095874071121,
-0.06519634276628494,
-0.05843101814389229,
0.045226987451314926,
0.03281340375542641,
0.07338310778141022,
-0.2503937780857086,
-0.09368710219860077,
0.16764837503433228,
-0.12399058043956757,
-0.05303128436207771,
0.21652506291866302,
0.04941600561141968,
0.06366246193647385,
0.12114443629980087,
0.22976988554000854,
0.06121694669127464,
-0.1873830407857895,
0.009795892983675003,
0.04362087696790695,
-0.030967291444540024,
-0.0674639642238617,
0.05099757760763168,
-0.04734194278717041,
0.014329315163195133,
0.02210003137588501,
-0.023618876934051514,
0.07115453481674194,
-0.018495872616767883,
-0.045750878751277924,
-0.030993597581982613,
-0.1118900328874588,
0.014889916405081749,
0.022015273571014404,
0.00898565910756588,
-0.005108634941279888,
-0.020852454006671906,
-0.009953279048204422,
0.13566064834594727,
-0.11772468686103821,
0.0480799674987793,
-0.16981852054595947,
0.11993458867073059,
-0.12072941660881042,
0.009377342648804188,
-0.12903229892253876,
0.2224748283624649,
-0.005519333761185408,
0.06116434559226036,
0.07406803965568542,
0.15198594331741333,
0.01981515809893608,
-0.032948799431324005,
-0.020221535116434097,
0.00023918425722513348,
0.10560664534568787,
0.0063036601059138775,
-0.035878680646419525,
-0.10004343837499619,
0.013292177580296993,
-0.044595714658498764,
0.10454174876213074,
-0.14234723150730133,
-0.028838083148002625,
0.11906059831380844,
0.017854811623692513,
0.0009002291481010616,
-0.00005235754360910505,
0.12163221836090088,
0.05622510239481926,
0.038270220160484314,
0.007647447753697634,
0.01216172520071268,
-0.018947018310427666,
-0.046255018562078476,
0.15106837451457977,
-0.13939951360225677,
0.020396210253238678,
0.09472604095935822,
-0.02164589799940586,
0.03197960555553436,
0.14993658661842346,
0.004401254002004862,
-0.0332062728703022,
-0.04795139282941818,
0.00739652244374156,
0.26355159282684326,
0.022787660360336304,
0.12956976890563965,
-0.08517513424158096,
0.016563251614570618,
0.03451139107346535,
-0.08993643522262573,
0.0634467676281929,
0.0618419274687767,
0.055974166840314865,
-0.038500428199768066,
-0.014095684513449669,
-0.05606997758150101,
-0.0968894436955452,
0.19384096562862396,
-0.023093031719326973,
-0.10577770322561264,
0.056729596108198166,
-0.03185146674513817,
-0.036235932260751724,
0.028369804844260216,
-0.20675279200077057,
-0.033826567232608795,
0.03156198188662529,
0.01141976285725832,
0.06866060197353363,
-0.11612419039011002,
0.047590889036655426,
-0.015890035778284073,
-0.10327067226171494,
-0.1693696677684784,
0.13842390477657318,
-0.04228592291474342,
0.02539926767349243,
-0.1252017617225647,
-0.10437848418951035,
-0.029695766046643257,
-0.0365237295627594,
-0.1773122400045395,
0.08167537301778793,
-0.025075189769268036,
-0.293239027261734,
-0.14159369468688965,
-0.05158515274524689,
-0.008898151107132435,
-0.006261833012104034,
0.08607380837202072,
-0.1411251574754715,
-0.05860143154859543,
-0.041185569018125534,
0.1092626228928566,
0.057418178766965866,
-0.01657174900174141,
-0.007867299020290375,
-0.02680930495262146,
0.09897689521312714,
-0.144121453166008,
-0.0008819604408927262,
-0.03840550035238266,
-0.011877904646098614,
0.028830762952566147,
-0.010870489291846752,
-0.009527178481221199,
0.19849449396133423,
0.025967005640268326,
0.03585192561149597,
0.018145840615034103,
0.22509470582008362,
-0.08211471140384674,
-0.010062573477625847,
0.22240819036960602,
0.018424347043037415,
-0.017712296918034554,
0.11829912662506104,
0.014570487663149834,
-0.0642741322517395,
-0.010712014511227608,
0.0010917617473751307,
-0.029002627357840538,
-0.20725899934768677,
-0.10303496569395065,
-0.06974530220031738,
-0.0730670690536499,
-0.059193674474954605,
0.011026500724256039,
0.043403513729572296,
-0.006002518814057112,
-0.01864335685968399,
-0.05675618723034859,
0.02117924392223358,
-0.033306289464235306,
0.1793765127658844,
-0.016912756487727165,
0.09519416838884354,
-0.07004917412996292,
-0.030795171856880188,
0.024343259632587433,
0.017728202044963837,
0.03501012548804283,
0.0929306149482727,
0.08615806698799133,
0.05806095525622368,
0.11958012729883194,
0.13584284484386444,
0.04754381254315376,
-0.09198495000600815,
-0.043934304267168045,
0.018885117024183273,
-0.06256557255983353,
-0.027108609676361084,
0.058237843215465546,
0.1691974550485611,
-0.04817390814423561,
0.019212104380130768,
0.03090953454375267,
-0.015053052455186844,
0.21781311929225922,
0.09860123693943024,
-0.17962560057640076,
-0.05040878802537918,
-0.023351795971393585,
-0.10399463772773743,
0.0075944168493151665,
0.057642094790935516,
0.09604905545711517,
-0.06830240041017532,
0.07172538340091705,
0.022554008290171623,
0.0775839313864708,
-0.0012170501286163926,
0.05410708114504814,
-0.16130228340625763,
0.02804696001112461,
0.01627342589199543,
0.06287727504968643,
-0.1872076392173767,
0.17705853283405304,
0.02445167303085327,
0.07955724000930786,
-0.0031274498905986547,
-0.007038593757897615,
0.012859293259680271,
0.12022373825311661,
0.06463295966386795,
0.00525280274450779,
0.1240154504776001,
-0.10826464742422104,
-0.08004128187894821,
0.0585661344230175,
-0.030505914241075516,
0.13538403809070587,
0.05415615811944008,
-0.005079229362308979,
-0.03396415337920189,
0.00863282848149538,
-0.08573345094919205,
-0.1394808143377304,
-0.00458122231066227,
0.062049947679042816,
0.2556249797344208,
0.09280189871788025,
-0.019033024087548256,
-0.09833057224750519,
-0.1579309105873108,
0.01024496927857399,
-0.11450904607772827,
-0.06539873778820038,
-0.031015146523714066,
-0.08479457348585129,
0.13713514804840088,
-0.06348390877246857,
-0.031208902597427368,
0.07090214639902115,
0.1187695637345314,
-0.01798793487250805,
-0.009676161222159863,
0.05831349641084671,
-0.0691467672586441,
-0.10315689444541931,
0.04031316563487053,
0.19479623436927795,
0.06018103286623955,
0.056443970650434494,
0.053395263850688934,
0.0007930409628897905,
-0.013075381517410278,
-0.02932398021221161,
0.026139819994568825,
0.11148091405630112,
-0.22221261262893677,
-0.01612330600619316,
0.07542457431554794,
-0.1678844541311264,
-0.12976232171058655,
-0.04740982502698898,
0.13714009523391724,
0.08133605867624283,
-0.019676448777318,
0.1907946914434433,
0.23414571583271027,
-0.050755519419908524,
-0.17000418901443481,
-0.08837059140205383,
0.06098880618810654,
0.11051327735185623,
-0.01166791282594204,
-0.0930066928267479,
0.12263945490121841,
-0.04005108401179314,
-0.05279439687728882,
-0.06296631693840027,
-0.17767220735549927,
-0.1401209831237793,
0.18543873727321625,
-0.059307269752025604,
0.1657881736755371,
-0.02211795747280121,
-0.045729830861091614,
-0.05875667184591293,
-0.024526311084628105,
-0.06286527216434479,
-0.08839835971593857,
0.08905486017465591,
-0.011383108794689178,
0.1205577552318573,
0.03517993539571762,
-0.0014218990691006184,
0.07492552697658539,
0.06280667334794998,
-0.04650215432047844,
0.02304462529718876,
0.06424187123775482,
-0.002251226920634508,
0.058432526886463165,
0.19949127733707428,
-0.07146594673395157,
0.013297121040523052,
-0.040059447288513184,
-0.10592286288738251,
-0.08522260189056396,
0.05917908996343613,
0.086824432015419,
0.008564982563257217,
0.05094999074935913,
-0.08296242356300354,
-0.020172132179141045,
0.01141340658068657,
-0.02509048394858837,
-0.17603233456611633,
0.01121020969003439,
0.15714746713638306,
0.2411845624446869,
-0.19019193947315216,
-0.07497220486402512,
-0.041563745588064194,
-0.02423926256597042,
0.09348613023757935,
-0.03861948847770691,
0.06548821181058884,
0.033059608191251755,
0.056860294193029404,
0.11049967259168625,
-0.007607110310345888,
-0.10257220268249512,
0.07908079773187637,
0.044631246477365494,
-0.006860099732875824,
-0.17666053771972656,
-0.0257788747549057,
0.009655035100877285,
-0.026161983609199524,
0.0830756425857544,
0.13375882804393768,
-0.09382883459329605,
-0.024007949978113174,
-0.030430495738983154,
0.047003235667943954,
-0.12919865548610687,
0.2119825929403305,
0.046315766870975494,
0.07316943258047104,
-0.14180223643779755,
-0.010342464782297611,
-0.008527152240276337,
-0.0013667333405464888,
0.021737579256296158,
0.029123472049832344,
-0.06620416790246964,
-0.0790090411901474,
-0.07573670893907547,
0.03278346732258797,
0.06296291947364807,
-0.13161662220954895,
-0.07155207544565201,
-0.08535376936197281,
0.003010151442140341,
0.14806817471981049,
0.0384589247405529,
0.006510988809168339,
-0.13629914820194244,
-0.10714031755924225,
-0.07614827901124954,
0.03316139057278633,
0.06687068194150925,
-0.012945190072059631,
-0.15898598730564117,
0.08748362213373184,
0.061662085354328156,
0.05909540504217148,
-0.057901401072740555,
-0.07772541046142578,
0.04089992865920067,
0.0747477188706398,
-0.10481871664524078,
0.016179144382476807,
-0.06529717892408371,
0.018718672916293144,
0.011898110620677471,
-0.08966423571109772,
-0.02078673616051674,
0.0873277336359024,
-0.08964309841394424,
0.08177874237298965,
-0.003054175293073058,
0.08779246360063553,
-0.08409596979618073,
0.046637460589408875,
0.04044042527675629,
-0.023614855483174324,
0.08669499307870865,
0.08764864504337311,
-0.17780740559101105,
0.13234369456768036,
-0.1758711338043213,
-0.1051352322101593,
0.09494367986917496,
0.09718155860900879,
0.010652180761098862,
-0.10630632936954498,
0.01747545599937439,
0.11692783981561661,
0.08910025656223297,
-0.011527076363563538,
0.08405545353889465,
-0.03692300245165825,
-0.010477268137037754,
-0.15150387585163116,
-0.005849247332662344,
-0.029136713594198227,
0.009801935404539108,
0.05987264961004257,
0.16921666264533997,
0.1555970162153244,
-0.10830767452716827,
0.04298095405101776,
-0.09555023908615112,
0.03220370039343834,
-0.06935493648052216,
-0.032468054443597794,
-0.13832540810108185,
-0.06841891258955002,
0.09258121997117996,
-0.04259386286139488,
0.12060177326202393,
-0.003279270837083459,
0.06333884596824646,
-0.02143274061381817,
-0.10053499788045883,
0.0035708758514374495,
-0.03429460898041725,
0.24566537141799927,
0.07286001741886139,
0.04809403046965599,
0.01213564071804285,
-0.04286821559071541,
0.031683918088674545,
0.16386763751506805,
-0.04418403282761574,
0.09685219824314117,
-0.02358568087220192,
0.08756046742200851,
0.10280650854110718,
-0.06477154046297073,
-0.0031337460968643427,
-0.0028484759386628866,
-0.19876417517662048,
0.01839613728225231,
-0.057995181530714035,
0.19305793941020966,
0.1723499894142151,
-0.020061327144503593,
0.052083659917116165,
-0.02332085743546486,
-0.08224987238645554,
-0.1642579734325409,
-0.07639653980731964,
-0.10397762805223465,
-0.18130002915859222,
0.037700917571783066,
-0.07515926659107208,
0.029748184606432915,
0.017357952892780304,
0.03966940566897392,
-0.025430820882320404,
0.15942107141017914,
0.01566510647535324,
-0.08989563584327698,
0.07046136260032654,
-0.09372097253799438,
-0.036170948296785355,
-0.06008322164416313,
0.07029479742050171,
0.16030052304267883,
0.0016579640796408057,
0.04808269813656807,
0.005933762528002262,
-0.11076424270868301,
0.023065531626343727,
-0.08272257447242737,
-0.07952509820461273,
-0.008837130852043629,
-0.030002126470208168,
0.04109882935881615,
0.1046273410320282,
0.12170626223087311,
-0.0674714744091034,
-0.004119720309972763,
0.04303384944796562,
-0.04712344706058502,
-0.16526693105697632,
-0.11187953501939774,
0.18666549026966095,
0.030512336641550064,
0.04031597822904587,
-0.016722558066248894,
-0.04914822056889534,
-0.013086208142340183,
0.23084703087806702,
0.12773515284061432,
0.028349105268716812,
0.040488000959157944,
-0.07744566351175308,
0.008765305392444134,
-0.0910729467868805,
0.016766462475061417,
0.09228077530860901,
0.21846993267536163,
0.0006969556561671197,
0.006186569109559059,
-0.1081228107213974,
-0.08645304292440414,
-0.033664654940366745,
-0.024629317224025726,
-0.03382275998592377,
-0.06818520277738571,
0.011890332214534283,
0.14956960082054138,
-0.05356266349554062,
-0.08497397601604462,
-0.20578394830226898,
-0.0478052943944931,
-0.0645892396569252,
-0.025455357506871223,
-0.03125333413481712,
0.09011869877576828,
-0.011110291816294193,
-0.0824965387582779,
0.027948247268795967,
0.14088666439056396,
0.004881420638412237,
-0.09341094642877579,
0.0006063121254555881,
0.07527463138103485,
-0.16271895170211792,
-0.09810387343168259,
0.0003545544750522822,
0.17341743409633636,
0.005454647820442915,
0.09850165992975235,
0.007842130027711391,
0.22016629576683044,
-0.0389627069234848,
-0.0839507058262825,
0.015589422546327114,
0.183601513504982,
0.012866831384599209,
0.12187206745147705,
0.0018071483355015516,
-0.10098198056221008,
0.018283136188983917,
-0.1582552194595337,
0.013751910999417305,
-0.08893182128667831,
0.05823872238397598,
-0.027256784960627556,
0.06086717173457146,
0.029790954664349556,
-0.0713096410036087,
-0.052360497415065765,
-0.04758310317993164,
0.0603233203291893,
-0.0047157201915979385,
-0.0998964011669159,
-0.07602924108505249,
-0.2470584660768509,
0.0014719514874741435,
-0.12364650517702103,
-0.00602564075961709,
-0.16744551062583923,
-0.015303097665309906,
-0.03259044513106346,
-0.04704038053750992,
0.012819822877645493,
0.04522193968296051,
0.0890176072716713,
0.017333008348941803,
0.016511162742972374,
-0.022459493950009346,
0.06282424181699753,
0.1183905303478241,
-0.20078933238983154,
-0.11536610126495361
] |
null | null | transformers | # Model Card for roberta-base-on-cuad
# Model Details
## Model Description
- **Developed by:** Mohammed Rakib
- **Shared by [Optional]:** More information needed
- **Model type:** Question Answering
- **Language(s) (NLP):** en
- **License:** MIT
- **Related Models:**
- **Parent Model:** RoBERTa
- **Resources for more information:**
- GitHub Repo: [defactolaw](https://github.com/afra-tech/defactolaw)
- Associated Paper: [An Open Source Contractual Language Understanding Application Using Machine Learning](https://aclanthology.org/2022.lateraisse-1.6/)
# Uses
## Direct Use
This model can be used for the task of Question Answering on Legal Documents.
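For example, the checkpoint can be wrapped in the standard `question-answering` pipeline to extract a clause-level answer span from contract text. A minimal sketch; the context and question below are illustrative placeholders, not taken from the CUAD dataset:
```python
from transformers import pipeline

qa = pipeline("question-answering", model="Rakib/roberta-base-on-cuad")

# Illustrative contract snippet; real CUAD contexts are full contract documents.
context = (
    "This Agreement shall commence on January 1, 2021 and shall remain in effect "
    "for a period of three (3) years, unless terminated earlier in accordance with Section 9."
)
question = "What is the term of the agreement?"

result = qa(question=question, context=context)
print(result["answer"], result["score"])
```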
# Training Details
Read: [An Open Source Contractual Language Understanding Application Using Machine Learning](https://aclanthology.org/2022.lateraisse-1.6/)
for detailed information on training procedure, dataset preprocessing and evaluation.
## Training Data
See [CUAD dataset card](https://huggingface.co/datasets/cuad) for more information.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
See [CUAD dataset card](https://huggingface.co/datasets/cuad) for more information.
### Factors
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
Used V100/P100 from Google Colab Pro
### Software
Python, Transformers
# Citation
**BibTeX:**
```
@inproceedings{nawar-etal-2022-open,
title = "An Open Source Contractual Language Understanding Application Using Machine Learning",
author = "Nawar, Afra and
Rakib, Mohammed and
Hai, Salma Abdul and
Haq, Sanaulla",
booktitle = "Proceedings of the First Workshop on Language Technology and Resources for a Fair, Inclusive, and Safe Society within the 13th Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lateraisse-1.6",
pages = "42--50",
abstract = "Legal field is characterized by its exclusivity and non-transparency. Despite the frequency and relevance of legal dealings, legal documents like contracts remains elusive to non-legal professionals for the copious usage of legal jargon. There has been little advancement in making legal contracts more comprehensible. This paper presents how Machine Learning and NLP can be applied to solve this problem, further considering the challenges of applying ML to the high length of contract documents and training in a low resource environment. The largest open-source contract dataset so far, the Contract Understanding Atticus Dataset (CUAD) is utilized. Various pre-processing experiments and hyperparameter tuning have been carried out and we successfully managed to eclipse SOTA results presented for models in the CUAD dataset trained on RoBERTa-base. Our model, A-type-RoBERTa-base achieved an AUPR score of 46.6{\%} compared to 42.6{\%} on the original RoBERT-base. This model is utilized in our end to end contract understanding application which is able to take a contract and highlight the clauses a user is looking to find along with it{'}s descriptions to aid due diligence before signing. Alongside digital, i.e. searchable, contracts the system is capable of processing scanned, i.e. non-searchable, contracts using tesseract OCR. This application is aimed to not only make contract review a comprehensible process to non-legal professionals, but also to help lawyers and attorneys more efficiently review contracts.",
}
```
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Mohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
tokenizer = AutoTokenizer.from_pretrained("Rakib/roberta-base-on-cuad")
model = AutoModelForQuestionAnswering.from_pretrained("Rakib/roberta-base-on-cuad")
```
</details> | {"language": ["en"], "license": "mit", "library_name": "transformers", "tags": ["legal-contract-review", "roberta", "cuad"], "datasets": ["cuad"], "pipeline_tag": "question-answering"} | question-answering | Rakib/roberta-base-on-cuad | [
"transformers",
"pytorch",
"roberta",
"question-answering",
"legal-contract-review",
"cuad",
"en",
"dataset:cuad",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #roberta #question-answering #legal-contract-review #cuad #en #dataset-cuad #license-mit #endpoints_compatible #has_space #region-us
| # Model Card for roberta-base-on-cuad
# Model Details
## Model Description
- Developed by: Mohammed Rakib
- Shared by [Optional]: More information needed
- Model type: Question Answering
- Language(s) (NLP): en
- License: MIT
- Related Models:
- Parent Model: RoBERTa
- Resources for more information:
- GitHub Repo: defactolaw
- Associated Paper: An Open Source Contractual Language Understanding Application Using Machine Learning
# Uses
## Direct Use
This model can be used for the task of Question Answering on Legal Documents.
# Training Details
Read: An Open Source Contractual Language Understanding Application Using Machine Learning
for detailed information on training procedure, dataset preprocessing and evaluation.
## Training Data
See CUAD dataset card for more information.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
See CUAD dataset card for more information.
### Factors
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
- Hardware Type: More information needed
- Hours used: More information needed
- Cloud Provider: More information needed
- Compute Region: More information needed
- Carbon Emitted: More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
Used V100/P100 from Google Colab Pro
### Software
Python, Transformers
BibTeX:
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Mohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
</details> | [
"# Model Card for roberta-base-on-cuad",
"# Model Details",
"## Model Description\n \n- Developed by: Mohammed Rakib\n- Shared by [Optional]: More information needed\n- Model type: Question Answering \n- Language(s) (NLP): en\n- License: MIT\n- Related Models:\n - Parent Model: RoBERTa \n- Resources for more information: \n - GitHub Repo: defactolaw\n - Associated Paper: An Open Source Contractual Language Understanding Application Using Machine Learning",
"# Uses",
"## Direct Use\n \nThis model can be used for the task of Question Answering on Legal Documents.",
"# Training Details\n\nRead: An Open Source Contractual Language Understanding Application Using Machine Learning \nfor detailed information on training procedure, dataset preprocessing and evaluation.",
"## Training Data\n \nSee CUAD dataset card for more information.",
"## Training Procedure",
"### Preprocessing\n \nMore information needed",
"### Speeds, Sizes, Times\n \nMore information needed",
"# Evaluation",
"## Testing Data, Factors & Metrics",
"### Testing Data\n \nSee CUAD dataset card for more information.",
"### Factors",
"### Metrics\n \nMore information needed",
"## Results \n \nMore information needed",
"# Model Examination\n \nMore information needed\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n \nMore information needed",
"## Compute Infrastructure\n \nMore information needed",
"### Hardware\n \nUsed V100/P100 from Google Colab Pro",
"### Software\n\nPython, Transformers\n \nBibTeX:",
"# Glossary [optional]\nMore information needed",
"# More Information [optional]\n \nMore information needed",
"# Model Card Authors [optional]\n \nMohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team",
"# Model Card Contact\n \nMore information needed",
"# How to Get Started with the Model\n \nUse the code below to get started with the model.\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] | [
"TAGS\n#transformers #pytorch #roberta #question-answering #legal-contract-review #cuad #en #dataset-cuad #license-mit #endpoints_compatible #has_space #region-us \n",
"# Model Card for roberta-base-on-cuad",
"# Model Details",
"## Model Description\n \n- Developed by: Mohammed Rakib\n- Shared by [Optional]: More information needed\n- Model type: Question Answering \n- Language(s) (NLP): en\n- License: MIT\n- Related Models:\n - Parent Model: RoBERTa \n- Resources for more information: \n - GitHub Repo: defactolaw\n - Associated Paper: An Open Source Contractual Language Understanding Application Using Machine Learning",
"# Uses",
"## Direct Use\n \nThis model can be used for the task of Question Answering on Legal Documents.",
"# Training Details\n\nRead: An Open Source Contractual Language Understanding Application Using Machine Learning \nfor detailed information on training procedure, dataset preprocessing and evaluation.",
"## Training Data\n \nSee CUAD dataset card for more information.",
"## Training Procedure",
"### Preprocessing\n \nMore information needed",
"### Speeds, Sizes, Times\n \nMore information needed",
"# Evaluation",
"## Testing Data, Factors & Metrics",
"### Testing Data\n \nSee CUAD dataset card for more information.",
"### Factors",
"### Metrics\n \nMore information needed",
"## Results \n \nMore information needed",
"# Model Examination\n \nMore information needed\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n \nMore information needed",
"## Compute Infrastructure\n \nMore information needed",
"### Hardware\n \nUsed V100/P100 from Google Colab Pro",
"### Software\n\nPython, Transformers\n \nBibTeX:",
"# Glossary [optional]\nMore information needed",
"# More Information [optional]\n \nMore information needed",
"# Model Card Authors [optional]\n \nMohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team",
"# Model Card Contact\n \nMore information needed",
"# How to Get Started with the Model\n \nUse the code below to get started with the model.\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] | [
56,
13,
3,
91,
3,
20,
33,
13,
4,
8,
12,
3,
11,
15,
4,
8,
5,
48,
9,
10,
8,
15,
13,
11,
10,
26,
7,
41
] | [
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #legal-contract-review #cuad #en #dataset-cuad #license-mit #endpoints_compatible #has_space #region-us \n# Model Card for roberta-base-on-cuad# Model Details## Model Description\n \n- Developed by: Mohammed Rakib\n- Shared by [Optional]: More information needed\n- Model type: Question Answering \n- Language(s) (NLP): en\n- License: MIT\n- Related Models:\n - Parent Model: RoBERTa \n- Resources for more information: \n - GitHub Repo: defactolaw\n - Associated Paper: An Open Source Contractual Language Understanding Application Using Machine Learning# Uses## Direct Use\n \nThis model can be used for the task of Question Answering on Legal Documents.# Training Details\n\nRead: An Open Source Contractual Language Understanding Application Using Machine Learning \nfor detailed information on training procedure, dataset preprocessing and evaluation.## Training Data\n \nSee CUAD dataset card for more information.## Training Procedure### Preprocessing\n \nMore information needed### Speeds, Sizes, Times\n \nMore information needed# Evaluation## Testing Data, Factors & Metrics### Testing Data\n \nSee CUAD dataset card for more information.### Factors### Metrics\n \nMore information needed## Results \n \nMore information needed# Model Examination\n \nMore information needed\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed# Technical Specifications [optional]## Model Architecture and Objective\n \nMore information needed## Compute Infrastructure\n \nMore information needed### Hardware\n \nUsed V100/P100 from Google Colab Pro### Software\n\nPython, Transformers\n \nBibTeX:# Glossary [optional]\nMore information needed# More Information [optional]\n \nMore information needed# Model Card Authors [optional]\n \nMohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team# Model Card Contact\n \nMore information needed# How to Get Started with the Model\n \nUse the code below to get started with the model.\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] | [
-0.06750648468732834,
0.26255372166633606,
-0.004611118696630001,
0.014861905947327614,
0.11249837279319763,
0.04256739839911461,
0.08040853589773178,
0.14517296850681305,
-0.04841625690460205,
0.10670404881238937,
0.034397322684526443,
0.04009288176894188,
0.14142045378684998,
0.1584603488445282,
0.08685164153575897,
-0.1912824809551239,
0.020719291642308235,
-0.09905096888542175,
0.023996565490961075,
0.12085802108049393,
0.14118920266628265,
-0.09084343910217285,
0.0738832950592041,
-0.01915116235613823,
-0.090644970536232,
-0.008316081948578358,
-0.10663824528455734,
-0.03751328960061073,
0.019677413627505302,
0.01770004816353321,
0.04126332327723503,
0.0029687106143683195,
0.06624561548233032,
-0.31961071491241455,
0.026607433333992958,
0.072642020881176,
-0.010674364864826202,
0.045180413872003555,
0.07570786029100418,
-0.07084843516349792,
0.10947375744581223,
-0.14521925151348114,
0.13518308103084564,
0.05271183326840401,
-0.08603022247552872,
-0.1603345423936844,
-0.12414401024580002,
0.15388143062591553,
0.053973790258169174,
0.08018611371517181,
-0.042041074484586716,
0.08424018323421478,
-0.05650710687041283,
0.007858495227992535,
0.12499267607927322,
-0.16234324872493744,
-0.058036841452121735,
0.039386771619319916,
0.12033596634864807,
0.05181162431836128,
-0.15028472244739532,
-0.020320460200309753,
-0.012962773442268372,
0.01967501826584339,
0.005572900641709566,
0.011401619762182236,
0.046590566635131836,
0.029702845960855484,
-0.095218725502491,
-0.06668256968259811,
0.07148751616477966,
-0.008493741974234581,
-0.03947180509567261,
-0.25152426958084106,
0.011344663798809052,
-0.0730888694524765,
-0.013479003682732582,
-0.014204450882971287,
0.03392026573419571,
0.0013974012108519673,
0.018377991393208504,
-0.08678409457206726,
-0.0780739039182663,
-0.0031035244464874268,
0.028917018324136734,
0.07164610922336578,
0.042512815445661545,
-0.03917922452092171,
0.04436250030994415,
0.09246109426021576,
0.088465616106987,
-0.11024848371744156,
-0.08176858723163605,
-0.04594627022743225,
-0.10311529785394669,
-0.02531805820763111,
0.010647965595126152,
-0.004353986121714115,
0.035262707620859146,
0.24601678550243378,
-0.055930692702531815,
0.07279558479785919,
0.0021948309149593115,
-0.005472906399518251,
0.07009401172399521,
0.10801193863153458,
-0.06215488910675049,
-0.12925024330615997,
-0.07857043296098709,
0.046876806765794754,
-0.02823220007121563,
-0.031187932938337326,
-0.01648983173072338,
0.03439102694392204,
0.0713072344660759,
0.10889844596385956,
0.0758618414402008,
0.009505724534392357,
-0.049217820167541504,
-0.02815571054816246,
0.13839960098266602,
-0.1404339224100113,
0.0648093894124031,
0.012641661800444126,
-0.007352476939558983,
-0.06699705123901367,
0.021700993180274963,
0.010398427955806255,
-0.05624832957983017,
0.0852891057729721,
-0.06222730875015259,
-0.04845207557082176,
-0.08176019042730331,
0.0019161539385095239,
0.06728029996156693,
-0.05048881843686104,
-0.03106207214295864,
-0.034709613770246506,
-0.0788542777299881,
-0.08598236739635468,
0.06301374733448029,
-0.0631878525018692,
-0.019974634051322937,
-0.029157470911741257,
-0.037220828235149384,
0.014874002896249294,
0.03184971958398819,
0.13768760859966278,
-0.011952987872064114,
0.07127978652715683,
-0.044598110020160675,
0.0398450568318367,
0.1256658434867859,
0.04400099813938141,
-0.07104594260454178,
0.04438207671046257,
-0.12719395756721497,
0.09523361176252365,
-0.10434965044260025,
0.007170566823333502,
-0.164989173412323,
0.004086872562766075,
-0.00416148453950882,
0.008312148042023182,
0.0381753072142601,
0.10750684887170792,
-0.10648747533559799,
-0.023949522525072098,
0.17972393333911896,
-0.06715543568134308,
-0.07618504017591476,
0.05423443764448166,
-0.04020572826266289,
0.13755106925964355,
0.016886092722415924,
0.03641404211521149,
0.10742367804050446,
-0.1549537032842636,
-0.060621947050094604,
-0.03715353086590767,
0.0057833194732666016,
0.11164422333240509,
0.07561299204826355,
-0.0828946977853775,
0.08523508161306381,
0.021069684997200966,
-0.05343760922551155,
-0.048220470547676086,
-0.03556472808122635,
-0.09076899290084839,
0.023670483380556107,
-0.08556940406560898,
0.01722879335284233,
0.024729136377573013,
-0.07918809354305267,
-0.017763473093509674,
-0.18741145730018616,
-0.01837269216775894,
0.09638124704360962,
0.0014841568190604448,
0.00637954194098711,
-0.10354869812726974,
0.031180711463093758,
-0.01830335147678852,
-0.005106870550662279,
-0.19156230986118317,
-0.060999538749456406,
0.03287423774600029,
-0.13513430953025818,
0.07585548609495163,
-0.14423272013664246,
0.05191109701991081,
0.018766727298498154,
-0.050890471786260605,
-0.017345132306218147,
-0.004705590661615133,
0.00527104502543807,
-0.07122135162353516,
-0.18639680743217468,
-0.017333125695586205,
-0.04096028953790665,
0.23523490130901337,
-0.07905701547861099,
0.023012075573205948,
0.06871069967746735,
0.15688516199588776,
0.014945462346076965,
-0.0807243213057518,
0.030387187376618385,
-0.020171182230114937,
0.0297352634370327,
-0.06661969423294067,
0.021641071885824203,
0.007972190156579018,
0.021432604640722275,
-0.0007379421149380505,
-0.13923920691013336,
-0.09988412261009216,
0.07837597280740738,
0.14778558909893036,
-0.1357480138540268,
-0.04861409589648247,
-0.03498435392975807,
-0.04357428103685379,
-0.08530399948358536,
-0.04662037640810013,
0.18151205778121948,
0.05179167538881302,
0.031020596623420715,
-0.055924735963344574,
-0.09498117864131927,
0.007791886571794748,
0.028241315856575966,
-0.059752609580755234,
0.1104419082403183,
0.08417055010795593,
-0.1400170773267746,
0.1326044350862503,
0.06396330893039703,
0.10573261976242065,
0.1258176565170288,
-0.009217850863933563,
-0.09481685608625412,
-0.04491305351257324,
0.023991817608475685,
0.007834825664758682,
0.12419037520885468,
-0.09970784932374954,
0.004168929066509008,
0.040847644209861755,
-0.05241546779870987,
0.05228216201066971,
-0.06660481542348862,
0.013077382929623127,
0.028502149507403374,
0.007776007987558842,
0.008846374228596687,
-0.016377002000808716,
0.011450575664639473,
0.06718389689922333,
0.0645156279206276,
0.08289501816034317,
-0.005393193569034338,
-0.021361229941248894,
-0.09943676739931107,
0.13239681720733643,
-0.13619303703308105,
-0.2285005897283554,
-0.12122689932584763,
0.028110338374972343,
0.04009808227419853,
-0.06277848035097122,
0.014835469424724579,
-0.06814666092395782,
-0.07201427966356277,
-0.06991599500179291,
0.008084464818239212,
0.04561243951320648,
-0.10998707264661789,
-0.022932538762688637,
0.014552269130945206,
-0.004642935935407877,
-0.14373274147510529,
0.032412007451057434,
0.05399932712316513,
-0.05765179172158241,
-0.027478555217385292,
0.043408315628767014,
0.12712538242340088,
0.11249099671840668,
0.008810989558696747,
-0.0129253463819623,
0.017936762422323227,
0.20116429030895233,
-0.17758293449878693,
0.08566708117723465,
0.08147527277469635,
-0.036415521055459976,
0.07144976407289505,
0.13881756365299225,
-0.0010041121859103441,
-0.058697834610939026,
0.023214947432279587,
0.05830385163426399,
-0.021020838990807533,
-0.25429683923721313,
-0.03725670650601387,
-0.03856028988957405,
-0.05166426673531532,
0.08997636288404465,
0.08211652934551239,
0.04302021488547325,
0.0212416909635067,
-0.08975850045681,
-0.05207148566842079,
0.047128476202487946,
0.09750013053417206,
-0.01770945079624653,
0.003946676384657621,
0.03617658466100693,
-0.050192564725875854,
0.019254541024565697,
0.08104753494262695,
0.06393109261989594,
0.14347852766513824,
0.037111274898052216,
0.19690029323101044,
0.06293769925832748,
0.10098373889923096,
-0.02612757310271263,
0.0035407079849392176,
-0.034459810703992844,
0.07682015001773834,
0.005931725725531578,
-0.06321749091148376,
-0.035525523126125336,
0.0701817125082016,
0.07644753158092499,
-0.06578372418880463,
-0.00977245531976223,
-0.04137352108955383,
0.06356776505708694,
0.19201219081878662,
-0.014972662553191185,
-0.1361365020275116,
-0.06181491166353226,
0.05623272806406021,
-0.08303677290678024,
-0.11812879145145416,
-0.04384535923600197,
0.02028477005660534,
-0.19573697447776794,
0.03602933883666992,
-0.026148607954382896,
0.08285308629274368,
-0.07582453638315201,
-0.05148288980126381,
0.06931352615356445,
0.06384497135877609,
-0.016570020467042923,
0.06877300888299942,
-0.18228395283222198,
0.04952678829431534,
0.01279188971966505,
0.0827295184135437,
-0.1152680441737175,
0.0671815574169159,
0.00014931951591279358,
-0.057448141276836395,
0.17911888659000397,
-0.0036959382705390453,
-0.04374213144183159,
-0.05535631999373436,
-0.08303280174732208,
-0.0018236908363178372,
0.13007502257823944,
-0.14719334244728088,
0.07231111824512482,
-0.025145558640360832,
-0.038839828222990036,
-0.04115940257906914,
-0.05493210256099701,
-0.11702284216880798,
-0.2246076464653015,
0.06757057458162308,
-0.16019517183303833,
0.049528568983078,
-0.08809559047222137,
-0.04107474163174629,
-0.03397291526198387,
0.1770409494638443,
-0.16284064948558807,
-0.10542827099561691,
-0.12822988629341125,
-0.032132137566804886,
0.12362822145223618,
-0.060728318989276886,
0.04650505632162094,
-0.024033982306718826,
0.1717664897441864,
-0.04238228499889374,
-0.036750201135873795,
0.029421944171190262,
-0.04340845346450806,
-0.19823910295963287,
-0.06943409144878387,
0.16711314022541046,
0.10036046802997589,
0.06222594529390335,
0.0120086669921875,
0.03511758893728256,
-0.016543714329600334,
-0.07643911242485046,
0.016229132190346718,
0.13414913415908813,
0.04335079342126846,
0.010909244418144226,
-0.01726384647190571,
-0.08962923288345337,
-0.09510237723588943,
-0.03383462131023407,
0.10220617800951004,
0.07079572230577469,
-0.053494639694690704,
0.1889437884092331,
0.18895941972732544,
-0.091033935546875,
-0.23331020772457123,
-0.0017317875754088163,
0.03685859590768814,
-0.012078334577381611,
-0.010987653397023678,
-0.2245638072490692,
0.05464215576648712,
0.028402596712112427,
-0.025827035307884216,
0.060715969651937485,
-0.24790622293949127,
-0.12208685278892517,
0.03670185059309006,
0.017560945823788643,
-0.2524249255657196,
-0.17161931097507477,
-0.06996176391839981,
-0.006930655799806118,
-0.1894814670085907,
0.1226518526673317,
-0.012077207677066326,
0.03330032154917717,
0.024212313815951347,
0.03906846046447754,
0.02739882841706276,
-0.05702980235219002,
0.13079674541950226,
0.004550127778202295,
0.005586962215602398,
-0.09894584864377975,
-0.06777622550725937,
0.05858061462640762,
-0.025618188083171844,
0.07364045828580856,
-0.01379101350903511,
0.034207165241241455,
-0.13314417004585266,
-0.0425570122897625,
-0.07483933120965958,
0.006927595008164644,
-0.12316912412643433,
-0.07445710897445679,
-0.055695731192827225,
0.11631013453006744,
0.07840444892644882,
-0.04546814039349556,
0.006999824196100235,
-0.04335259273648262,
0.07546132802963257,
0.1533210128545761,
0.16149713099002838,
0.07015405595302582,
-0.09918588399887085,
-0.016956321895122528,
-0.04497187212109566,
0.0414159931242466,
-0.18566298484802246,
0.015409822575747967,
0.08090998232364655,
0.0482172816991806,
0.09494949132204056,
-0.028494244441390038,
-0.198444202542305,
-0.03361998125910759,
0.070451520383358,
-0.05094390735030174,
-0.24110594391822815,
0.027149854227900505,
0.08094531297683716,
-0.1662813127040863,
-0.07645032554864883,
0.03527091071009636,
0.00010749750799732283,
-0.0786144807934761,
0.026490695774555206,
0.09916925430297852,
0.006869422737509012,
0.09125639498233795,
0.06451340764760971,
0.05376308411359787,
-0.10104643553495407,
0.12433671206235886,
0.13099601864814758,
-0.09526029974222183,
0.024413203820586205,
0.14434178173542023,
-0.06318968534469604,
-0.06480579078197479,
0.04732315614819527,
0.11447049677371979,
0.023135915398597717,
-0.02517574280500412,
0.0616605319082737,
-0.08561259508132935,
0.05828513950109482,
0.05932198092341423,
-0.006381850223988295,
-0.016429798677563667,
0.0681808590888977,
0.05010238662362099,
-0.09764841198921204,
0.12190333008766174,
0.03056301921606064,
0.007256186567246914,
0.015264485962688923,
0.00414888933300972,
-0.007265819702297449,
0.009867219254374504,
0.012925923801958561,
-0.008404040709137917,
-0.05690332129597664,
-0.011584242805838585,
-0.09227890521287918,
-0.02793140895664692,
-0.09207988530397415,
0.011124718002974987,
0.019660962745547295,
-0.038360729813575745,
0.023422349244356155,
0.004214611370116472,
-0.047892820090055466,
-0.08938444405794144,
-0.02537650056183338,
0.09061478823423386,
-0.18837615847587585,
0.027044326066970825,
0.0934862494468689,
-0.096204973757267,
0.10904526710510254,
0.009283955208957195,
0.024651959538459778,
0.010161484591662884,
-0.11313352733850479,
-0.00709895696491003,
-0.09024722874164581,
0.018637368455529213,
0.019823875278234482,
-0.1684131920337677,
0.016237320378422737,
-0.053078558295965195,
-0.0716325044631958,
0.029181798920035362,
0.044283900409936905,
-0.11017803847789764,
0.11189404129981995,
-0.004156877286732197,
-0.022695044055581093,
-0.028845807537436485,
0.06404293328523636,
0.08763003349304199,
-0.0035399599000811577,
0.1015714481472969,
-0.016743198037147522,
0.05657010152935982,
-0.13806040585041046,
-0.0016044637886807323,
-0.0006844071904197335,
0.013509787619113922,
0.015988072380423546,
-0.03584738075733185,
0.07140832394361496,
0.007476920261979103,
0.11394517868757248,
-0.02059631422162056,
0.06096559762954712,
0.0707181841135025,
0.07395192235708237,
-0.09054545313119888,
0.06634906679391861,
-0.02446630224585533,
0.025584854185581207,
0.00016532941663172096,
0.004275074694305658,
-0.05307230353355408,
-0.04144200682640076,
-0.09920456260442734,
0.08319250494241714,
0.2035638839006424,
0.16954554617404938,
3.4676921245591075e-7,
0.07201734930276871,
-0.11101839691400528,
-0.06959113478660583,
0.1533965766429901,
-0.07336676865816116,
-0.017582135275006294,
-0.10622028261423111,
0.09154479950666428,
0.11743582785129547,
-0.18608129024505615,
0.09061693400144577,
-0.07667674869298935,
-0.04444769769906998,
-0.024984072893857956,
-0.12469779700040817,
-0.05032423138618469,
0.031698450446128845,
0.0035674916580319405,
-0.06014763191342354,
0.05645902454853058,
0.05084345489740372,
0.004570753313601017,
-0.008356358855962753,
0.11241474747657776,
-0.04542122408747673,
-0.05970210209488869,
0.07907223701477051,
0.05794808268547058,
0.045935120433568954,
-0.07627734541893005,
0.03735470771789551,
0.00425819493830204,
0.09849382936954498,
0.11962008476257324,
0.07240091264247894,
-0.02451520785689354,
0.019287260249257088,
-0.04335543140769005,
-0.08967221528291702,
0.022495824843645096,
-0.028295032680034637,
-0.05811961367726326,
0.15237200260162354,
0.02878132089972496,
0.0708942711353302,
-0.021211231127381325,
0.21646080911159515,
-0.020440751686692238,
-0.09912510961294174,
-0.1740778535604477,
0.00423621479421854,
-0.013670704327523708,
0.009555199183523655,
0.07350166141986847,
-0.14075127243995667,
0.0008836125489324331,
0.18293024599552155,
0.14298851788043976,
-0.039788734167814255,
0.025389492511749268,
0.08049644529819489,
0.023838771507143974,
-0.01608777604997158,
0.0368114672601223,
0.054257337003946304,
0.17353272438049316,
-0.091094471514225,
0.11174508929252625,
0.010642419569194317,
-0.0693858340382576,
-0.01503155566751957,
0.11235824972391129,
-0.04577592760324478,
-0.005124356132000685,
-0.052518803626298904,
0.11446782201528549,
-0.10989393293857574,
-0.30883529782295227,
0.06726861745119095,
-0.1009959727525711,
-0.16941051185131073,
-0.021335845813155174,
0.10132689774036407,
0.009059729054570198,
0.01883092150092125,
0.09620728343725204,
-0.019656557589769363,
0.19505847990512848,
0.047866374254226685,
-0.035159558057785034,
-0.056439973413944244,
0.08188604563474655,
-0.06575057655572891,
0.27455541491508484,
0.018798604607582092,
0.025813622400164604,
0.10222309827804565,
-0.01099605392664671,
-0.15219639241695404,
-0.03688454627990723,
0.1298658549785614,
-0.03917771950364113,
0.039653174579143524,
0.17617668211460114,
-0.014146117493510246,
0.07219669222831726,
0.09463489055633545,
-0.03176111727952957,
0.061334650963544846,
-0.037586409598588943,
0.022381575778126717,
-0.10031313449144363,
0.12644769251346588,
-0.07592357695102692,
0.1441694051027298,
0.1397184431552887,
-0.04760463535785675,
-0.008685681037604809,
-0.032097965478897095,
0.022203505039215088,
0.0007669544429518282,
0.15508253872394562,
-0.01370309293270111,
-0.14044642448425293,
0.027839420363307,
-0.07496453076601028,
0.11403212696313858,
-0.18569885194301605,
-0.05457359924912453,
0.08440602570772171,
0.0018610958941280842,
-0.06686524301767349,
0.14308269321918488,
0.06927979737520218,
-0.008746075443923473,
-0.035432327538728714,
-0.0782163068652153,
-0.00472017889842391,
0.12323068082332611,
-0.1061842069029808,
0.03312453255057335
] |
null | null | transformers |
GreatModel does not solve any NLP problem ... for exercise purposes only.
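Even as an exercise-only checkpoint, it loads like any other CamemBERT question-answering model. A minimal sketch, assuming the standard `question-answering` pipeline; the French question/context pair is invented for illustration and useful answers are not expected:

```python
from transformers import pipeline

# Minimal sketch: "RaphBL/great-model" is the repository id of this exercise model;
# the question and context are made-up examples, not taken from any dataset.
qa = pipeline("question-answering", model="RaphBL/great-model")
result = qa(
    question="Où se trouve la tour Eiffel ?",
    context="La tour Eiffel se trouve à Paris, en France.",
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```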
| {} | question-answering | RaphBL/great-model | [
"transformers",
"pytorch",
"camembert",
"question-answering",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #camembert #question-answering #endpoints_compatible #region-us
|
GreatModel does not solve any NLP problem ... for exercise purposes only.
| [] | [
"TAGS\n#transformers #pytorch #camembert #question-answering #endpoints_compatible #region-us \n"
] | [
31
] | [
"passage: TAGS\n#transformers #pytorch #camembert #question-answering #endpoints_compatible #region-us \n"
] | [
-0.02392732724547386,
0.008769679814577103,
-0.011196552775800228,
-0.01162737887352705,
0.08236770331859589,
0.026874860748648643,
0.012669439427554607,
0.08064489811658859,
0.11562054604291916,
0.017489802092313766,
0.17060106992721558,
0.21151210367679596,
-0.06250324100255966,
-0.07083753496408463,
-0.09580877423286438,
-0.21202823519706726,
0.03971386328339577,
0.09089689701795578,
-0.04297937825322151,
0.12123487889766693,
0.04590950161218643,
-0.1244800016283989,
0.048518549650907516,
-0.021325258538126945,
-0.0840955376625061,
0.05616683512926102,
-0.014224732294678688,
-0.0605483241379261,
0.1328163892030716,
0.012580699287354946,
0.15619297325611115,
0.03815365582704544,
-0.12688948214054108,
-0.18121157586574554,
0.04191024973988533,
-0.027331344783306122,
-0.05434928461909294,
0.02254457212984562,
0.04400837421417236,
-0.1114855408668518,
0.015307697467505932,
0.04451434686779976,
0.0051264530047774315,
0.06128805875778198,
-0.1829494684934616,
-0.13109605014324188,
-0.060791995376348495,
-0.0008152476511895657,
0.05764121934771538,
0.0930328220129013,
-0.01997533068060875,
0.1624957025051117,
-0.18473707139492035,
0.09489810466766357,
0.15740545094013214,
-0.3010874092578888,
-0.02409062720835209,
0.07676053047180176,
0.09240615367889404,
0.05133635178208351,
-0.01623140648007393,
0.06854184716939926,
0.03994001820683479,
0.027609607204794884,
-0.11648912727832794,
-0.10770788043737411,
-0.037946268916130066,
0.08571119606494904,
-0.0904030054807663,
-0.09676477313041687,
0.22014953196048737,
0.002577835461124778,
0.04945458099246025,
0.025110850110650063,
-0.0913703665137291,
0.018264934420585632,
0.02419223263859749,
-0.026868769899010658,
-0.02652713842689991,
0.046004608273506165,
0.03751743212342262,
-0.04449790343642235,
-0.1104179248213768,
0.02947152964770794,
-0.24856114387512207,
0.2339686006307602,
0.03356274217367172,
0.08899734914302826,
-0.24089638888835907,
0.049323566257953644,
-0.05831364914774895,
-0.07386733591556549,
0.0042531355284154415,
-0.07555601000785828,
-0.004377792589366436,
-0.009976276196539402,
-0.052027877420186996,
0.06056024134159088,
0.07045531272888184,
0.17434631288051605,
-0.02622520551085472,
0.034950148314237595,
-0.003023300087079406,
0.09846385568380356,
0.061934247612953186,
0.09452991932630539,
-0.016337832435965538,
0.003669490572065115,
-0.03439665958285332,
-0.1421661078929901,
-0.024398423731327057,
-0.023964613676071167,
-0.08574594557285309,
-0.06815750896930695,
-0.021677076816558838,
0.13750553131103516,
0.09716083109378815,
-0.009000384248793125,
-0.064858078956604,
0.0029016912449151278,
-0.016150102019309998,
-0.028449013829231262,
-0.018332242965698242,
-0.01491422951221466,
0.008810841478407383,
0.20217148959636688,
-0.0740547627210617,
0.035559944808483124,
-0.03794485703110695,
0.07942146807909012,
-0.05713490769267082,
-0.012707346118986607,
-0.016953542828559875,
0.004092735704034567,
0.07325142621994019,
-0.14529936015605927,
0.0761379674077034,
-0.11128633469343185,
-0.0530666820704937,
0.010738842189311981,
0.039826828986406326,
-0.007288387976586819,
0.013364337384700775,
-0.00496297562494874,
-0.03498060628771782,
-0.04843085631728172,
-0.042080510407686234,
-0.028493894264101982,
-0.06401029229164124,
0.10310972481966019,
0.02460959367454052,
0.05290345847606659,
-0.07853014767169952,
0.058109350502491,
-0.0846443623304367,
0.05912497267127037,
-0.08564113080501556,
-0.03054240718483925,
0.006557095795869827,
0.14549939334392548,
-0.015576763078570366,
-0.08212338387966156,
-0.11409963667392731,
0.027740033343434334,
-0.053119316697120667,
0.18439649045467377,
-0.020203422755002975,
-0.06096222624182701,
0.20603078603744507,
-0.05258876085281372,
-0.22591449320316315,
0.09528607130050659,
0.014472201466560364,
-0.015922337770462036,
0.06649299710988998,
0.16696448624134064,
-0.02836606465280056,
-0.08403000980615616,
0.05722660943865776,
0.09601771831512451,
-0.14638082683086395,
-0.06739111989736557,
0.041363466531038284,
-0.06819164752960205,
-0.10261403769254684,
0.03039921633899212,
0.022978102788329124,
0.040123097598552704,
-0.09880183637142181,
-0.035564400255680084,
-0.021957233548164368,
0.005820423364639282,
0.07812045514583588,
0.07224768400192261,
0.05337744578719139,
-0.08045847713947296,
0.016367411240935326,
-0.06671607494354248,
-0.02639896608889103,
0.05659295618534088,
0.024534814059734344,
-0.060472745448350906,
0.1411253958940506,
-0.09496037662029266,
0.010978637263178825,
-0.2103479951620102,
-0.09936360269784927,
-0.03007683716714382,
0.10939162969589233,
-0.00844697654247284,
0.2283802330493927,
0.09456253796815872,
-0.1266905516386032,
-0.0279947891831398,
-0.027669614180922508,
0.09551308304071426,
-0.00022408452059607953,
-0.02108912542462349,
-0.047720685601234436,
0.05935417488217354,
-0.07844267040491104,
-0.06003495678305626,
-0.01887255720794201,
-0.02543327398598194,
0.10009198635816574,
0.1259925365447998,
-0.01596197672188282,
0.07034676522016525,
-0.012902996502816677,
0.04906214401125908,
0.00961235724389553,
0.04277199134230614,
0.10028186440467834,
-0.050453368574380875,
-0.08891967684030533,
0.09850906580686569,
-0.07384383678436279,
0.2689398229122162,
0.18012812733650208,
-0.3302542567253113,
0.012254694476723671,
-0.0668298602104187,
-0.06122058629989624,
0.026027830317616463,
0.08200651407241821,
0.015896903350949287,
0.12053236365318298,
0.0446479357779026,
0.07763025909662247,
-0.04284319654107094,
-0.06450545787811279,
-0.019556604325771332,
-0.04365973174571991,
-0.05685977265238762,
0.1175435483455658,
0.07962777465581894,
-0.1711888313293457,
0.14563538134098053,
0.3179788887500763,
0.053078748285770416,
0.08422007411718369,
-0.06638453155755997,
-0.03185580298304558,
0.008789473213255405,
0.052693191915750504,
-0.05555678904056549,
0.0461888387799263,
-0.2479572296142578,
-0.004515501670539379,
0.07377790659666061,
0.006885884329676628,
0.07975687086582184,
-0.14399558305740356,
-0.08144205063581467,
0.005531958770006895,
0.02715335600078106,
-0.07751381397247314,
0.1113068163394928,
0.05388462170958519,
0.09544414281845093,
0.03290403634309769,
-0.021968243643641472,
0.10156523436307907,
-0.0052730911411345005,
-0.049597322940826416,
0.15132831037044525,
-0.10830429941415787,
-0.2334621697664261,
-0.011500101536512375,
-0.057101406157016754,
0.007400611415505409,
-0.007941429503262043,
0.06762200593948364,
-0.09479137510061264,
-0.005339943338185549,
0.10453111678361893,
0.048621777445077896,
-0.18483799695968628,
0.005372053012251854,
-0.041124626994132996,
0.05679668113589287,
-0.11415667086839676,
-0.04516572877764702,
-0.0701618567109108,
-0.0829981118440628,
-0.061034753918647766,
0.11211249977350235,
-0.11942803859710693,
0.10162786394357681,
0.10508716106414795,
0.048693303018808365,
0.06383222341537476,
-0.019213112071156502,
0.23387445509433746,
-0.13854765892028809,
-0.03273102268576622,
0.18623313307762146,
-0.005614681635051966,
0.10542936623096466,
0.15426883101463318,
0.024973371997475624,
-0.08421870321035385,
0.005900252610445023,
-0.025188619270920753,
-0.07275056838989258,
-0.23003049194812775,
-0.07237350195646286,
-0.13216517865657806,
0.06402025371789932,
-0.010320395231246948,
0.02357812412083149,
0.10386510193347931,
0.07812341302633286,
0.020367708057165146,
-0.14317408204078674,
-0.031377676874399185,
0.051108043640851974,
0.2559959590435028,
-0.050652723759412766,
0.08180875331163406,
-0.054226990789175034,
-0.1236121729016304,
0.057366132736206055,
0.10254249721765518,
0.14122870564460754,
0.12776078283786774,
-0.0000657088021398522,
0.10650686919689178,
0.1511739045381546,
0.1308283656835556,
0.07326558232307434,
-0.003899491624906659,
-0.06030051410198212,
-0.020170560106635094,
0.0005128532648086548,
-0.05357682332396507,
0.01786681078374386,
0.19056181609630585,
-0.12558391690254211,
-0.04339119791984558,
-0.17808975279331207,
0.09459841996431351,
0.04922287538647652,
0.055490076541900635,
-0.07266069948673248,
0.03199739009141922,
0.08008649945259094,
-0.013718849048018456,
-0.036704178899526596,
0.08629604429006577,
0.006230463273823261,
-0.1589033007621765,
0.030555667355656624,
-0.04035203531384468,
0.14174529910087585,
0.03342347592115402,
0.0872051864862442,
-0.08139977604150772,
-0.1646038442850113,
0.06331168860197067,
0.08525450527667999,
-0.28325921297073364,
0.30988791584968567,
0.010323827154934406,
-0.09941183775663376,
-0.07857246696949005,
-0.0419100821018219,
-0.03460116684436798,
0.15771818161010742,
0.1741008162498474,
0.010560947470366955,
-0.04624219611287117,
-0.0562870092689991,
0.09024083614349365,
0.05014228820800781,
0.14667749404907227,
-0.03068704903125763,
-0.02808554470539093,
0.0030832027550786734,
0.01401334349066019,
-0.04584778472781181,
0.03392453119158745,
0.04591952636837959,
-0.12290157377719879,
0.04883401840925217,
-0.026330456137657166,
0.00835657399147749,
0.000472843850729987,
0.0009281073580496013,
-0.07099012285470963,
0.10444536805152893,
-0.06019560247659683,
-0.05253623053431511,
-0.08908568322658539,
-0.13574717938899994,
0.13619175553321838,
-0.08927323669195175,
0.03654880449175835,
-0.09729776531457901,
-0.08840832859277725,
-0.08035709708929062,
-0.11926822364330292,
0.12577486038208008,
-0.09859512001276016,
-0.002972046844661236,
-0.024976586923003197,
0.20940105617046356,
-0.07337739318609238,
0.02082621306180954,
0.009510005824267864,
0.04831751808524132,
-0.15412619709968567,
-0.10238149017095566,
0.020655984058976173,
-0.10009392350912094,
0.09639652818441391,
0.08248567581176758,
0.002870201366022229,
0.08707939833402634,
0.0030875750817358494,
0.018924569711089134,
0.20816542208194733,
0.21008026599884033,
-0.039555057883262634,
0.08483137935400009,
0.15695181488990784,
0.0007822837214916945,
-0.25174611806869507,
-0.03509318083524704,
-0.15047386288642883,
-0.06932347267866135,
-0.031016534194350243,
-0.09629130363464355,
0.12345662713050842,
0.024078335613012314,
-0.03623834624886513,
0.0862312912940979,
-0.26356202363967896,
-0.02532535418868065,
0.12726053595542908,
-0.002920578932389617,
0.4718998968601227,
-0.10595153272151947,
-0.0839158222079277,
0.03427722305059433,
-0.2623015344142914,
0.09543231874704361,
0.03247299790382385,
0.03328673169016838,
-0.030399799346923828,
0.10247920453548431,
0.041777487844228745,
-0.08636542409658432,
0.15866810083389282,
0.021286148577928543,
0.015980318188667297,
-0.0659697949886322,
-0.12772922217845917,
0.024835217744112015,
0.023051412776112556,
-0.01069681067019701,
0.054782621562480927,
0.03772897645831108,
-0.14700867235660553,
-0.01092476211488247,
-0.1465897411108017,
0.06323495507240295,
0.01314179040491581,
-0.042281974107027054,
-0.032934751361608505,
-0.02331319823861122,
-0.02076621912419796,
0.013151177205145359,
0.25743940472602844,
-0.055393464863300323,
0.15868711471557617,
-0.04069165140390396,
0.14576254785060883,
-0.18746134638786316,
-0.08545863628387451,
-0.058757346123456955,
-0.03511061891913414,
0.0722283124923706,
-0.04034670814871788,
0.0438084602355957,
0.19662413001060486,
-0.01662643440067768,
0.016261203214526176,
0.10446808487176895,
0.030972354114055634,
-0.03177180513739586,
0.09027213603258133,
-0.2086748331785202,
-0.15066787600517273,
-0.01986280456185341,
-0.0141976373270154,
0.0750548243522644,
0.09418825805187225,
0.05410729721188545,
0.11829739809036255,
-0.02944853901863098,
0.008595895953476429,
-0.03920147195458412,
-0.05407204106450081,
-0.01845756731927395,
0.07897710800170898,
0.026709403842687607,
-0.10555907338857651,
0.0509202666580677,
-0.013068197295069695,
-0.2919836640357971,
-0.054920610040426254,
0.10175787657499313,
-0.10040184110403061,
-0.11067645996809006,
-0.10138022154569626,
0.027325602248311043,
-0.15884123742580414,
-0.024758579209446907,
-0.02758355438709259,
-0.113313227891922,
0.059817519038915634,
0.2264188975095749,
0.09046986699104309,
0.06634051352739334,
0.013085346668958664,
-0.03376643359661102,
0.03786036744713783,
-0.031152091920375824,
-0.04071654751896858,
-0.008768146857619286,
-0.04201658070087433,
-0.0777357816696167,
-0.013835371471941471,
0.20792752504348755,
-0.07153749465942383,
-0.08108745515346527,
-0.1514955759048462,
0.09843448549509048,
-0.13872495293617249,
-0.11057572066783905,
-0.13183996081352234,
-0.08379998058080673,
-0.0009522667969577014,
-0.12140075117349625,
-0.02971840277314186,
-0.03692029044032097,
-0.1297248750925064,
0.08592767268419266,
0.06474200636148453,
0.01721017435193062,
-0.07731067389249802,
-0.05436458811163902,
0.1723441630601883,
-0.031278032809495926,
0.09837008267641068,
0.14657042920589447,
-0.09944272041320801,
0.09090875834226608,
-0.09911223500967026,
-0.15744277834892273,
0.06050233170390129,
0.017674099653959274,
0.06499066948890686,
0.04463110491633415,
-0.0034479983150959015,
0.07081998139619827,
0.048809755593538284,
0.08314114809036255,
-0.06866255402565002,
-0.10787207633256912,
0.01015149150043726,
0.0438181534409523,
-0.20334291458129883,
-0.03784900903701782,
-0.09925014525651932,
0.10878699272871017,
-0.0019435918657109141,
0.08538941293954849,
0.0330878347158432,
0.13114184141159058,
-0.030610067769885063,
0.01782662607729435,
0.002447860548272729,
-0.15642191469669342,
0.042816225439310074,
-0.0589258186519146,
0.020403919741511345,
-0.018442915752530098,
0.24190722405910492,
-0.0977708101272583,
0.10406690090894699,
0.05572418496012688,
0.08301203697919846,
0.024591950699687004,
0.012617596425116062,
0.1861165463924408,
0.0851660966873169,
-0.05566543713212013,
-0.07318060845136642,
0.08382830023765564,
-0.06147270277142525,
-0.06796620041131973,
0.1488775610923767,
0.15651075541973114,
0.09120077639818192,
0.05530304089188576,
-0.0014824435347691178,
0.0657489001750946,
-0.021641220897436142,
-0.22849078476428986,
0.02612723596394062,
0.003470479277893901,
0.0005892530316486955,
0.08916196972131729,
0.1331786811351776,
-0.03026561811566353,
0.06559238582849503,
-0.044240403920412064,
-0.009646806865930557,
-0.13259847462177277,
-0.06260180473327637,
-0.04881620407104492,
-0.0621618889272213,
0.04905593395233154,
-0.09821411967277527,
-0.019307460635900497,
0.1188865378499031,
0.0738365426659584,
-0.060844678431749344,
0.12547454237937927,
0.042999494820833206,
-0.06307979673147202,
0.024056799709796906,
0.007171573583036661,
0.1085100919008255,
0.03884889557957649,
0.030034266412258148,
-0.1401544064283371,
-0.08943147957324982,
-0.058143071830272675,
0.044927604496479034,
-0.13452208042144775,
-0.06050639972090721,
-0.1466241180896759,
-0.08965855836868286,
-0.0615372471511364,
0.10353779047727585,
-0.03734123706817627,
0.14832842350006104,
-0.030079077929258347,
0.04224041476845741,
0.015839317813515663,
0.22502440214157104,
-0.06292783468961716,
-0.03746103122830391,
-0.0293760746717453,
0.16037613153457642,
0.030436668545007706,
0.0978035181760788,
0.005211357958614826,
0.02583315595984459,
-0.050649143755435944,
0.3112001419067383,
0.21421493589878082,
-0.06901303678750992,
0.04526463896036148,
0.07299133390188217,
0.05152080953121185,
0.1177046075463295,
-0.009911843575537205,
0.10050208121538162,
0.280469685792923,
-0.10533998161554337,
-0.027401110157370567,
-0.03553851693868637,
0.009572981856763363,
-0.05420788750052452,
0.03810017555952072,
0.06093444675207138,
-0.06288156658411026,
-0.05177457630634308,
0.13123971223831177,
-0.1479090452194214,
0.13860639929771423,
0.04641848802566528,
-0.21458742022514343,
-0.06057359650731087,
-0.02931632287800312,
0.1597113460302353,
-0.004411651287227869,
0.11795978248119354,
-0.034168753772974014,
-0.1367858499288559,
0.03940384462475777,
0.05876689776778221,
-0.231198251247406,
-0.08283437043428421,
0.13448476791381836,
0.03526907414197922,
-0.0023762413766235113,
-0.003228874644264579,
0.04583925008773804,
0.07685376703739166,
0.02111157588660717,
-0.030231956392526627,
0.0068451110273599625,
0.10456935316324234,
-0.08208402991294861,
-0.11190792173147202,
-0.01410099770873785,
0.06107042729854584,
-0.10615644603967667,
0.08707622438669205,
-0.1810407042503357,
0.04971843585371971,
0.004922125954180956,
-0.012089879252016544,
-0.04423108324408531,
0.09330969303846359,
-0.0700090155005455,
0.010445562191307545,
0.06791094690561295,
0.0041661933064460754,
-0.024470878764986992,
-0.030586279928684235,
-0.010663469322025776,
0.04486144706606865,
-0.06892821192741394,
-0.1455780416727066,
0.015286500565707684,
-0.05946185812354088,
0.08771692961454391,
-0.03967692330479622,
-0.0747789666056633,
-0.05156766623258591,
-0.005736036226153374,
0.06301223486661911,
-0.08492705971002579,
0.00442842161282897,
0.04670780897140503,
0.03330080956220627,
0.022213643416762352,
-0.08185914158821106,
0.03503983095288277,
0.05320156738162041,
-0.10944651067256927,
-0.04866700991988182
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad_v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1323
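As a quick reference, the fine-tuned checkpoint can be queried with the standard question-answering pipeline; the question/context pair below is invented purely for illustration:

```python
from transformers import pipeline

# Minimal usage sketch; the example question and context are made up, not drawn from SQuAD.
qa = pipeline(
    "question-answering",
    model="Raphaelg9/distilbert-base-uncased-finetuned-squad",
)
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="distilbert-base-uncased was fine-tuned on the SQuAD v2 dataset for extractive question answering.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'the SQuAD v2 dataset'}
```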
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
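As a rough sketch, the hyperparameters above map onto a `TrainingArguments` configuration along these lines; the output directory is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit setting:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the run configuration; output_dir is a placeholder
# and dataset preprocessing / the Trainer itself are omitted.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-squad",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # a validation loss is reported once per epoch above
)
```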
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.8535 | 1.0 | 661 | 2.0684 |
| 1.5385 | 2.0 | 1322 | 2.0954 |
| 1.2312 | 3.0 | 1983 | 2.1323 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2"], "model-index": [{"name": "distilbert-base-uncased-finetuned-squad", "results": []}]} | question-answering | Raphaelg9/distilbert-base-uncased-finetuned-squad | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad_v2",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-squad
=======================================
This model is a fine-tuned version of distilbert-base-uncased on the squad\_v2 dataset.
It achieves the following results on the evaluation set:
* Loss: 2.1323
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.12.5
* Pytorch 1.10.0+cu111
* Datasets 1.16.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] | [
59,
98,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad_v2 #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] | [
-0.11150982975959778,
0.07848244905471802,
-0.0022464971989393234,
0.12226062268018723,
0.1614677906036377,
0.022035878151655197,
0.10930634289979935,
0.11743824183940887,
-0.11023393273353577,
0.03155003488063812,
0.1385149210691452,
0.15885910391807556,
0.0022834569681435823,
0.05565059557557106,
-0.04332059621810913,
-0.22579556703567505,
-0.015553553588688374,
0.05318484827876091,
-0.0985267385840416,
0.1465374231338501,
0.08455826342105865,
-0.15500329434871674,
0.08040014654397964,
0.003660980612039566,
-0.20865511894226074,
0.014315593987703323,
-0.001002126489765942,
-0.037273962050676346,
0.14654211699962616,
0.004634014796465635,
0.12169363349676132,
0.008903874084353447,
0.07351299375295639,
-0.18313443660736084,
0.013935187831521034,
0.03859647735953331,
0.01088668406009674,
0.08598355203866959,
0.04025360196828842,
0.008746696636080742,
0.10953417420387268,
-0.07638487964868546,
0.039982665330171585,
0.0282059907913208,
-0.13308070600032806,
-0.25656917691230774,
-0.10196526348590851,
0.015613807365298271,
0.06292916089296341,
0.1212240532040596,
-0.007130688056349754,
0.15944673120975494,
-0.11469590663909912,
0.08769802004098892,
0.26103129982948303,
-0.29166102409362793,
-0.07582348585128784,
0.023967409506440163,
0.01751270703971386,
0.06955454498529434,
-0.10540398955345154,
-0.030764760449528694,
0.056238047778606415,
0.05253089591860771,
0.10836516320705414,
-0.03954014182090759,
-0.11582853645086288,
0.045502521097660065,
-0.15150785446166992,
-0.05203450843691826,
0.15609756112098694,
0.05319274589419365,
-0.029454700648784637,
-0.020823003724217415,
-0.05508257448673248,
-0.13535399734973907,
-0.021189380437135696,
-0.020961111411452293,
0.04279683902859688,
-0.055757082998752594,
-0.09712857753038406,
0.0021452633664011955,
-0.11150576174259186,
-0.09563680738210678,
-0.0684303492307663,
0.12594836950302124,
0.043947961181402206,
0.02687561698257923,
-0.0525965541601181,
0.10938704013824463,
0.019929923117160797,
-0.13555119931697845,
0.010611755773425102,
0.030988719314336777,
-0.010139579884707928,
-0.03702310100197792,
-0.06069008260965347,
-0.05168861150741577,
0.0276875589042902,
0.12038080394268036,
-0.07507329434156418,
0.023633906617760658,
0.05464736372232437,
0.03805939853191376,
-0.09220355749130249,
0.1830453872680664,
-0.06398260593414307,
0.009486718103289604,
-0.02005363069474697,
0.04950757324695587,
-0.007154290564358234,
0.005328728351742029,
-0.0964270830154419,
-0.015359601005911827,
0.09448137879371643,
0.024853812530636787,
-0.04622955620288849,
0.05267594754695892,
-0.046521175652742386,
-0.021662482991814613,
-0.004269272089004517,
-0.0773523822426796,
0.026366090402007103,
0.003766535082831979,
-0.09310909360647202,
-0.01037629321217537,
0.008848682045936584,
0.010402776300907135,
-0.011554084718227386,
0.08856529742479324,
-0.09054607152938843,
0.0331883542239666,
-0.09055687487125397,
-0.09689442068338394,
0.02296186052262783,
-0.08564628660678864,
0.032847288995981216,
-0.08608951419591904,
-0.15781696140766144,
-0.008169854991137981,
0.043509356677532196,
-0.023858871310949326,
-0.03934485465288162,
-0.046564385294914246,
-0.08911320567131042,
-0.01634961925446987,
-0.010302672162652016,
0.14746318757534027,
-0.05793876573443413,
0.12537631392478943,
0.050083260983228683,
0.06819366663694382,
-0.03570094704627991,
0.053063999861478806,
-0.1087137833237648,
0.0155175207182765,
-0.17038998007774353,
0.03265877068042755,
-0.05183928832411766,
0.06631552428007126,
-0.10258295387029648,
-0.1299031525850296,
0.025112196803092957,
-0.021637307479977608,
0.0886128842830658,
0.10419172793626785,
-0.16837792098522186,
-0.06566350162029266,
0.14889761805534363,
-0.058970123529434204,
-0.15457798540592194,
0.12532015144824982,
-0.057340458035469055,
0.0367724634706974,
0.06327765434980392,
0.1653481274843216,
0.06346127390861511,
-0.09703050553798676,
0.01029412355273962,
-0.0035189923364669085,
0.04113675653934479,
-0.06535916775465012,
0.07667838782072067,
-0.008666559122502804,
0.020132986828684807,
0.02384789101779461,
-0.06672535836696625,
0.05777828022837639,
-0.12241614609956741,
-0.09915035963058472,
-0.04880531504750252,
-0.10805047303438187,
0.05010281875729561,
0.08662132173776627,
0.07077346742153168,
-0.0985550507903099,
-0.06197735667228699,
0.06734070181846619,
0.07895397394895554,
-0.06031368300318718,
0.026645448058843613,
-0.06811186671257019,
0.07743967324495316,
-0.060524292290210724,
-0.029261749237775803,
-0.18460489809513092,
-0.016221996396780014,
0.0064208898693323135,
0.008642536588013172,
0.009598520584404469,
0.04800661653280258,
0.07930218428373337,
0.04930705577135086,
-0.05328112840652466,
-0.028445199131965637,
-0.06760639697313309,
-0.007571693044155836,
-0.12365466356277466,
-0.1979861557483673,
-0.03793565183877945,
-0.009720463305711746,
0.09145587682723999,
-0.185153529047966,
0.029401732608675957,
-0.009715549647808075,
0.07441259920597076,
-0.003294769674539566,
-0.011107580736279488,
-0.03934507817029953,
0.07821567356586456,
-0.01774180494248867,
-0.04216040298342705,
0.06614695489406586,
-0.003979031927883625,
-0.09542424976825714,
-0.06495296210050583,
-0.05558587983250618,
0.15911351144313812,
0.1253730058670044,
-0.1303967833518982,
-0.062237538397312164,
-0.0052805556915700436,
-0.07645756006240845,
-0.03375837579369545,
-0.04848552867770195,
0.04102320224046707,
0.16656209528446198,
-0.007157170679420233,
0.12183791399002075,
-0.08840934932231903,
-0.0449737086892128,
0.01959148980677128,
-0.03959357365965843,
0.0438808873295784,
0.1395282745361328,
0.1149747222661972,
-0.06259983777999878,
0.13532032072544098,
0.16168491542339325,
-0.08849847316741943,
0.10813626646995544,
-0.07006167620420456,
-0.09305057674646378,
-0.03788823261857033,
0.004365076310932636,
-0.006719666067510843,
0.12806251645088196,
-0.15611529350280762,
0.01869283616542816,
0.03217831999063492,
0.02292943000793457,
0.016968483105301857,
-0.23233681917190552,
-0.0682251825928688,
0.031194066628813744,
-0.050674375146627426,
-0.04936769977211952,
0.001716769183985889,
0.016733024269342422,
0.10315307974815369,
-0.011658132076263428,
-0.06910274177789688,
0.03503641486167908,
0.0008988300687633455,
-0.07106631249189377,
0.21996188163757324,
-0.06889194250106812,
-0.10578946769237518,
-0.09740490466356277,
-0.038020551204681396,
-0.04250059276819229,
-0.002065529813989997,
0.06875044107437134,
-0.1040864884853363,
-0.006451272405683994,
-0.049199339002370834,
0.03432905301451683,
-0.01354936696588993,
0.032734207808971405,
0.014522549696266651,
0.0028875700663775206,
0.07596521824598312,
-0.11499802768230438,
0.008388633839786053,
-0.060460593551397324,
-0.07941441237926483,
0.054875560104846954,
0.042622681707143784,
0.13803817331790924,
0.13183261454105377,
-0.014684215188026428,
0.013378987088799477,
-0.025128446519374847,
0.2633360028266907,
-0.06662710011005402,
-0.04692712053656578,
0.14886140823364258,
0.013517078012228012,
0.05621350556612015,
0.09926415234804153,
0.07495399564504623,
-0.09078273177146912,
0.004146585240960121,
0.030644305050373077,
-0.035943061113357544,
-0.2495904266834259,
-0.0219603069126606,
-0.05280477553606033,
-0.008817454800009727,
0.07314790785312653,
0.018114611506462097,
0.035731494426727295,
0.07553023844957352,
0.03812779113650322,
0.04805469885468483,
-0.06654037535190582,
0.03833834081888199,
0.10763085633516312,
0.04255922883749008,
0.10675251483917236,
-0.04664669185876846,
-0.064516082406044,
0.023861879482865334,
-0.006263649091124535,
0.25608211755752563,
-0.009899196214973927,
0.14510273933410645,
0.08770857751369476,
0.21451981365680695,
-0.019836094230413437,
0.08179693669080734,
-0.007133472245186567,
-0.05132313817739487,
-0.005636381916701794,
-0.03907174617052078,
-0.0214300025254488,
0.004650745540857315,
-0.04254613071680069,
0.08255895972251892,
-0.11862073838710785,
0.010725179687142372,
0.06009969487786293,
0.2718203365802765,
0.027531573548913002,
-0.30129367113113403,
-0.097066730260849,
-0.015963010489940643,
-0.021149922162294388,
-0.004454485140740871,
0.022895822301506996,
0.12818394601345062,
-0.08842194080352783,
-0.0037483612541109324,
-0.06207936257123947,
0.0956089049577713,
-0.013140026479959488,
0.04822295159101486,
0.06916408240795135,
0.07813670486211777,
0.012320245616137981,
0.09584198147058487,
-0.3213621973991394,
0.2885628044605255,
-0.0007656484376639128,
0.07724250853061676,
-0.07192961126565933,
-0.021918807178735733,
0.003846179461106658,
0.04133070632815361,
0.0886860191822052,
-0.007469405885785818,
0.017856718972325325,
-0.1790827363729477,
-0.03880010172724724,
0.04350654035806656,
0.08747676759958267,
-0.029035519808530807,
0.09758484363555908,
-0.009674768894910812,
0.016580229625105858,
0.07489538192749023,
0.004736360162496567,
-0.05581018701195717,
-0.07005749642848969,
-0.01770479418337345,
0.003678187495097518,
-0.04962708055973053,
-0.07184655964374542,
-0.10278037935495377,
-0.12001753598451614,
0.09434649348258972,
-0.005986284930258989,
-0.03353781998157501,
-0.09975739568471909,
0.09256492555141449,
0.11002492159605026,
-0.08952901512384415,
0.030917184427380562,
0.017572887241840363,
0.033067140728235245,
0.04645436257123947,
-0.047393810003995895,
0.09540706872940063,
-0.06115768477320671,
-0.17200936377048492,
-0.047829631716012955,
0.11327466368675232,
0.05819747969508171,
0.07102100551128387,
-0.009448590688407421,
0.005617950577288866,
-0.05510175973176956,
-0.10203198343515396,
0.032271578907966614,
-0.03899984061717987,
0.08592764288187027,
0.01896059140563011,
-0.022024154663085938,
0.06756249070167542,
-0.06798482686281204,
-0.02578815072774887,
0.18958690762519836,
0.24630698561668396,
-0.10082559287548065,
0.004232932347804308,
0.0333731509745121,
-0.049205243587493896,
-0.1753806471824646,
0.05219753459095955,
0.06774263828992844,
-0.005670195911079645,
0.05066134035587311,
-0.165505051612854,
0.13894516229629517,
0.10661971569061279,
-0.0031755599193274975,
0.09239965677261353,
-0.37862667441368103,
-0.11200164258480072,
0.10372737795114517,
0.15607672929763794,
0.11325988173484802,
-0.1512717455625534,
-0.019323386251926422,
0.006337879225611687,
-0.16790397465229034,
0.09732450544834137,
-0.10046491026878357,
0.11136257648468018,
-0.04630636051297188,
0.10206442326307297,
0.003650352358818054,
-0.07127633690834045,
0.13255521655082703,
0.04357816278934479,
0.10565652698278427,
-0.04257804527878761,
-0.03788834810256958,
0.07167694717645645,
-0.02277504839003086,
0.017199022695422173,
-0.060422033071517944,
0.04886789247393608,
-0.10378392040729523,
-0.009701265022158623,
-0.10865776985883713,
0.035007916390895844,
-0.04585724323987961,
-0.05753262713551521,
-0.04287276789546013,
0.019436681643128395,
0.04954148456454277,
-0.008629655465483665,
0.11341211944818497,
0.03317056968808174,
0.14014646410942078,
0.07879598438739777,
0.0729752779006958,
-0.0526728518307209,
-0.1024816483259201,
-0.012431601993739605,
-0.002871003933250904,
0.059265006333589554,
-0.13852161169052124,
0.025139816105365753,
0.1586272120475769,
0.05058387294411659,
0.11947368085384369,
0.07910828292369843,
-0.03219638392329216,
0.012729079462587833,
0.03659619390964508,
-0.16593532264232635,
-0.14158710837364197,
0.0007194791687652469,
-0.06350906193256378,
-0.12478424608707428,
0.06974491477012634,
0.06468627601861954,
-0.05896945297718048,
-0.010753092356026173,
-0.006693624891340733,
-0.009056233800947666,
-0.06682948768138885,
0.20841661095619202,
0.08166371285915375,
0.05730023235082626,
-0.1106608510017395,
0.07247037440538406,
0.032492831349372864,
-0.07964061945676804,
-0.00823501218110323,
0.04631282761693001,
-0.0728156790137291,
-0.04226803779602051,
0.08448489010334015,
0.1622045934200287,
-0.06093582883477211,
-0.03734132647514343,
-0.14316225051879883,
-0.11613649874925613,
0.0802081897854805,
0.14976029098033905,
0.11455577611923218,
0.015894776210188866,
-0.0447583943605423,
0.012748432345688343,
-0.12161780148744583,
0.08703673630952835,
0.04289615899324417,
0.058644454926252365,
-0.13177058100700378,
0.1438056081533432,
-0.002163206459954381,
0.06213667616248131,
-0.014370857737958431,
0.028986383229494095,
-0.0985160768032074,
0.036861881613731384,
-0.144964799284935,
-0.04015577211976051,
-0.030134662985801697,
-0.005010158289223909,
-0.00773527380079031,
-0.08877526223659515,
-0.05854165926575661,
0.022546054795384407,
-0.12621566653251648,
-0.012523260898888111,
0.057215552777051926,
0.05427289381623268,
-0.1409880667924881,
-0.04280024766921997,
0.03826498985290527,
-0.0553714893758297,
0.06981248408555984,
0.07210478186607361,
0.012607654556632042,
0.05518702417612076,
-0.13462191820144653,
-0.011001119390130043,
0.050040844827890396,
0.01673915609717369,
0.0790460854768753,
-0.0918925330042839,
-0.018529020249843597,
0.0033737998455762863,
0.0580887533724308,
0.018051497638225555,
0.03280702605843544,
-0.1388327181339264,
-0.009344855323433876,
-0.019823327660560608,
-0.07248985022306442,
-0.07400539517402649,
0.015316066332161427,
0.09527413547039032,
0.029487356543540955,
0.1925184726715088,
-0.05798767879605293,
0.05951610207557678,
-0.22770041227340698,
-0.004966119769960642,
-0.008156226947903633,
-0.09451781213283539,
-0.1262035369873047,
-0.05076805502176285,
0.0670202374458313,
-0.06406060606241226,
0.11508350819349289,
-0.018639741465449333,
0.057786956429481506,
0.019391389563679695,
-0.012694582343101501,
0.029518570750951767,
0.013746420852839947,
0.23300577700138092,
0.020404869690537453,
-0.03257223218679428,
0.07822825014591217,
0.04784706234931946,
0.07908838987350464,
0.12617777287960052,
0.20020370185375214,
0.17019183933734894,
0.008301720023155212,
0.07068832963705063,
0.04507927596569061,
-0.038983430713415146,
-0.13901524245738983,
0.03828674554824829,
-0.016681497916579247,
0.07965803891420364,
-0.015935910865664482,
0.23780332505702972,
0.07020948082208633,
-0.17296460270881653,
0.04988893121480942,
-0.05878032371401787,
-0.09494620561599731,
-0.08477441966533661,
-0.02509339712560177,
-0.06757156550884247,
-0.15831725299358368,
0.014193188399076462,
-0.1254003643989563,
0.01343528088182211,
0.11527931690216064,
0.011871743947267532,
-0.02973010577261448,
0.18265578150749207,
0.07391917705535889,
0.037089865654706955,
0.04512234032154083,
-0.00673059793189168,
-0.022983359172940254,
-0.0867912545800209,
-0.044487398117780685,
0.009353320114314556,
-0.028687039390206337,
0.03517380356788635,
-0.054791562259197235,
-0.07963769882917404,
0.023758690804243088,
-0.033384207636117935,
-0.09804455190896988,
0.005577802658081055,
0.02635898068547249,
0.06255986541509628,
0.04483041912317276,
0.016615336760878563,
0.028941193595528603,
-0.0188615620136261,
0.21912193298339844,
-0.07515410333871841,
-0.08644826710224152,
-0.09779539704322815,
0.23743322491645813,
0.03940745070576668,
-0.015949513763189316,
0.04819853976368904,
-0.06739012151956558,
0.00830706674605608,
0.2445833683013916,
0.17704953253269196,
-0.09097414463758469,
-0.011766128242015839,
0.008043467998504639,
-0.008394567295908928,
-0.03953494876623154,
0.07717172801494598,
0.14113663136959076,
0.03366560861468315,
-0.11060873419046402,
-0.04348177835345268,
-0.08410710841417313,
-0.010066229850053787,
-0.020941907539963722,
0.04992351308465004,
0.06073589622974396,
-0.001227971282787621,
-0.04376448318362236,
0.07069313526153564,
-0.0803132951259613,
-0.11241906136274338,
0.06702692806720734,
-0.19269521534442902,
-0.1514378935098648,
-0.022403676062822342,
0.11089149862527847,
0.01062016375362873,
0.0677703469991684,
-0.042013466358184814,
0.00629494059830904,
0.08407262712717056,
-0.015441621653735638,
-0.09340674430131912,
-0.08311767131090164,
0.12296862155199051,
-0.11036628484725952,
0.18345437943935394,
-0.043756384402513504,
0.0907086655497551,
0.1289547234773636,
0.05896754562854767,
-0.08392810076475143,
0.0593009814620018,
0.06526932120323181,
-0.08578328043222427,
0.005651972722262144,
0.0796269103884697,
-0.006250518374145031,
0.05627156421542168,
0.03901292383670807,
-0.11690153181552887,
0.0163422804325819,
-0.036761268973350525,
-0.025720668956637383,
-0.06528064608573914,
-0.03652406111359596,
-0.04777064546942711,
0.11738734692335129,
0.21176373958587646,
-0.04521868750452995,
0.017737537622451782,
-0.07251780480146408,
0.017484666779637337,
0.06150136515498161,
0.0222212765365839,
-0.0678640753030777,
-0.20843420922756195,
0.020710423588752747,
0.06192799657583237,
-0.03214678168296814,
-0.19697272777557373,
-0.09393768012523651,
0.02721068076789379,
-0.08834069222211838,
-0.06343812495470047,
0.06160794571042061,
0.0753621980547905,
0.06654619425535202,
-0.04527541995048523,
-0.04561535269021988,
-0.09012782573699951,
0.16247236728668213,
-0.1466871052980423,
-0.08588044345378876
] |
null | null | transformers |
# Rick Morty DialoGPT Model | {"tags": ["conversational"]} | text-generation | Rashid11/DialoGPT-small-rick | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Rick Morty DialoGPT Model | [
"# Rick Morty DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Rick Morty DialoGPT Model"
] | [
51,
9
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick Morty DialoGPT Model"
] | [
-0.027416273951530457,
0.12086749076843262,
-0.006109944544732571,
0.009889360517263412,
0.12381330132484436,
0.0038806977681815624,
0.14245158433914185,
0.1331283152103424,
-0.0212846826761961,
-0.045859768986701965,
0.14985723793506622,
0.21308201551437378,
-0.019527088850736618,
0.051007822155952454,
-0.06853066384792328,
-0.3212457299232483,
0.04747724160552025,
0.05849539116024971,
-0.031212788075208664,
0.11445578187704086,
0.09293332695960999,
-0.04094693437218666,
0.0733533725142479,
0.011670893989503384,
-0.1314198225736618,
0.010235695168375969,
0.014730604365468025,
-0.09850499778985977,
0.1117473691701889,
0.07090220600366592,
0.01977115496993065,
0.04605070501565933,
-0.042774006724357605,
-0.12270724773406982,
0.04736309126019478,
0.0012515645939856768,
-0.045511845499277115,
0.06668319553136826,
0.012820800766348839,
-0.09431180357933044,
0.14217764139175415,
0.12549172341823578,
-0.007773073390126228,
0.040497276932001114,
-0.15369699895381927,
-0.028457961976528168,
-0.004497861489653587,
0.051908574998378754,
0.046925999224185944,
0.11125461012125015,
-0.03784911334514618,
0.12445349991321564,
-0.04202383756637573,
0.11295074224472046,
0.11681167036294937,
-0.2876027524471283,
-0.016340676695108414,
0.14467903971672058,
0.03499724715948105,
0.02552611380815506,
-0.04558492824435234,
0.09456686675548553,
0.014267511665821075,
-0.004964208230376244,
-0.03697703033685684,
-0.07767035067081451,
-0.05041099339723587,
0.03769579529762268,
-0.08099090307950974,
-0.00252055493183434,
0.22437700629234314,
-0.023694775998592377,
0.09231439232826233,
-0.060666441917419434,
-0.0907888412475586,
-0.00829416885972023,
-0.040286462754011154,
-0.03791467100381851,
-0.09851202368736267,
0.07553289085626602,
-0.04743136838078499,
-0.0896044448018074,
-0.11248479783535004,
-0.018256433308124542,
-0.15938088297843933,
0.15307071805000305,
0.026864582672715187,
0.03729034215211868,
-0.19902122020721436,
0.07844828069210052,
-0.028603488579392433,
-0.09441527724266052,
0.024891193956136703,
-0.08897717297077179,
0.007059852592647076,
0.014940117485821247,
-0.02692650444805622,
-0.01080795843154192,
0.08948459476232529,
0.11486934125423431,
0.0012382101267576218,
0.022374704480171204,
-0.021753402426838875,
0.046682097017765045,
0.052672360092401505,
0.03954516351222992,
-0.04186785966157913,
-0.022741666063666344,
0.02572580985724926,
-0.08474642038345337,
-0.008516058325767517,
-0.06357598304748535,
-0.19882962107658386,
0.0066541084088385105,
0.07609006762504578,
0.04447868838906288,
0.030165988951921463,
0.13503077626228333,
0.004542340524494648,
-0.05563449114561081,
0.03441212326288223,
-0.005266434978693724,
-0.023943031206727028,
0.018793204799294472,
0.0034354126546531916,
0.15178295969963074,
0.024059079587459564,
0.053748175501823425,
-0.10243912786245346,
0.016247184947133064,
-0.05016466975212097,
-0.014364240691065788,
-0.030889246612787247,
-0.05143284797668457,
0.002065480686724186,
-0.01656651869416237,
0.018873749300837517,
-0.1348172426223755,
-0.15529313683509827,
-0.00524118123576045,
-0.004736831411719322,
-0.045283541083335876,
-0.11088961362838745,
-0.10792574286460876,
-0.027836862951517105,
0.040399227291345596,
-0.05653241649270058,
0.024379350244998932,
-0.05022101104259491,
0.09343878924846649,
-0.03229687735438347,
0.07440964877605438,
-0.09111195057630539,
0.08398482203483582,
-0.06504840403795242,
-0.04608788341283798,
-0.06951715052127838,
0.11532726138830185,
0.00948856957256794,
0.0517040491104126,
-0.03399192541837692,
-0.018159443512558937,
-0.10607455670833588,
0.06333806365728378,
-0.04592961072921753,
0.23252448439598083,
-0.0882263258099556,
-0.10362039506435394,
0.23659279942512512,
-0.041976332664489746,
-0.12399233877658844,
0.11402548849582672,
-0.027552654966711998,
0.09993235021829605,
0.12294723093509674,
0.1687079519033432,
0.028866015374660492,
-0.0066484189592301846,
0.10477101802825928,
0.1148667186498642,
-0.07858583331108093,
-0.007384664379060268,
0.023825330659747124,
-0.018311534076929092,
-0.09742707014083862,
0.024217721074819565,
0.0863247737288475,
0.049028586596250534,
-0.06375037133693695,
-0.013463132083415985,
0.012806696817278862,
-0.005671072751283646,
0.07319371402263641,
-0.012223406694829464,
0.12824711203575134,
-0.025656385347247124,
-0.0770145058631897,
-0.01428163331001997,
0.025753408670425415,
-0.05255761742591858,
0.027070246636867523,
-0.08988162875175476,
0.035605765879154205,
-0.01898045837879181,
0.0671210065484047,
-0.16663384437561035,
-0.07745291292667389,
-0.06126396730542183,
0.2262658029794693,
0.07519219070672989,
0.1284416913986206,
0.0619715116918087,
-0.058878444135189056,
-0.006196749862283468,
0.03376108035445213,
0.1913164108991623,
-0.0039115422405302525,
-0.08089020848274231,
-0.11742129176855087,
0.09549184888601303,
-0.0750381201505661,
0.06824305653572083,
-0.053482670336961746,
0.010319244116544724,
0.009465949609875679,
0.1013946607708931,
-0.038084983825683594,
0.038067691028118134,
0.019796738401055336,
-0.03743954747915268,
-0.06533488631248474,
0.0009406007011421025,
0.10463766008615494,
0.008402220904827118,
-0.10139253735542297,
0.22547724843025208,
-0.2005029320716858,
0.14411064982414246,
0.17982447147369385,
-0.2346627563238144,
0.0021273153834044933,
-0.11201058328151703,
-0.021471746265888214,
0.007533633150160313,
0.0470501147210598,
-0.04458373039960861,
0.22299137711524963,
-0.011466647498309612,
0.17279048264026642,
-0.039442989975214005,
-0.04193522036075592,
-0.04108773171901703,
-0.04188798740506172,
0.006027073599398136,
0.12033126503229141,
0.10017445683479309,
-0.1822369396686554,
0.17901694774627686,
0.06407493352890015,
0.04607775807380676,
0.16021881997585297,
0.029854416847229004,
0.01598851941525936,
0.05563560873270035,
-0.01471908763051033,
-0.037844810634851456,
-0.07636579871177673,
-0.23525193333625793,
-0.007929623126983643,
0.07535190880298615,
0.035712987184524536,
0.10723735392093658,
-0.09158799052238464,
-0.038234591484069824,
-0.006720248609781265,
-0.017972785979509354,
0.029511768370866776,
0.13821794092655182,
0.012823565863072872,
0.12136346101760864,
-0.02067670226097107,
-0.06758338212966919,
0.07061983644962311,
0.01762859895825386,
-0.08836925029754639,
0.18357424437999725,
-0.11606878787279129,
-0.3341638445854187,
-0.09832020103931427,
-0.18778936564922333,
-0.044912174344062805,
0.048682037740945816,
0.10484113544225693,
-0.1378994584083557,
-0.020967360585927963,
-0.0011182050220668316,
0.07453688979148865,
-0.12740632891654968,
0.007425905205309391,
-0.021490052342414856,
-0.004774138797074556,
-0.13142900168895721,
-0.09953810274600983,
-0.04875548928976059,
-0.06333116441965103,
-0.04007769003510475,
0.11915352195501328,
-0.15440979599952698,
0.007438643369823694,
0.23077727854251862,
0.06644120812416077,
0.07287882268428802,
-0.033953361213207245,
0.18404237926006317,
-0.08294825255870819,
0.010298850014805794,
0.2278866469860077,
-0.044359900057315826,
0.07007310539484024,
0.08790004998445511,
-0.005820757243782282,
-0.05185883864760399,
0.03261810541152954,
0.005810468923300505,
-0.07390361279249191,
-0.20164833962917328,
-0.11046743392944336,
-0.1097269132733345,
0.06003160402178764,
0.049688130617141724,
0.04269523173570633,
0.16941851377487183,
0.06732624769210815,
-0.050071004778146744,
-0.0010007480159401894,
0.06692192703485489,
0.07866300642490387,
0.24961450695991516,
-0.07098972052335739,
0.14394886791706085,
-0.018865063786506653,
-0.16527363657951355,
0.07400073856115341,
0.0514012947678566,
0.07847563922405243,
0.07295927405357361,
0.10060954838991165,
0.010725021362304688,
0.011121176183223724,
0.12659968435764313,
0.06625896692276001,
-0.0038331167306751013,
-0.03917814791202545,
-0.049783237278461456,
-0.04524265229701996,
-0.023619912564754486,
0.04664705693721771,
0.05461094528436661,
-0.15899211168289185,
-0.01097947172820568,
0.007825072854757309,
0.048257775604724884,
0.00985618308186531,
0.0866815447807312,
-0.19567669928073883,
-0.019957082346081734,
0.0636926218867302,
-0.005147154442965984,
-0.09536939859390259,
0.07866992056369781,
-0.0072094290517270565,
-0.10218027234077454,
0.03940068185329437,
-0.030383454635739326,
0.1246940940618515,
-0.0801611840724945,
0.07795576006174088,
-0.12091593444347382,
-0.041307736188173294,
-0.005253799259662628,
0.11947031319141388,
-0.29642704129219055,
0.18476703763008118,
-0.008458035998046398,
-0.05291203409433365,
-0.10281132906675339,
-0.02201247401535511,
0.03507980331778526,
0.10506751388311386,
0.1118205264210701,
-0.02476685494184494,
-0.01724824495613575,
0.049101583659648895,
-0.06962887942790985,
0.029461484402418137,
0.08798205107450485,
-0.050502385944128036,
-0.011574916541576385,
-0.04603515937924385,
-0.0029110098257660866,
0.01779703050851822,
-0.09948457777500153,
0.003308283630758524,
-0.18711978197097778,
0.08715172111988068,
0.0664692223072052,
0.06770443916320801,
0.0410330630838871,
-0.0376434326171875,
-0.11037451028823853,
0.2640511095523834,
0.006874449551105499,
-0.09938371181488037,
-0.1063869297504425,
0.040492862462997437,
0.04723961278796196,
-0.06468400359153748,
-0.02421484887599945,
-0.07383155077695847,
0.047784969210624695,
-0.06310353428125381,
-0.20025654137134552,
0.11153385043144226,
-0.09598832577466965,
-0.04565293341875076,
-0.038614898920059204,
0.22017183899879456,
-0.02087371051311493,
0.01834278367459774,
0.040803052484989166,
-0.004591372795403004,
-0.12742213904857635,
-0.10001911222934723,
0.017327459529042244,
-0.0009376577800139785,
0.021842550486326218,
0.003586926031857729,
-0.04634168744087219,
-0.006295070517808199,
-0.05597352981567383,
-0.0032472964376211166,
0.314273864030838,
0.11974360793828964,
-0.0459052175283432,
0.15168564021587372,
0.11110243201255798,
-0.06336724013090134,
-0.2841157019138336,
-0.10526396334171295,
-0.07726700603961945,
-0.06251303106546402,
-0.07681839168071747,
-0.19412246346473694,
0.09324176609516144,
-0.04012000188231468,
-0.006594877690076828,
0.06911571323871613,
-0.2687768042087555,
-0.09850376099348068,
0.19593903422355652,
-0.029114339500665665,
0.44064584374427795,
-0.10917924344539642,
-0.07919815182685852,
-0.059960875660181046,
-0.1383652538061142,
0.19179140031337738,
-0.02703654207289219,
0.10186978429555893,
-0.0003646083641797304,
0.19195830821990967,
0.057588085532188416,
-0.0009007352637127042,
0.06601071357727051,
0.01702183485031128,
-0.04900797829031944,
-0.08880552649497986,
-0.09949428588151932,
-0.0028341286815702915,
0.014665309339761734,
0.020054258406162262,
-0.07335546612739563,
0.04037605598568916,
-0.12153704464435577,
-0.057532791048288345,
-0.08231829106807709,
0.03708001598715782,
0.02363327331840992,
-0.06965623795986176,
0.003611034480854869,
-0.041214823722839355,
-0.003743773326277733,
0.001163402572274208,
0.19916509091854095,
-0.11463721841573715,
0.13475215435028076,
0.055799905210733414,
0.14584854245185852,
-0.11059413850307465,
-0.036196619272232056,
-0.049671463668346405,
-0.056701723486185074,
0.0657535120844841,
-0.11300431936979294,
0.03300803527235985,
0.1006435751914978,
-0.021579928696155548,
0.07627937942743301,
0.11079196631908417,
-0.023353170603513718,
0.006963616237044334,
0.07969304174184799,
-0.2467534989118576,
-0.08276727795600891,
-0.07502299547195435,
0.047773830592632294,
0.07212468981742859,
0.10943278670310974,
0.21149371564388275,
0.014789524488151073,
-0.025001436471939087,
0.03222371265292168,
0.023720046505331993,
-0.018071206286549568,
0.057205770164728165,
0.006658707745373249,
0.02543378621339798,
-0.14630788564682007,
0.04069182276725769,
-0.014969375915825367,
-0.07975641638040543,
0.018297437578439713,
0.14578038454055786,
-0.10994893312454224,
-0.12492318451404572,
-0.06442593038082123,
0.15935027599334717,
-0.1318066418170929,
0.00008887145668268204,
-0.04528146982192993,
-0.11942756175994873,
0.08018366992473602,
0.1163475513458252,
0.0548318587243557,
0.04320308938622475,
-0.09725219011306763,
-0.03015247732400894,
-0.0463523268699646,
0.009109904989600182,
0.025928247720003128,
-0.025241773575544357,
-0.042393237352371216,
0.04041439667344093,
-0.037384483963251114,
0.1199554055929184,
-0.09227344393730164,
-0.10210355371236801,
-0.16372735798358917,
0.04211808368563652,
-0.04951796308159828,
-0.08286435902118683,
-0.1046765148639679,
-0.043875209987163544,
0.00679828692227602,
-0.03306660056114197,
-0.02686910517513752,
-0.028210319578647614,
-0.10169406235218048,
0.0354401171207428,
-0.039222512394189835,
-0.0010834659915417433,
-0.07553835958242416,
0.024027099832892418,
0.045587822794914246,
-0.02624446526169777,
0.1485806107521057,
0.12215639650821686,
-0.11098423600196838,
0.08989319205284119,
-0.16241876780986786,
-0.06881432235240936,
0.09718970209360123,
0.021595995873212814,
0.04444918408989906,
0.043577950447797775,
0.006596237421035767,
0.04742807894945145,
0.06705378741025925,
0.040454499423503876,
0.055872101336717606,
-0.07414437085390091,
0.05340878292918205,
-0.05155640468001366,
-0.10819453746080399,
-0.04804424196481705,
-0.013407588936388493,
0.008997448720037937,
0.06238336116075516,
0.11464203894138336,
-0.0483878068625927,
0.08908496052026749,
-0.05675007402896881,
0.049880947917699814,
0.027069102972745895,
-0.16960401833057404,
0.014258208684623241,
-0.0808192640542984,
0.053306471556425095,
0.004392252769321203,
0.18324851989746094,
0.012561025097966194,
-0.029430944472551346,
0.02732483670115471,
0.05722617357969284,
0.05216764658689499,
-0.010767760686576366,
0.1901388317346573,
0.10632782429456711,
-0.038889240473508835,
-0.09260699898004532,
0.09756447374820709,
0.04513990134000778,
0.06597290933132172,
0.14979833364486694,
-0.02466198429465294,
-0.010925143957138062,
0.07297184318304062,
0.004247481003403664,
0.02110392414033413,
-0.0873403549194336,
-0.0998910591006279,
-0.02316814288496971,
0.03743436560034752,
-0.033713094890117645,
0.11821766197681427,
0.16003753244876862,
0.004898495972156525,
0.015858691185712814,
-0.019366057589650154,
-0.053652193397283554,
-0.19227364659309387,
-0.19133047759532928,
-0.0922650396823883,
-0.13585466146469116,
0.005800233688205481,
-0.1313185691833496,
0.035255029797554016,
0.03759025037288666,
0.10348966717720032,
-0.04712950438261032,
0.02449607104063034,
0.03599223494529724,
-0.11023194342851639,
0.053648680448532104,
-0.04251009225845337,
0.07534263283014297,
-0.0288483127951622,
0.00955510139465332,
-0.051057491451501846,
0.036296628415584564,
0.014495961368083954,
0.03052874095737934,
-0.02045595459640026,
0.0203727874904871,
-0.12031914293766022,
-0.09146412461996078,
-0.0652654618024826,
0.0565146766602993,
0.0026318831369280815,
0.16578586399555206,
0.01112611684948206,
-0.02614348568022251,
0.026491381227970123,
0.2084609717130661,
-0.06327962130308151,
-0.09746730327606201,
-0.07132168114185333,
0.203248530626297,
-0.012181129306554794,
0.0951991155743599,
-0.03945131227374077,
0.013144226744771004,
-0.06770433485507965,
0.33865463733673096,
0.3012107312679291,
-0.104325070977211,
0.010152887552976608,
0.002433057874441147,
0.04099532216787338,
0.12672795355319977,
0.08416569232940674,
0.10599584132432938,
0.2619067132472992,
-0.069271981716156,
-0.06182895973324776,
-0.010734140872955322,
-0.025832153856754303,
-0.059502504765987396,
0.04846819490194321,
0.060089610517024994,
-0.05902278795838356,
-0.01576954498887062,
0.11261053383350372,
-0.2549726963043213,
0.05737382918596268,
-0.1458929032087326,
-0.14792890846729279,
-0.06993737071752548,
-0.0008564922027289867,
0.09068048745393753,
0.010102925822138786,
0.08812552690505981,
-0.011714844033122063,
-0.06707380712032318,
0.044450268149375916,
0.019938640296459198,
-0.21181021630764008,
0.008332989178597927,
0.07651379704475403,
-0.031228363513946533,
-0.05378149822354317,
-0.02486795000731945,
0.08115644752979279,
0.08993051201105118,
0.031433310359716415,
-0.016278380528092384,
0.04931800812482834,
-0.00844922848045826,
-0.07117252051830292,
0.03346596658229828,
0.012209481559693813,
0.01669057458639145,
-0.05868103355169296,
0.0727684423327446,
-0.16378995776176453,
0.030053474009037018,
-0.030178112909197807,
-0.06673619151115417,
-0.00835607573390007,
0.03437380492687225,
-0.06272272765636444,
0.08189035207033157,
0.08107085525989532,
-0.017087949439883232,
-0.029968449845910072,
-0.025353075936436653,
-0.009976672008633614,
-0.033645838499069214,
-0.0770852267742157,
-0.10150755196809769,
-0.16576941311359406,
-0.11129410564899445,
0.07681435346603394,
0.002545040100812912,
-0.2132008969783783,
0.01989331841468811,
-0.13140831887722015,
0.04490850493311882,
-0.10383498668670654,
0.09425871074199677,
0.07931430637836456,
0.020193548873066902,
-0.0043848673813045025,
0.012049510143697262,
0.040718138217926025,
0.07634101808071136,
-0.13649077713489532,
-0.08752022683620453
] |
null | null | transformers |
# Harry Potter DialoGPT Model | {"tags": ["conversational"]} | text-generation | Rathod/DialoGPT-small-harrypotter | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Harry Potter DialoGPT Model | [
"# Harry Potter DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Harry Potter DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model"
] | [
-0.0009023238671943545,
0.07815738022327423,
-0.006546166725456715,
0.07792752981185913,
0.10655936598777771,
0.048972971737384796,
0.17639793455600739,
0.12185695022344589,
0.016568755730986595,
-0.04774167761206627,
0.11647630482912064,
0.2130284160375595,
-0.002118367003276944,
0.024608047679066658,
-0.05022026598453522,
-0.3065771162509918,
0.0474756620824337,
0.014356585219502449,
-0.07174845039844513,
0.11724270135164261,
0.09064973145723343,
-0.046179238706827164,
0.08330509811639786,
-0.009135239757597446,
-0.13198648393154144,
-0.039482954889535904,
0.019292812794446945,
-0.11745545268058777,
0.1662212759256363,
0.05298272892832756,
0.02469746209681034,
-0.008447164669632912,
-0.06598151475191116,
-0.15036040544509888,
0.037190426141023636,
-0.027472136542201042,
-0.01080626156181097,
0.05462246760725975,
0.023526115342974663,
-0.07521048933267593,
0.170567125082016,
0.17678891122341156,
0.0833497866988182,
0.0349111407995224,
-0.14917024970054626,
-0.045548245310783386,
0.008950977586209774,
0.05421316996216774,
-0.017893504351377487,
0.09349167346954346,
-0.019903047010302544,
0.11801653355360031,
-0.04491448402404785,
0.09210366010665894,
0.15255063772201538,
-0.4016275703907013,
-0.027563704177737236,
0.08920855820178986,
0.05989706888794899,
0.12076901644468307,
-0.10560955852270126,
0.03972794860601425,
-0.0039703017100691795,
0.01236654631793499,
-0.014540530741214752,
-0.08304883539676666,
-0.07308239489793777,
0.032504837960004807,
-0.1272556483745575,
0.008525865152478218,
0.23756256699562073,
-0.10643257945775986,
0.037069112062454224,
-0.09791990369558334,
-0.07414398342370987,
0.048336777836084366,
-0.053761593997478485,
-0.081727035343647,
-0.054839808493852615,
0.06347949057817459,
0.004366500303149223,
-0.06301609426736832,
-0.08326146006584167,
-0.0006536149303428829,
-0.12781435251235962,
0.17595994472503662,
0.061243366450071335,
0.041611745953559875,
-0.21322020888328552,
0.08940251916646957,
0.04477722570300102,
-0.04711297154426575,
0.007116159424185753,
-0.11796226352453232,
0.04023287072777748,
0.005483259446918964,
-0.03256071358919144,
-0.021854614838957787,
0.0393419973552227,
0.13909944891929626,
-0.01777748204767704,
0.03252175822854042,
0.006831915583461523,
0.05811219662427902,
0.08162496984004974,
0.02222144603729248,
0.019291909411549568,
-0.0818009302020073,
0.019385190680623055,
-0.08128736168146133,
-0.0030400939285755157,
-0.048940129578113556,
-0.17071883380413055,
-0.07477642595767975,
0.052610911428928375,
0.020047198981046677,
0.03746970370411873,
0.08054786175489426,
-0.0017944995779544115,
-0.05560554191470146,
0.03284840285778046,
0.01671096310019493,
-0.020622212439775467,
-0.010361049324274063,
-0.02412462793290615,
0.19123271107673645,
0.019619356840848923,
0.014111656695604324,
-0.12379156798124313,
0.10023640841245651,
-0.08179095387458801,
0.0037731381598860025,
0.02743307314813137,
-0.04204464703798294,
-0.004716555587947369,
0.02917117439210415,
0.023101668804883957,
-0.1252521574497223,
-0.1099385917186737,
-0.0030569476075470448,
-0.012054097838699818,
-0.036421261727809906,
-0.10490952432155609,
-0.08483029156923294,
-0.012153145857155323,
0.0449371263384819,
-0.013397793285548687,
0.007936403155326843,
-0.05143149942159653,
0.0985720232129097,
-0.0514979362487793,
0.09873400628566742,
-0.08342572301626205,
0.06359215080738068,
-0.09124887734651566,
-0.061886150389909744,
-0.11452563107013702,
0.05216052383184433,
0.012905281968414783,
0.066250741481781,
0.016998225823044777,
-0.044836658984422684,
-0.014836243353784084,
0.05253177136182785,
-0.07656687498092651,
0.1940697431564331,
-0.041674621403217316,
-0.12459053844213486,
0.24146439135074615,
-0.09138800948858261,
-0.1802034229040146,
0.12973085045814514,
-0.022254703566432,
0.08523941785097122,
0.12802475690841675,
0.20380465686321259,
-0.00019822151807602495,
-0.01302915159612894,
0.07281201332807541,
0.07031642645597458,
-0.09803894907236099,
0.06239739805459976,
0.029653839766979218,
-0.008071083575487137,
-0.08906278014183044,
0.05762826278805733,
0.046033453196287155,
-0.010650773532688618,
-0.035073768347501755,
-0.001896020956337452,
-0.012895751744508743,
-0.022185025736689568,
0.14126582443714142,
-0.02006692811846733,
0.1300428807735443,
-0.06926563382148743,
-0.03515486419200897,
-0.009500149637460709,
0.03533667325973511,
-0.04091939330101013,
0.08151165395975113,
-0.0436173714697361,
0.10586477071046829,
0.09034156054258347,
0.053724925965070724,
-0.13120363652706146,
0.00466286763548851,
-0.015246815048158169,
0.17014820873737335,
0.08964069187641144,
0.05222717300057411,
0.06265474855899811,
-0.0020888058934360743,
-0.06708643585443497,
0.045407816767692566,
0.13778303563594818,
-0.037020038813352585,
-0.12218865007162094,
-0.1755627691745758,
0.051157694309949875,
-0.045444171875715256,
0.10855234414339066,
-0.10010123997926712,
0.022670533508062363,
-0.055906031280756,
0.07772238552570343,
-0.024998966604471207,
0.020512236282229424,
-0.0013405600329861045,
-0.021700702607631683,
-0.08356887847185135,
-0.002377772703766823,
0.08597290515899658,
-0.02048647589981556,
-0.06707409024238586,
0.16556480526924133,
-0.16400809586048126,
0.1631954461336136,
0.2116095870733261,
-0.28542569279670715,
-0.005696662236005068,
-0.15163889527320862,
-0.0208092350512743,
0.019645055755972862,
0.07834604382514954,
0.026225795969367027,
0.2044338881969452,
-0.012928472831845284,
0.16565458476543427,
-0.05699567869305611,
-0.07730039209127426,
-0.06881127506494522,
-0.048101142048835754,
0.013522743247449398,
0.09095205366611481,
0.04542696103453636,
-0.11962861567735672,
0.13119758665561676,
0.1054433062672615,
0.06484298408031464,
0.12711186707019806,
0.1030748188495636,
-0.008113685995340347,
0.07252490520477295,
-0.03624548763036728,
-0.03462279960513115,
-0.09254947304725647,
-0.30446043610572815,
-0.04840317741036415,
0.0939924493432045,
0.007963384501636028,
0.09285714477300644,
-0.0919896736741066,
-0.03311870992183685,
0.006042704917490482,
0.009473444893956184,
0.028337622061371803,
0.09653715789318085,
0.013490920886397362,
0.15320514142513275,
-0.008011690340936184,
-0.03430786728858948,
0.05891305208206177,
0.017982570454478264,
-0.09147711098194122,
0.17280617356300354,
-0.17050009965896606,
-0.27190929651260376,
-0.06990014761686325,
-0.21745692193508148,
-0.013139115646481514,
0.05258983001112938,
0.0786920040845871,
-0.11818131804466248,
-0.018352627754211426,
-0.006239492911845446,
0.05685517191886902,
-0.2425733357667923,
0.0004911290016025305,
-0.1354890614748001,
0.0501418262720108,
-0.1974833607673645,
-0.09718500077724457,
-0.02271542325615883,
-0.013450481928884983,
-0.0464281290769577,
0.13365240395069122,
-0.1448695808649063,
-0.011572926305234432,
0.2329535037279129,
0.032479673624038696,
0.027794739231467247,
-0.05020907148718834,
0.19788463413715363,
-0.0958966314792633,
-0.023973820731043816,
0.11024576425552368,
-0.05038975924253464,
0.04834126681089401,
0.06649978458881378,
-0.012981836684048176,
-0.08557141572237015,
0.023789849132299423,
-0.068336620926857,
-0.03150583803653717,
-0.27926525473594666,
-0.0930178239941597,
-0.09319330751895905,
0.11305391043424606,
0.04079577326774597,
0.06421639025211334,
0.16545771062374115,
0.05191578343510628,
-0.024325082078576088,
-0.03006586618721485,
0.11609793454408646,
0.12905290722846985,
0.2277202159166336,
-0.06067761778831482,
0.10221996158361435,
0.009445492178201675,
-0.08203992247581482,
0.06062209978699684,
0.056782789528369904,
0.06324724853038788,
0.02584579586982727,
0.03694582358002663,
-0.030939655378460884,
0.1121687963604927,
0.12571842968463898,
0.05258069559931755,
0.0481170229613781,
0.0002127334737451747,
-0.0561506561934948,
-0.008168719708919525,
-0.05726633965969086,
0.06774696707725525,
0.061340972781181335,
-0.12918008863925934,
-0.08061543852090836,
0.0011613310780376196,
0.06660808622837067,
-0.016230419278144836,
0.06823775917291641,
-0.13560809195041656,
-0.03582429885864258,
0.0790911465883255,
-0.07693151384592056,
-0.14156894385814667,
0.11972879618406296,
-0.026570770889520645,
-0.19904157519340515,
0.05265914276242256,
0.007704653777182102,
0.0908159390091896,
-0.06360849738121033,
0.05343840271234512,
-0.13023801147937775,
-0.12935101985931396,
-0.018437571823596954,
0.07945099472999573,
-0.3450873792171478,
0.13536721467971802,
-0.013286802917718887,
-0.02876877970993519,
-0.06474969536066055,
-0.02640824392437935,
0.013905409723520279,
0.12719078361988068,
0.08667250722646713,
0.0008821099763736129,
0.0991629809141159,
0.03823768347501755,
0.04188435152173042,
-0.002011700300499797,
0.10950417071580887,
0.0050011589191854,
0.004797275178134441,
-0.04982118681073189,
0.007274609990417957,
-0.05164213851094246,
-0.07472953200340271,
0.08393982797861099,
-0.20678792893886566,
0.09087453782558441,
-0.03378438204526901,
0.08427679538726807,
0.04304937273263931,
-0.018965769559144974,
-0.1001204177737236,
0.19745583832263947,
-0.012206900864839554,
-0.11405988782644272,
-0.07517550885677338,
-0.02810264565050602,
0.09103139489889145,
-0.013817726634442806,
0.012886416167020798,
-0.045470476150512695,
0.032183047384023666,
-0.1263762265443802,
-0.1597503274679184,
0.08734500408172607,
-0.04441224783658981,
-0.10894393920898438,
-0.025462759658694267,
0.20382575690746307,
-0.007266622502356768,
0.08242089301347733,
0.01605331338942051,
0.010653935372829437,
-0.18066231906414032,
-0.04018142446875572,
0.02645772136747837,
-0.0016437612939625978,
0.005979063920676708,
0.047698814421892166,
0.019091911613941193,
0.06207629665732384,
-0.1069745197892189,
-0.013920160941779613,
0.3158324360847473,
0.15978319942951202,
-0.00912671908736229,
0.14943915605545044,
0.1093616932630539,
-0.08669080585241318,
-0.17238758504390717,
-0.1171615794301033,
-0.1210922971367836,
-0.08425768464803696,
-0.10681738704442978,
-0.1525043100118637,
0.09535340964794159,
-0.03392014652490616,
0.03498011827468872,
0.14615866541862488,
-0.280263751745224,
-0.10949636250734329,
0.13820378482341766,
0.010744688101112843,
0.3510635495185852,
-0.12303631007671356,
-0.044944874942302704,
-0.06214528530836105,
-0.16933435201644897,
0.08021392673254013,
-0.031203703954815865,
0.11581093072891235,
-0.0744495838880539,
0.19395925104618073,
0.01719796098768711,
0.014287159778177738,
0.0916559100151062,
0.05038322135806084,
-0.05808406323194504,
-0.07368700206279755,
-0.10248131304979324,
0.010812131687998772,
0.03546109423041344,
0.010252019390463829,
-0.008802837692201138,
0.0211968794465065,
-0.11341743916273117,
-0.050869911909103394,
-0.06302189081907272,
0.0072614275850355625,
-0.01001308299601078,
-0.042155615985393524,
-0.05533592775464058,
-0.022557416930794716,
-0.020093943923711777,
0.02266426384449005,
0.14185629785060883,
-0.07527699321508408,
0.18586260080337524,
0.02357078716158867,
0.1586609035730362,
-0.11956068128347397,
-0.06724818795919418,
-0.029193658381700516,
-0.05280323326587677,
0.06468886137008667,
-0.08884575963020325,
-0.027708567678928375,
0.1332162618637085,
-0.01903904788196087,
0.04655366763472557,
0.12936700880527496,
0.02046884410083294,
0.015383756719529629,
0.034968774765729904,
-0.2578005790710449,
-0.07463036477565765,
-0.03505445644259453,
-0.012416874058544636,
0.05272092670202255,
0.05525677278637886,
0.19735674560070038,
-0.03551921248435974,
-0.08521962910890579,
0.020131373777985573,
0.02735883742570877,
-0.02776256389915943,
0.10749414563179016,
0.019579345360398293,
-0.004837906453758478,
-0.16151933372020721,
0.08257976174354553,
-0.005964108742773533,
-0.08297000825405121,
0.028665626421570778,
0.2024049311876297,
-0.12141239643096924,
-0.10309756547212601,
-0.06804922968149185,
0.07315051555633545,
-0.09220825880765915,
0.016043387353420258,
-0.005091092549264431,
-0.1521538347005844,
0.06916408240795135,
0.07598215341567993,
0.04075418785214424,
0.06513199955224991,
-0.11743064224720001,
-0.015730571001768112,
-0.04170290008187294,
-0.002195435343310237,
0.03521120920777321,
0.01863143965601921,
-0.057492829859256744,
0.15846455097198486,
-0.0676199421286583,
0.08538917452096939,
-0.0744810476899147,
-0.1058846190571785,
-0.1395980566740036,
0.04660497233271599,
-0.08038312196731567,
-0.07247276604175568,
-0.12832807004451752,
-0.052204377949237823,
-0.0067099276930093765,
-0.03388519585132599,
0.006552806124091148,
-0.06627799570560455,
-0.10922821611166,
0.01822470687329769,
-0.00743203004822135,
-0.009385870769619942,
-0.06096754968166351,
0.026706209406256676,
0.06246216222643852,
-0.039788868278265,
0.15730851888656616,
0.22509248554706573,
-0.13591648638248444,
0.11564400047063828,
-0.09797432273626328,
-0.105463907122612,
0.046008042991161346,
0.009427277371287346,
0.03594303876161575,
0.0503489226102829,
-0.03594081476330757,
0.0044484552927315235,
0.03905477747321129,
0.08074651658535004,
0.08456914126873016,
-0.06776505708694458,
0.020801106467843056,
-0.05122765153646469,
-0.14904099702835083,
-0.016655439510941505,
-0.0464773029088974,
0.06876829266548157,
-0.006725262850522995,
0.11020535975694656,
-0.0515950471162796,
0.07739507406949997,
-0.07558431476354599,
0.050614211708307266,
0.021146971732378006,
-0.14688286185264587,
-0.006612539757043123,
-0.07093682140111923,
0.042144812643527985,
-0.008834975771605968,
0.20241086184978485,
-0.03228091076016426,
0.010342049412429333,
0.033811055123806,
0.06203942745923996,
-0.01957780309021473,
0.009357001632452011,
0.2014283686876297,
0.12640917301177979,
-0.08496357500553131,
-0.02679651789367199,
0.06793134659528732,
0.07248228788375854,
0.07093550264835358,
0.10807815194129944,
-0.015352966263890266,
0.028434239327907562,
0.07829629629850388,
-0.060215238481760025,
0.07576877623796463,
-0.08603982627391815,
-0.11668483167886734,
0.05793621391057968,
0.012955795042216778,
-0.055695828050374985,
0.20305177569389343,
0.19142870604991913,
-0.026278704404830933,
0.018410727381706238,
-0.0029499190859496593,
-0.10117456316947937,
-0.15619947016239166,
-0.05423750728368759,
-0.07170962542295456,
-0.1319410353899002,
-0.004549739416688681,
-0.16646917164325714,
0.022016216069459915,
-0.01132756657898426,
0.09506805986166,
-0.06855440139770508,
-0.01345991250127554,
0.1364889293909073,
-0.1055467277765274,
0.0847758799791336,
-0.024517204612493515,
0.07877567410469055,
-0.03746940940618515,
-0.018209461122751236,
-0.10342709720134735,
0.007514837197959423,
0.01131442841142416,
0.06840907037258148,
-0.10897937417030334,
0.02432350255548954,
-0.12208317965269089,
-0.08617185056209564,
-0.026142612099647522,
0.09279687702655792,
-0.0403008833527565,
0.15116846561431885,
0.02645145356655121,
-0.06710928678512573,
-0.004313822835683823,
0.2646709978580475,
-0.08046227693557739,
-0.08319197595119476,
-0.030799202620983124,
0.2152107208967209,
0.04053696244955063,
0.06396269053220749,
0.019140036776661873,
0.038027774542570114,
-0.07184682041406631,
0.2957373559474945,
0.34401440620422363,
-0.1318037211894989,
-0.007773484103381634,
0.04225075617432594,
0.04406323283910751,
0.14687567949295044,
0.07998795062303543,
0.11360671371221542,
0.2849363386631012,
-0.09197647124528885,
0.016657205298542976,
-0.04230864346027374,
-0.01424806285649538,
-0.06908884644508362,
0.045314885675907135,
0.08216670155525208,
-0.09241747111082077,
-0.022950593382120132,
0.08125471323728561,
-0.29741767048835754,
0.10791494697332382,
-0.15600289404392242,
-0.14948409795761108,
-0.05027429759502411,
-0.008771711029112339,
0.014683255925774574,
0.019041186198592186,
0.09663030505180359,
0.025651484727859497,
-0.07275258749723434,
0.07816889137029648,
0.024486342445015907,
-0.23020237684249878,
-0.01345184724777937,
0.1456068754196167,
-0.06789913028478622,
-0.025938833132386208,
-0.021313713863492012,
0.051610056310892105,
0.05763651058077812,
0.09027529507875443,
-0.03809558227658272,
-0.0746568813920021,
-0.007141788024455309,
-0.022818787023425102,
0.01914946548640728,
0.0597183033823967,
0.06841408461332321,
-0.0920223817229271,
0.1167774423956871,
-0.07350476831197739,
0.0650370642542839,
0.037623800337314606,
-0.022277191281318665,
0.0018526542698964477,
0.013183658011257648,
-0.06512464582920074,
0.05533479526638985,
0.1295643299818039,
-0.025459708645939827,
-0.002524374984204769,
-0.028180841356515884,
-0.0767761766910553,
-0.024015206843614578,
-0.04643676429986954,
-0.09101243317127228,
-0.18130090832710266,
-0.12738600373268127,
0.041754670441150665,
-0.03240608796477318,
-0.2046082615852356,
0.0060346988029778,
-0.1128578633069992,
0.03700976446270943,
-0.14154092967510223,
0.10004086047410965,
0.07216610759496689,
0.004716616589576006,
0.006774604320526123,
0.0675399899482727,
0.045677728950977325,
0.14796748757362366,
-0.16543124616146088,
-0.04919974133372307
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-thai-ASR
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6108
- Wer: 0.5636
## Model description
More information needed
## Intended uses & limitations
More information needed
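As a minimal usage sketch (assuming the checkpoint is published under the repository id `Rattana/wav2vec2-thai-ASR` given in this card's metadata and exposes the standard wav2vec2 CTC interface), Thai speech could be transcribed with the `transformers` ASR pipeline; the audio file name below is only a placeholder:
```python
from transformers import pipeline

# Load the fine-tuned checkpoint through the speech-recognition pipeline
# (repository id taken from this card's metadata).
asr = pipeline("automatic-speech-recognition", model="Rattana/wav2vec2-thai-ASR")

# Transcribe an audio file; the model expects 16 kHz mono audio and the
# file name here is a placeholder.
result = asr("example_thai_speech.wav")
print(result["text"])
```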
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP
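For orientation, these settings roughly correspond to the `TrainingArguments` configuration sketched below; this is an illustrative reconstruction (the output directory is an assumption), not the exact training script used for this model:
```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above; Adam betas/epsilon
# and the linear LR scheduler are the library defaults.
training_args = TrainingArguments(
    output_dir="wav2vec2-thai-ASR",     # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,      # effective train batch size of 32
    warmup_steps=500,
    num_train_epochs=20,
    seed=42,
    fp16=True,                          # "Native AMP" mixed precision
)
```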
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 7.1123 | 2.65 | 400 | 3.3946 | 1.0002 |
| 1.5734 | 5.3 | 800 | 0.6881 | 0.7290 |
| 0.5934 | 7.94 | 1200 | 0.5789 | 0.6402 |
| 0.4059 | 10.59 | 1600 | 0.5496 | 0.5976 |
| 0.3136 | 13.24 | 2000 | 0.6109 | 0.5863 |
| 0.2546 | 15.89 | 2400 | 0.6113 | 0.5865 |
| 0.2184 | 18.54 | 2800 | 0.6108 | 0.5636 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-thai-ASR", "results": []}]} | automatic-speech-recognition | Rattana/wav2vec2-thai-ASR | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-thai-ASR
=================
This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6108
* Wer: 0.5636
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 20
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.2
* Pytorch 1.10.0+cu111
* Datasets 1.18.3
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 20\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 20\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
56,
158,
4,
35
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 20\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
-0.1353192776441574,
0.08287180960178375,
-0.0021243419032543898,
0.05959745869040489,
0.12394244968891144,
0.0037996824830770493,
0.12123222649097443,
0.1329241245985031,
-0.10531143844127655,
0.07109106332063675,
0.11423744261264801,
0.11040595918893814,
0.04502908140420914,
0.10410506278276443,
-0.031035253778100014,
-0.3138936161994934,
0.013049645349383354,
0.03153720125555992,
-0.1574816107749939,
0.12401346117258072,
0.10318645089864731,
-0.11258160322904587,
0.04630584642291069,
0.056675929576158524,
-0.1509484350681305,
0.003687629709020257,
-0.014361594803631306,
-0.0832381397485733,
0.117533378303051,
0.038135726004838943,
0.09820997714996338,
0.028332142159342766,
0.08238213509321213,
-0.20671893656253815,
0.00995259452611208,
0.057087864726781845,
0.03252735361456871,
0.08421735465526581,
0.08478651940822601,
-0.010455715470016003,
0.16216981410980225,
-0.06146654859185219,
0.07737254351377487,
0.056562431156635284,
-0.1067618653178215,
-0.32964423298835754,
-0.08695787191390991,
0.07116943597793579,
0.10413642227649689,
0.08654357492923737,
-0.01624532975256443,
0.10536391288042068,
-0.053233202546834946,
0.0908001959323883,
0.2405892312526703,
-0.28455692529678345,
-0.08589577674865723,
-0.032382670789957047,
0.05386868491768837,
0.02727987989783287,
-0.11240366101264954,
-0.01346200704574585,
0.03538290783762932,
0.033819932490587234,
0.10956477373838425,
0.011128528043627739,
-0.01097126118838787,
0.016434533521533012,
-0.15027138590812683,
-0.060868993401527405,
0.14129658043384552,
0.07624214142560959,
-0.05082866549491882,
-0.09350262582302094,
-0.03504396229982376,
-0.20103511214256287,
-0.04528571665287018,
-0.007574558258056641,
0.028566215187311172,
-0.056599847972393036,
-0.13757000863552094,
-0.007750360760837793,
-0.08564475923776627,
-0.10540102422237396,
0.004522913601249456,
0.21765275299549103,
0.048547595739364624,
0.00022512732539325953,
-0.014894220978021622,
0.11214520037174225,
0.05669621378183365,
-0.15555459260940552,
-0.029773911461234093,
0.04091648757457733,
-0.07037749886512756,
-0.021766530349850655,
-0.05587043985724449,
-0.021117214113473892,
-0.006462617311626673,
0.15884660184383392,
-0.038518454879522324,
0.06853058189153671,
0.03629586100578308,
0.02586360089480877,
-0.11265285313129425,
0.21183738112449646,
-0.056375738233327866,
-0.004595942795276642,
-0.034463297575712204,
0.09456518292427063,
0.009752312675118446,
-0.01558311004191637,
-0.07756584137678146,
0.027367454022169113,
0.10301017761230469,
0.03808891400694847,
-0.03100123628973961,
0.036123599857091904,
-0.04668622836470604,
-0.030844856053590775,
0.02163253352046013,
-0.09074850380420685,
0.02232452854514122,
0.018638135865330696,
-0.10023096948862076,
-0.007364179473370314,
0.017627423629164696,
0.02106279321014881,
-0.0012795450165867805,
0.10609130561351776,
-0.08319703489542007,
-0.002732400316745043,
-0.08394477516412735,
-0.09843988716602325,
0.025171766057610512,
-0.035159602761268616,
0.0065809208899736404,
-0.08434225618839264,
-0.1279488354921341,
-0.01682589389383793,
0.05199576914310455,
-0.03511572629213333,
-0.06892997026443481,
-0.04492740333080292,
-0.08398064225912094,
0.05008457601070404,
-0.019184840843081474,
0.15032443404197693,
-0.052530162036418915,
0.10935714840888977,
0.07764513045549393,
0.0621536448597908,
0.03521507978439331,
0.051354072988033295,
-0.05973805859684944,
0.037823569029569626,
-0.1587243378162384,
0.06923960149288177,
-0.08749077469110489,
0.06526675075292587,
-0.13151885569095612,
-0.12900014221668243,
0.00662154471501708,
-0.0038180439732968807,
0.09184451401233673,
0.09907606244087219,
-0.14465002715587616,
-0.11035333573818207,
0.15581868588924408,
-0.07814937084913254,
-0.13444733619689941,
0.12214063853025436,
-0.015719827264547348,
-0.004574209451675415,
0.045357897877693176,
0.1381564885377884,
0.09356901049613953,
-0.09486214071512222,
-0.0059408145025372505,
-0.040612854063510895,
0.10644865036010742,
-0.009662440977990627,
0.11140536516904831,
-0.029685528948903084,
0.01619773730635643,
0.01082470640540123,
-0.058251895010471344,
0.05085013061761856,
-0.10809552669525146,
-0.096368707716465,
-0.04049309343099594,
-0.09581955522298813,
0.033325307071208954,
0.05985740199685097,
0.06734362244606018,
-0.09796889871358871,
-0.14185747504234314,
0.04647738113999367,
0.11781507730484009,
-0.09019942581653595,
0.02968648448586464,
-0.09987734258174896,
0.05743267014622688,
-0.043184954673051834,
-0.009502463974058628,
-0.17550155520439148,
-0.02494010515511036,
0.017316799610853195,
-0.061127983033657074,
0.02278718538582325,
-0.02666359208524227,
0.0911674052476883,
0.06310530751943588,
-0.04970768094062805,
-0.06631867587566376,
-0.0881999209523201,
-0.014995494857430458,
-0.07819581031799316,
-0.20633593201637268,
-0.10469017922878265,
-0.02026965655386448,
0.1518138200044632,
-0.20201437175273895,
0.025674834847450256,
0.03010762855410576,
0.1244608610868454,
0.0314985066652298,
-0.044567760080099106,
-0.022216184064745903,
0.06876497715711594,
-0.028733814135193825,
-0.06688813120126724,
0.03427889570593834,
0.007141138892620802,
-0.1302155703306198,
-0.00955839455127716,
-0.10618849098682404,
0.14179053902626038,
0.11630674451589584,
-0.020631374791264534,
-0.0706968829035759,
-0.01604267954826355,
-0.07945337146520615,
-0.04834816977381706,
-0.012090715579688549,
0.006468383129686117,
0.16727225482463837,
0.026461927220225334,
0.13386909663677216,
-0.0798453614115715,
-0.060740500688552856,
0.039747774600982666,
0.003208504058420658,
-0.015635058283805847,
0.12206641584634781,
0.050693873316049576,
-0.06715305894613266,
0.10301114618778229,
0.09828367084264755,
-0.0965651273727417,
0.14315500855445862,
-0.07468224316835403,
-0.10102850198745728,
-0.026370203122496605,
0.011681271716952324,
0.044521380215883255,
0.11589634418487549,
-0.13848082721233368,
-0.018223261460661888,
0.027299275621771812,
0.007331055123358965,
0.02058015763759613,
-0.21452666819095612,
-0.008913631550967693,
0.05293908715248108,
-0.06076863408088684,
-0.047364141792058945,
0.0001846749655669555,
-0.013052272610366344,
0.08177275955677032,
0.01506213191896677,
-0.060948446393013,
0.003915005829185247,
-0.0007065574172884226,
-0.06697604805231094,
0.2008271962404251,
-0.07387378066778183,
-0.13093455135822296,
-0.15763096511363983,
-0.027121296152472496,
-0.058931246399879456,
-0.004001930821686983,
0.047999441623687744,
-0.1062549278140068,
-0.03295836225152016,
-0.04866928979754448,
0.048694051802158356,
-0.04939420893788338,
0.04446186497807503,
0.03301718458533287,
0.00560538936406374,
0.09475474059581757,
-0.12047694623470306,
0.02093273028731346,
-0.03137366473674774,
-0.05080794543027878,
0.013939431868493557,
0.04260668903589249,
0.11056061834096909,
0.1559198647737503,
0.0197709109634161,
0.038145512342453,
-0.025637805461883545,
0.1900915801525116,
-0.10038866847753525,
-0.048304613679647446,
0.13682828843593597,
0.011371586471796036,
0.04244399815797806,
0.08080106973648071,
0.0693555474281311,
-0.08664016425609589,
0.01605088822543621,
0.0417100265622139,
-0.026308344677090645,
-0.22545114159584045,
-0.021204810589551926,
-0.056815486401319504,
-0.016587939113378525,
0.11776621639728546,
0.033020179718732834,
0.04641502723097801,
0.04528838023543358,
-0.014196149073541164,
0.020133135840296745,
-0.016756141558289528,
0.08530820906162262,
0.0935153141617775,
0.06209028139710426,
0.13451236486434937,
-0.03345886990427971,
-0.04934517666697502,
0.020400723442435265,
-0.013165382668375969,
0.23569819331169128,
0.004029049072414637,
0.18488003313541412,
0.051772620528936386,
0.15881025791168213,
0.01253980677574873,
0.08741862326860428,
0.016035836189985275,
-0.03921615704894066,
0.02253148704767227,
-0.05837436020374298,
-0.03024522215127945,
0.042070500552654266,
0.050496600568294525,
0.06445617228746414,
-0.13504791259765625,
-0.015003146603703499,
0.020642027258872986,
0.3574531078338623,
0.055674195289611816,
-0.34064146876335144,
-0.1254921704530716,
-0.00044041709043085575,
-0.07225100696086884,
-0.03432746231555939,
0.019007032737135887,
0.08481479436159134,
-0.08777321130037308,
0.06890270859003067,
-0.0819801464676857,
0.09856024384498596,
-0.038444314152002335,
0.005069245118647814,
0.08169903606176376,
0.08875464648008347,
0.0013204417191445827,
0.059585150331258774,
-0.24353361129760742,
0.2907114028930664,
-0.008182978257536888,
0.0972382053732872,
-0.04672999680042267,
0.02957816794514656,
0.03702873736619949,
-0.0005754946614615619,
0.053215254098176956,
-0.026306750252842903,
-0.07689279317855835,
-0.19234570860862732,
-0.07285016775131226,
0.02499544247984886,
0.12215746194124222,
-0.07357854396104813,
0.12963096797466278,
-0.02758304961025715,
-0.018434692174196243,
0.06506625562906265,
-0.05636175721883774,
-0.08967060595750809,
-0.09665432572364807,
0.020936207845807076,
0.019851896911859512,
0.058149151504039764,
-0.10793156921863556,
-0.12376696616411209,
-0.05145315080881119,
0.1604686975479126,
-0.07783526927232742,
-0.026133103296160698,
-0.13453854620456696,
0.07481402903795242,
0.15800248086452484,
-0.06990533322095871,
0.05685294792056084,
0.01033300906419754,
0.13756005465984344,
0.026630140841007233,
-0.03684735670685768,
0.09750469028949738,
-0.08573748916387558,
-0.2160804271697998,
-0.029908541589975357,
0.14902834594249725,
0.02529277838766575,
0.05692125856876373,
-0.025755979120731354,
0.03588774800300598,
-0.03505901247262955,
-0.08782794326543808,
0.05573633685708046,
-0.018602248281240463,
0.020970473065972328,
0.016633301973342896,
0.001138885854743421,
0.04242681711912155,
-0.06897083669900894,
-0.03776567801833153,
0.13363045454025269,
0.28332963585853577,
-0.08274998515844345,
-0.011102666147053242,
0.04270729050040245,
-0.02119138091802597,
-0.13123758137226105,
0.0169252660125494,
0.12293596565723419,
0.021553272381424904,
-0.011258884333074093,
-0.2109525501728058,
0.06293422728776932,
0.08034610003232956,
-0.0326831080019474,
0.10335120558738708,
-0.30765485763549805,
-0.1452551782131195,
0.12455465644598007,
0.11080410331487656,
0.0055872476659715176,
-0.15768839418888092,
-0.06393653899431229,
-0.016953760758042336,
-0.1234428808093071,
0.0930631086230278,
-0.04360714182257652,
0.11964232474565506,
-0.018350420519709587,
0.06842653453350067,
0.01317005604505539,
-0.05179896205663681,
0.1519871950149536,
-0.00871429406106472,
0.060508061200380325,
-0.0026642607990652323,
0.04084523394703865,
0.04664811119437218,
-0.05912262201309204,
0.015393571928143501,
-0.08667127788066864,
0.030178692191839218,
-0.11652616411447525,
-0.03656160831451416,
-0.09408709406852722,
0.04430775344371796,
-0.03404216095805168,
-0.037112388759851456,
-0.01861308328807354,
0.023280516266822815,
0.022562207654118538,
-0.00936230830848217,
0.17186643183231354,
-0.01588904857635498,
0.16883637011051178,
0.1152707040309906,
0.08809897303581238,
-0.018711943179368973,
-0.10339462012052536,
-0.010981856845319271,
-0.016446644440293312,
0.07609789073467255,
-0.1438990980386734,
0.014141116291284561,
0.1326301544904709,
0.0623478926718235,
0.1278829723596573,
0.07543107122182846,
-0.0667240247130394,
0.024635396897792816,
0.07321786880493164,
-0.09883742779493332,
-0.1299966722726822,
-0.032317765057086945,
0.027272306382656097,
-0.1412716656923294,
0.06562960892915726,
0.0981980562210083,
-0.06693506985902786,
-0.009034103713929653,
0.00835849717259407,
-0.004060840699821711,
-0.0568622387945652,
0.2239811271429062,
0.05091779679059982,
0.08762574195861816,
-0.10465353727340698,
0.07566040009260178,
0.03930683434009552,
-0.14264705777168274,
0.011938405223190784,
0.06842564046382904,
-0.046197760850191116,
-0.012910234741866589,
0.012044484727084637,
0.08435387909412384,
-0.049821916967630386,
-0.061683230102062225,
-0.14844860136508942,
-0.1387277990579605,
0.08885867893695831,
0.12404676526784897,
0.055768050253391266,
0.029248690232634544,
-0.05727395415306091,
0.06108638271689415,
-0.11613176763057709,
0.09215826541185379,
0.07843878120183945,
0.08364368975162506,
-0.1557675302028656,
0.15910713374614716,
0.014232046902179718,
0.030103011056780815,
0.0014126815367490053,
-0.006995649542659521,
-0.08995320647954941,
0.022197850048542023,
-0.13218174874782562,
-0.04590383917093277,
-0.049526654183864594,
0.0018819872057065368,
0.005690243095159531,
-0.0662367194890976,
-0.07955598086118698,
0.03396051749587059,
-0.12415951490402222,
-0.04850421100854874,
0.012836137786507607,
0.04373406246304512,
-0.12586559355258942,
-0.009947946295142174,
0.048334162682294846,
-0.12514829635620117,
0.08181144297122955,
0.06997759640216827,
0.027891289442777634,
0.053972188383340836,
-0.05000549554824829,
0.013221384026110172,
0.04752499237656593,
-0.00653601810336113,
0.03946799412369728,
-0.13639788329601288,
-0.004490274470299482,
-0.022731680423021317,
0.05487306788563728,
-0.005936981178820133,
0.046171002089977264,
-0.13129884004592896,
-0.04452880471944809,
-0.019383102655410767,
-0.053847040981054306,
-0.06553096324205399,
0.045849449932575226,
0.08357244729995728,
0.04013293236494064,
0.1808517724275589,
-0.07184247672557831,
0.020689550787210464,
-0.22461792826652527,
0.010063901543617249,
-0.02631484903395176,
-0.09965252876281738,
-0.07765785604715347,
-0.027511892840266228,
0.07713142037391663,
-0.07099589705467224,
0.08060182631015778,
-0.0647740364074707,
0.07221229374408722,
0.04217942804098129,
-0.057083677500486374,
0.028031280264258385,
0.04570433497428894,
0.2413053810596466,
0.057514578104019165,
-0.01055949367582798,
0.08075172454118729,
0.018732037395238876,
0.06855052709579468,
0.11378665268421173,
0.16876283288002014,
0.14361445605754852,
-0.01148573774844408,
0.11052581667900085,
0.06728927791118622,
-0.08680564165115356,
-0.16936379671096802,
0.0682840496301651,
-0.03437810391187668,
0.13201187551021576,
-0.0026252148672938347,
0.2040315866470337,
0.12179264426231384,
-0.17597618699073792,
0.0444829985499382,
-0.026577332988381386,
-0.0781695619225502,
-0.10021908581256866,
-0.03660377487540245,
-0.06904138624668121,
-0.18731677532196045,
0.022803455591201782,
-0.10337767750024796,
0.04614895582199097,
0.05281072109937668,
0.02748861163854599,
0.010583953931927681,
0.15601497888565063,
0.04875882714986801,
0.013105776160955429,
0.09218060970306396,
0.0031068124808371067,
-0.03165982663631439,
-0.0465131476521492,
-0.09745809435844421,
0.03155435994267464,
-0.03199702128767967,
0.05133255198597908,
-0.06155034527182579,
-0.12449290603399277,
0.06460575014352798,
0.01609036698937416,
-0.11338188499212265,
0.023877089843153954,
0.0034458443988114595,
0.08098756521940231,
0.039360083639621735,
0.016408303752541542,
0.004285427741706371,
-0.020122475922107697,
0.24095167219638824,
-0.1129845529794693,
-0.05873718857765198,
-0.12865863740444183,
0.26048743724823,
0.016500307247042656,
-0.021652277559041977,
0.03456471860408783,
-0.07368964701890945,
-0.029607877135276794,
0.1656544804573059,
0.1282142847776413,
-0.01057505328208208,
-0.025124182924628258,
0.005163045600056648,
-0.01807192713022232,
-0.055107880383729935,
0.08036269247531891,
0.11980008333921432,
0.07728100568056107,
-0.06796935945749283,
-0.041883353143930435,
-0.038403891026973724,
-0.050926897674798965,
-0.013499880209565163,
0.09950720518827438,
0.020698295906186104,
-0.02190803736448288,
-0.03199826180934906,
0.08918410539627075,
-0.05911443755030632,
-0.1032702848315239,
0.058205969631671906,
-0.169388085603714,
-0.18281510472297668,
-0.03472890332341194,
0.06710685789585114,
0.01438694167882204,
0.069815494120121,
0.001558233518153429,
-0.03339182212948799,
0.08852623403072357,
0.0018739871447905898,
-0.06512696295976639,
-0.12468824535608292,
0.11480341106653214,
-0.0794353112578392,
0.19048012793064117,
-0.055064789950847626,
0.04053196683526039,
0.1281130015850067,
0.06572692841291428,
-0.083732970058918,
0.03402812033891678,
0.06698964536190033,
-0.13908597826957703,
0.040645986795425415,
0.17773811519145966,
-0.03100661374628544,
0.1067022755742073,
0.022215479984879494,
-0.14390511810779572,
0.008559462614357471,
-0.08069147169589996,
-0.034339789301157,
-0.06168817728757858,
-0.027157362550497055,
-0.041362520307302475,
0.1237998679280281,
0.2177313268184662,
-0.06361078470945358,
-0.01497628167271614,
-0.058638811111450195,
0.03431519493460655,
0.07148587703704834,
0.09510143101215363,
-0.04235456511378288,
-0.28629040718078613,
0.008915677666664124,
0.017999842762947083,
-0.012156421318650246,
-0.2725609540939331,
-0.09688368439674377,
0.04234253615140915,
-0.06654196232557297,
-0.03622165694832802,
0.09041828662157059,
0.08989552408456802,
0.046529997140169144,
-0.052696555852890015,
-0.05839604139328003,
-0.0657094195485115,
0.17696581780910492,
-0.18506132066249847,
-0.06225229799747467
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-thai-colab
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
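As a hedged sketch of how a checkpoint like this is typically loaded for inference (assuming the repository id `Rattana/wav2vec2-thai-colab` from this card's metadata ships both a processor and a CTC head; the input signal below is a placeholder):
```python
import numpy as np
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Repository id taken from this card's metadata; loading assumes the checkpoint
# includes a feature extractor/tokenizer (processor) and a CTC head.
processor = Wav2Vec2Processor.from_pretrained("Rattana/wav2vec2-thai-colab")
model = Wav2Vec2ForCTC.from_pretrained("Rattana/wav2vec2-thai-colab")

# `speech` stands in for one second of 16 kHz mono audio (placeholder signal).
speech = np.zeros(16000, dtype=np.float32)
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```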
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-thai-colab", "results": []}]} | automatic-speech-recognition | Rattana/wav2vec2-thai-colab | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
|
# wav2vec2-thai-colab
This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
| [
"# wav2vec2-thai-colab\n\nThis model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the None dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 20\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.11.3\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"# wav2vec2-thai-colab\n\nThis model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the None dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 20\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.11.3\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.10.3"
] | [
56,
42,
6,
12,
8,
3,
117,
35
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n# wav2vec2-thai-colab\n\nThis model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the None dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0003\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 20\n- mixed_precision_training: Native AMP### Framework versions\n\n- Transformers 4.11.3\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.10.3"
] | [
-0.09788200259208679,
0.11390691250562668,
-0.0024988888762891293,
0.05787764489650726,
0.12330528348684311,
0.00790488626807928,
0.10984362661838531,
0.1260383278131485,
-0.049589067697525024,
0.08363527059555054,
0.07504524290561676,
0.04261627048254013,
0.07992567121982574,
0.16437067091464996,
-0.026724085211753845,
-0.2398584485054016,
0.0264145378023386,
-0.0348849818110466,
-0.07721684873104095,
0.10104292631149292,
0.07651378214359283,
-0.10438928008079529,
0.06533338129520416,
0.019090771675109863,
-0.14515459537506104,
0.017204472795128822,
-0.04109756648540497,
-0.06977483630180359,
0.11230340600013733,
0.00491297198459506,
0.06365248560905457,
0.03995335102081299,
0.13847263157367706,
-0.2451086789369583,
0.0030227540992200375,
0.08941488713026047,
0.03516311198472977,
0.07762178033590317,
0.070844866335392,
0.0016779439756646752,
0.09525575488805771,
-0.14695115387439728,
0.10191935300827026,
0.040346547961235046,
-0.06689602136611938,
-0.18959815800189972,
-0.087274469435215,
0.06433621793985367,
0.11212922632694244,
0.10813673585653305,
-0.004531007260084152,
0.09554143249988556,
-0.0826331377029419,
0.0690288245677948,
0.22671407461166382,
-0.2598608136177063,
-0.05756840482354164,
-0.007199753541499376,
0.04030652716755867,
0.04090150445699692,
-0.09899620711803436,
0.01255465392023325,
0.021428190171718597,
0.03175617381930351,
0.11544912308454514,
-0.00997289177030325,
-0.07010721415281296,
-0.01799112930893898,
-0.12244699150323868,
-0.008498191833496094,
0.10079550743103027,
0.05465668812394142,
-0.04191293194890022,
-0.12262944132089615,
-0.05127473175525665,
-0.08060921728610992,
-0.01497435662895441,
-0.05418175831437111,
0.03433968499302864,
-0.04228614643216133,
-0.05666382238268852,
-0.046183206140995026,
-0.0708284005522728,
-0.061656344681978226,
0.011597276665270329,
0.11704094707965851,
0.02761012315750122,
0.01607600413262844,
-0.033084847033023834,
0.08686137944459915,
0.05884712561964989,
-0.11312617361545563,
0.0016020889161154628,
-0.0016578525537624955,
-0.12705887854099274,
-0.022968700155615807,
-0.024873947724699974,
0.0013087757397443056,
0.001787656219676137,
0.1274648904800415,
-0.042372386902570724,
0.10646811127662659,
-0.013093393296003342,
-0.010949106886982918,
-0.02148367464542389,
0.13604168593883514,
-0.05260298028588295,
-0.031615931540727615,
-0.03026970662176609,
0.08386339247226715,
-0.002113102935254574,
-0.02941165119409561,
-0.05687633156776428,
-0.00012084381887689233,
0.06182799115777016,
0.07380713522434235,
-0.04457377642393112,
0.010857458226382732,
-0.04158588871359825,
-0.02974286675453186,
0.00887217465788126,
-0.12467217445373535,
0.04160520061850548,
0.02830926701426506,
-0.059389542788267136,
0.021438071504235268,
0.01599065028131008,
0.012325156480073929,
-0.04727400839328766,
0.10268265008926392,
-0.05447448417544365,
-0.005540609359741211,
-0.044775865972042084,
-0.059287749230861664,
0.014522489160299301,
-0.10278520733118057,
-0.02615107223391533,
-0.0672963559627533,
-0.14779341220855713,
-0.037394728511571884,
0.06489282846450806,
-0.06558476388454437,
-0.02447057142853737,
-0.04293190687894821,
-0.04355509579181671,
0.030629053711891174,
-0.02864273451268673,
0.1703532487154007,
-0.05469518527388573,
0.058698445558547974,
-0.03663348779082298,
0.053360309451818466,
0.052245885133743286,
0.056450504809617996,
-0.05456147342920303,
0.022950870916247368,
-0.09623625874519348,
0.08258698135614395,
-0.09843914210796356,
-0.003825681982561946,
-0.1389971226453781,
-0.08837014436721802,
0.022068046033382416,
-0.0083360830321908,
0.06534532457590103,
0.12531247735023499,
-0.2113341987133026,
-0.05236334726214409,
0.13619278371334076,
-0.08220119029283524,
-0.05151497572660446,
0.08525916188955307,
-0.02857535146176815,
0.01389955636113882,
0.03771177679300308,
0.18564040958881378,
0.06282101571559906,
-0.1614425778388977,
-0.002751612802967429,
-0.007813002914190292,
0.04866456612944603,
0.014161491766571999,
0.03923012316226959,
-0.03483838215470314,
0.04168908670544624,
0.008816269226372242,
-0.04968634247779846,
-0.004784380551427603,
-0.07166484743356705,
-0.07893354445695877,
-0.039642684161663055,
-0.07820568233728409,
0.014604474417865276,
0.002628636546432972,
0.018701860681176186,
-0.0611015185713768,
-0.10730748623609543,
0.06162078678607941,
0.11203358322381973,
-0.0809735432267189,
0.027253368869423866,
-0.0689307302236557,
-0.008820764720439911,
0.022093920037150383,
-0.020649341866374016,
-0.1771927773952484,
-0.06904319673776627,
0.03684213012456894,
-0.09856951981782913,
0.00996861420571804,
0.007448929361999035,
0.059319887310266495,
0.043560583144426346,
-0.041409410536289215,
-0.02777203358709812,
-0.08376723527908325,
0.01621944271028042,
-0.07396189123392105,
-0.20404785871505737,
-0.05825584754347801,
-0.03394605219364166,
0.14798109233379364,
-0.19279839098453522,
-0.004093872383236885,
0.024077974259853363,
0.1483256071805954,
0.028641758486628532,
-0.05558855086565018,
0.007626494392752647,
0.04148191958665848,
0.027727510780096054,
-0.0967586562037468,
0.04390198364853859,
-0.0034718657843768597,
-0.0864400640130043,
-0.009324201382696629,
-0.13711650669574738,
0.02892671525478363,
0.08977528661489487,
0.08560739457607269,
-0.07433486729860306,
-0.025420300662517548,
-0.0739743560552597,
-0.04831390827894211,
-0.08722375333309174,
0.03316274657845497,
0.1954108327627182,
0.04030081257224083,
0.10499390959739685,
-0.057381968945264816,
-0.06364374607801437,
0.03125341981649399,
0.024753516539931297,
-0.022638417780399323,
0.08846575766801834,
0.11311795562505722,
-0.06583595275878906,
0.0671326145529747,
0.09426639974117279,
-0.0006828512996435165,
0.13964508473873138,
-0.04085175320506096,
-0.08257698267698288,
-0.013408936560153961,
-0.008718895725905895,
-0.022639576345682144,
0.1322425752878189,
-0.08278470486402512,
0.011601751670241356,
0.027438916265964508,
0.03845435380935669,
0.03091445006430149,
-0.1692187786102295,
0.0019472414860501885,
0.015029773116111755,
-0.06715862452983856,
-0.036272868514060974,
-0.005665816832333803,
0.026147965341806412,
0.07700183987617493,
0.018192263320088387,
-0.009065700694918633,
0.011265327222645283,
-0.012955217622220516,
-0.09955962747335434,
0.16708123683929443,
-0.12676391005516052,
-0.20601755380630493,
-0.08282072097063065,
0.03391949459910393,
-0.04214762523770332,
-0.05179463326931,
0.027318332344293594,
-0.12706074118614197,
-0.07227854430675507,
-0.08536344021558762,
0.005212515592575073,
-0.0310666561126709,
0.023347025737166405,
0.08181075751781464,
0.006377840414643288,
0.06862810254096985,
-0.11377028375864029,
0.0005816764896735549,
-0.04073857516050339,
-0.052825767546892166,
0.00035597567330114543,
0.07693904638290405,
0.06345182657241821,
0.09526357054710388,
0.010059239342808723,
0.037186890840530396,
-0.03156831115484238,
0.2000913769006729,
-0.08119979500770569,
0.029981650412082672,
0.12190617620944977,
-0.002830376150086522,
0.04297155514359474,
0.11209231615066528,
0.031560108065605164,
-0.10858257859945297,
0.03299085423350334,
0.0773860514163971,
-0.019438117742538452,
-0.25351184606552124,
-0.048191219568252563,
-0.027922876179218292,
-0.050131622701883316,
0.10990908741950989,
0.04753457382321358,
-0.03563796728849411,
0.017586765810847282,
0.01079204585403204,
-0.022877948358654976,
-0.002912018680945039,
0.052542757242918015,
0.08157972246408463,
0.04370826482772827,
0.09850747883319855,
-0.017815163359045982,
0.004992397967725992,
0.07883580774068832,
0.0019019319443032146,
0.26073363423347473,
0.001539811142720282,
0.07005321979522705,
0.04608900845050812,
0.13398326933383942,
-0.014143338426947594,
0.06019880622625351,
0.023555615916848183,
-0.013670953921973705,
-0.0014238418079912663,
-0.05992673337459564,
-0.012134646996855736,
0.0313558503985405,
0.028321832418441772,
-0.011363222263753414,
-0.09483535587787628,
0.05072875693440437,
0.022515172138810158,
0.3352184593677521,
0.03798436000943184,
-0.23683950304985046,
-0.06091306731104851,
-0.00581321818754077,
-0.06092335283756256,
-0.05490617826581001,
0.02360948920249939,
0.12249311804771423,
-0.13392572104930878,
0.09195391833782196,
-0.06043127551674843,
0.07782814651727676,
-0.043349988758563995,
0.012864209711551666,
0.05151082202792168,
0.11636253446340561,
-0.002039574319496751,
0.05134273320436478,
-0.2299124002456665,
0.24350634217262268,
0.013863061554729939,
0.10999259352684021,
-0.07642998546361923,
0.031478650867938995,
0.008147339336574078,
-0.00890322681516409,
0.11542593687772751,
0.004916145000606775,
-0.09636487066745758,
-0.11621905863285065,
-0.10504189133644104,
0.05099526792764664,
0.1271548718214035,
-0.029483474791049957,
0.061607975512742996,
-0.03279995173215866,
-0.009837914258241653,
0.039219193160533905,
-0.061230581253767014,
-0.1761055588722229,
-0.13590817153453827,
0.026395028457045555,
0.026480812579393387,
-0.012607062235474586,
-0.0696723535656929,
-0.10608602315187454,
-0.025955207645893097,
0.15774519741535187,
-0.022042900323867798,
-0.03716198354959488,
-0.16288508474826813,
0.042409997433423996,
0.1445862501859665,
-0.046503402292728424,
0.03718302398920059,
0.026006953790783882,
0.12695498764514923,
0.013331299647688866,
-0.09050028026103973,
0.0734531432390213,
-0.09609264880418777,
-0.189055398106575,
-0.04487905651330948,
0.14457346498966217,
0.08237764984369278,
0.0337953083217144,
0.000204856667551212,
0.028296567499637604,
0.016547108069062233,
-0.09991046041250229,
0.07011593878269196,
0.0938311219215393,
0.01652667485177517,
0.020764900371432304,
-0.055318333208560944,
-0.007853853516280651,
-0.04051338881254196,
-0.04773891344666481,
0.11793062835931778,
0.22653846442699432,
-0.08671388030052185,
0.14272461831569672,
0.10649282485246658,
-0.0659419521689415,
-0.15385127067565918,
0.05252121388912201,
0.10948292911052704,
0.019325392320752144,
0.045392006635665894,
-0.21709060668945312,
0.11279300600290298,
0.10795216262340546,
-0.015270361676812172,
0.02075192891061306,
-0.2864609658718109,
-0.13512729108333588,
0.08312948048114777,
0.09748461842536926,
0.030235685408115387,
-0.08648224920034409,
-0.027857787907123566,
-0.035846732556819916,
-0.10978182405233383,
0.14196494221687317,
-0.09526176750659943,
0.10398711264133453,
-0.0011185907060280442,
0.0696181058883667,
0.006438111420720816,
-0.03344358876347542,
0.12973757088184357,
0.035672541707754135,
0.06157883256673813,
-0.020106293261051178,
0.06555856764316559,
-0.008261083625257015,
-0.0511772446334362,
0.04349875822663307,
-0.058684092015028,
0.05208849161863327,
-0.10936394333839417,
-0.027498262003064156,
-0.08615868538618088,
0.05690242722630501,
-0.05138852447271347,
-0.057215601205825806,
-0.019059784710407257,
0.05175962299108505,
0.080032579600811,
-0.036775145679712296,
-0.010387775488197803,
-0.003907497506588697,
0.065139040350914,
0.11327153444290161,
0.10377827286720276,
-0.03538054972887039,
-0.07611926645040512,
-0.020314674824476242,
-0.03301836550235748,
0.056250348687171936,
-0.07909747958183289,
0.03333839774131775,
0.12268661707639694,
0.03671543672680855,
0.12993870675563812,
0.028112933039665222,
-0.06803345680236816,
-0.004240470938384533,
0.023944463580846786,
-0.11532753705978394,
-0.14904765784740448,
0.015487441793084145,
-0.0250018872320652,
-0.09100858122110367,
0.009428076446056366,
0.1297035813331604,
-0.045890118926763535,
-0.012807448394596577,
-0.007497149053961039,
0.035599276423454285,
-0.015721026808023453,
0.17933763563632965,
0.012726019136607647,
0.07118009775876999,
-0.10196748375892639,
0.14821314811706543,
0.04308614134788513,
-0.10471288859844208,
0.08909086883068085,
0.09352540969848633,
-0.09857013076543808,
-0.012294979766011238,
0.058526411652565,
0.11184663325548172,
-0.002732133725658059,
-0.05638866871595383,
-0.07960733771324158,
-0.13198277354240417,
0.07275397330522537,
0.13122420012950897,
0.012509697116911411,
-0.01873983070254326,
-0.05176835134625435,
0.024729106575250626,
-0.12590967118740082,
0.0622275173664093,
0.05180421099066734,
0.036425113677978516,
-0.12083451449871063,
0.09939476102590561,
0.03873100131750107,
0.02988906018435955,
-0.013805214315652847,
0.008229873143136501,
-0.10168839991092682,
-0.006551028229296207,
-0.12422560155391693,
-0.025174908339977264,
-0.030000753700733185,
-0.0045102969743311405,
-0.016022715717554092,
-0.05392352491617203,
-0.04527698829770088,
0.041570499539375305,
-0.07615400105714798,
-0.047187428921461105,
-0.0010525548132136464,
0.0411333367228508,
-0.15581776201725006,
0.005040760617703199,
0.027821779251098633,
-0.10406799614429474,
0.09624871611595154,
0.06122670695185661,
0.00016068982949946076,
0.04155504330992699,
-0.11349880695343018,
-0.03968508169054985,
0.028889434412121773,
0.013903548941016197,
0.07166606932878494,
-0.13479670882225037,
-0.022104892879724503,
-0.01295128371566534,
0.040277980268001556,
0.01750171184539795,
0.09282054007053375,
-0.10347778350114822,
-0.02180485799908638,
-0.0680757537484169,
-0.04618385061621666,
-0.054819636046886444,
0.03916331008076668,
0.11284546554088593,
0.03806460276246071,
0.1448061317205429,
-0.08716943114995956,
0.05171642452478409,
-0.1822526603937149,
-0.02765769511461258,
-0.013605240732431412,
-0.01823810487985611,
-0.030589405447244644,
-0.027740810066461563,
0.09313669055700302,
-0.05904252454638481,
0.11598709970712662,
-0.02230432815849781,
0.08122191578149796,
0.04448280110955238,
-0.10727489739656448,
-0.1027618944644928,
0.021673720329999924,
0.15936043858528137,
0.05734366551041603,
0.00041180106927640736,
0.09870045632123947,
-0.02481282874941826,
0.04385620355606079,
0.09959986805915833,
0.2239469438791275,
0.165445014834404,
0.00810493715107441,
0.0695495754480362,
0.07195217162370682,
-0.14288808405399323,
-0.15402278304100037,
0.1264239102602005,
-0.08768332004547119,
0.1395721733570099,
-0.055877555161714554,
0.16652154922485352,
0.06552504003047943,
-0.1803741753101349,
0.04955882579088211,
-0.030440591275691986,
-0.09689836204051971,
-0.12958237528800964,
-0.030686743557453156,
-0.07262279093265533,
-0.132034569978714,
0.02645697258412838,
-0.09855339676141739,
0.057350438088178635,
0.062444090843200684,
0.042756710201501846,
0.03258397802710533,
0.1285007745027542,
-0.007780625484883785,
-0.003037279937416315,
0.09041314572095871,
0.027272986248135567,
-0.03093276172876358,
-0.05315239727497101,
-0.06462226808071136,
0.027894897386431694,
0.028061795979738235,
0.06728940457105637,
-0.036863673478364944,
-0.05290357768535614,
0.04491899162530899,
-0.0006656387122347951,
-0.06736760586500168,
0.02899286337196827,
-0.015441804192960262,
0.035953860729932785,
0.06951364874839783,
0.06793888658285141,
-0.010588066652417183,
-0.03493491932749748,
0.2598130702972412,
-0.08247675001621246,
-0.07270555943250656,
-0.14870184659957886,
0.18232396245002747,
0.012463782913982868,
-0.01069255918264389,
0.05575365945696831,
-0.08041062951087952,
-0.06276053190231323,
0.1820152997970581,
0.13882750272750854,
-0.061155419796705246,
-0.009599843062460423,
-0.017491420730948448,
-0.009948252700269222,
-0.04811307042837143,
0.13935121893882751,
0.11596568673849106,
0.043259382247924805,
-0.060865964740514755,
-0.0070582288317382336,
-0.01985708624124527,
-0.08110368251800537,
-0.06948113441467285,
0.09532888978719711,
0.013495742343366146,
-0.006398646626621485,
-0.03267141059041023,
0.07839956134557724,
-0.006542198359966278,
-0.2022809386253357,
0.015257444232702255,
-0.15855394303798676,
-0.18532633781433105,
-0.0152123486623168,
0.055822085589170456,
-0.005045852158218622,
0.03630607947707176,
0.004140540026128292,
-0.0049493457190692425,
0.1687122881412506,
-0.0025012120604515076,
-0.027040526270866394,
-0.10933035612106323,
0.09072504192590714,
-0.09865979105234146,
0.19055058062076569,
-0.005307212937623262,
0.03792231157422066,
0.08490964770317078,
0.05589072406291962,
-0.11866690963506699,
0.034666724503040314,
0.06208354979753494,
-0.08460021764039993,
0.011594688519835472,
0.1665361374616623,
-0.05310448631644249,
0.09509392082691193,
0.04456986486911774,
-0.12394390255212784,
-0.010081442072987556,
-0.04797748103737831,
-0.02745486982166767,
-0.05033865198493004,
0.007477302569895983,
-0.0531473308801651,
0.13831652700901031,
0.19268308579921722,
-0.037945982068777084,
0.01183896604925394,
-0.07805123180150986,
0.04246136173605919,
0.00862874649465084,
0.07960895448923111,
-0.02807432785630226,
-0.22049710154533386,
0.03148140758275986,
0.026685353368520737,
0.023165009915828705,
-0.20125725865364075,
-0.09708300232887268,
0.02147337794303894,
-0.05842671915888786,
-0.057688768953084946,
0.1021716445684433,
0.027575742453336716,
0.036486636847257614,
-0.03488166630268097,
-0.09063009917736053,
-0.018119391053915024,
0.14747488498687744,
-0.16263653337955475,
-0.025715772062540054
] |
null | null | transformers |
This model is fine-tuned for masked language modeling.
I further pre-trained the xlm-roberta-large model on over half a million tokens of
Hindi fraud call transcripts.
You can load this model with the from_pretrained() method from the transformers library, for example as in the sketch below.
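A minimal usage sketch, assuming the transformers library is installed; the Hindi example sentence is illustrative and not taken from the original card:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# Load the fine-tuned checkpoint with from_pretrained().
tokenizer = AutoTokenizer.from_pretrained("Raviraj/xlm-roberta-large-MLMfintune-hi-fraudcall")
model = AutoModelForMaskedLM.from_pretrained("Raviraj/xlm-roberta-large-MLMfintune-hi-fraudcall")

# Wrap the pair in a fill-mask pipeline; "<mask>" is the XLM-RoBERTa mask token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Illustrative Hindi sentence ("This is a <mask> call."), chosen only for demonstration.
print(fill_mask("यह एक <mask> कॉल है।"))
```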
Please note that it works well on general Hindi, but its results on native-language dialogues are improved
in comparison to general-purpose models. | {} | fill-mask | Raviraj/xlm-roberta-large-MLMfintune-hi-fraudcall | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
This model is fine-tuned for masked language modeling.
I further pre-trained the xlm-roberta-large model on over half a million tokens of
Hindi fraud call transcripts.
You can load this model with the from_pretrained() method from the transformers library.
Please note that it works well on general Hindi, but its results on native-language dialogues are improved
in comparison to general-purpose models. | [] | [
"TAGS\n#transformers #pytorch #safetensors #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
42
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.0688251480460167,
-0.0006219777278602123,
-0.008977851830422878,
0.016062406823039055,
0.107199527323246,
0.009290448389947414,
0.1326877623796463,
0.06040089577436447,
0.05801795423030853,
-0.0015195317100733519,
0.15865245461463928,
0.19646146893501282,
-0.03707929328083992,
0.2128314971923828,
-0.0636933371424675,
-0.22736719250679016,
0.0831809863448143,
0.02399665117263794,
-0.04596315324306488,
0.13289432227611542,
0.07743001729249954,
-0.09078354388475418,
0.0527360662817955,
-0.02064482681453228,
-0.08989014476537704,
0.02641587145626545,
0.07978507876396179,
-0.131717249751091,
0.13515058159828186,
0.003648138139396906,
0.23696967959403992,
0.0329892635345459,
-0.0428297221660614,
-0.08689593523740768,
0.058646947145462036,
0.009419716894626617,
-0.07162898778915405,
0.045247264206409454,
0.012312451377511024,
-0.0952560156583786,
-0.028858594596385956,
0.023897429928183556,
0.04606865718960762,
0.04695940390229225,
-0.13590773940086365,
-0.15055418014526367,
-0.018137050792574883,
0.03945290669798851,
0.06293665617704391,
0.07064250111579895,
0.013664761558175087,
0.23681387305259705,
-0.1127508208155632,
0.10498198121786118,
0.13958081603050232,
-0.29747292399406433,
-0.01152300275862217,
0.05015658214688301,
0.08048558980226517,
-0.04954744875431061,
-0.03653336316347122,
0.05459093675017357,
0.02065996453166008,
0.013696311973035336,
0.06716012954711914,
-0.06490619480609894,
-0.07321973145008087,
-0.019362011924386024,
-0.07978556305170059,
-0.04271140694618225,
0.12449634820222855,
-0.050029631704092026,
0.023060845211148262,
-0.024884890764951706,
-0.13363070785999298,
-0.007576487492769957,
-0.02042660489678383,
-0.01475053932517767,
-0.04255635663866997,
0.020066484808921814,
-0.039881251752376556,
0.007181993220001459,
-0.11874480545520782,
0.011201693676412106,
-0.20752952992916107,
0.2996160387992859,
0.026865923777222633,
0.06728407740592957,
-0.1646410971879959,
0.04177381843328476,
0.003358201589435339,
-0.13362184166908264,
0.05642756074666977,
-0.1071522980928421,
0.03924284502863884,
0.0038877343758940697,
-0.029814448207616806,
-0.04734843596816063,
0.11512459814548492,
0.21394294500350952,
0.026571810245513916,
0.021820439025759697,
0.028108498081564903,
0.08810366690158844,
-0.0027927905321121216,
0.06596379727125168,
0.02563805691897869,
-0.023241594433784485,
0.08313966542482376,
-0.06935924291610718,
0.0690278708934784,
-0.04288124293088913,
-0.0787995457649231,
-0.030246542766690254,
0.041086263954639435,
0.1088138148188591,
0.03869536519050598,
0.05423571169376373,
-0.09292128682136536,
0.029257358983159065,
0.10439173877239227,
-0.08559732139110565,
-0.014207343570888042,
-0.01759207621216774,
0.07325529307126999,
0.047364406287670135,
0.030340086668729782,
-0.01156437024474144,
-0.008168106898665428,
0.12468189001083374,
-0.07715177536010742,
-0.04298538714647293,
-0.04617367684841156,
-0.056599896401166916,
0.027763713151216507,
-0.10458268225193024,
0.040537770837545395,
-0.206930473446846,
-0.1462651491165161,
0.05532076209783554,
0.043738726526498795,
0.023577570915222168,
-0.020184574648737907,
0.03890783339738846,
-0.00874484982341528,
-0.006991815287619829,
-0.044706083834171295,
-0.07091522961854935,
-0.04514395073056221,
0.11232571303844452,
0.015658317133784294,
0.10528077930212021,
-0.0777982845902443,
0.010684207081794739,
-0.11341524869203568,
0.016014378517866135,
-0.16240505874156952,
-0.06330683082342148,
-0.04187411442399025,
0.16276919841766357,
0.010246393270790577,
-0.037461139261722565,
-0.11326467245817184,
0.03330032154917717,
-0.006152770482003689,
0.18608681857585907,
-0.04071662575006485,
-0.11029182374477386,
0.2646951973438263,
-0.15657702088356018,
-0.13690154254436493,
0.0860963985323906,
0.009122688323259354,
-0.0057576983235776424,
0.06757233291864395,
0.08094562590122223,
0.03860408812761307,
-0.14955967664718628,
0.06095769628882408,
0.09209023416042328,
-0.1611911803483963,
-0.11135231703519821,
0.017745625227689743,
0.0018904941389337182,
-0.12247883528470993,
0.0410340391099453,
0.0897873044013977,
0.11386444419622421,
-0.07747910171747208,
-0.06999378651380539,
-0.029918141663074493,
-0.056157421320676804,
0.14125216007232666,
0.03649308532476425,
0.06930318474769592,
-0.09963353723287582,
-0.04323158040642738,
-0.09411724656820297,
0.008646666072309017,
0.05785860866308212,
0.011921326629817486,
-0.11069417744874954,
0.11718270927667618,
-0.035843897610902786,
0.0005573738599196076,
-0.15535257756710052,
-0.153195321559906,
-0.016895392909646034,
0.03642438352108002,
-0.04854366183280945,
0.057937514036893845,
0.12380260229110718,
-0.00011181021545780823,
-0.017216209322214127,
-0.04645795002579689,
0.09210273623466492,
0.03138435631990433,
-0.0027514188550412655,
-0.11111054569482803,
0.037421371787786484,
-0.0892220288515091,
0.04393728822469711,
0.010283181443810463,
0.016323521733283997,
0.0004950435250066221,
0.1374184936285019,
0.01080115232616663,
0.033423565328121185,
-0.04046564921736717,
0.022223934531211853,
-0.025598395615816116,
-0.001965498086065054,
0.07476434856653214,
0.0014406339032575488,
-0.05923888832330704,
0.13430321216583252,
-0.15225130319595337,
0.38713914155960083,
0.18093988299369812,
-0.22953103482723236,
-0.03764719143509865,
0.05116825923323631,
-0.027407558634877205,
0.0073933652602136135,
0.02287248708307743,
0.004810950253158808,
-0.01865869201719761,
0.004673311952501535,
0.13358916342258453,
-0.022487031295895576,
-0.024890096858143806,
0.043635763227939606,
-0.08484487980604172,
-0.03190680220723152,
0.02564266510307789,
0.054780054837465286,
-0.11089853942394257,
0.1687205731868744,
0.23971906304359436,
-0.007523845881223679,
0.13747845590114594,
-0.0026165680028498173,
0.010958119295537472,
0.010664247907698154,
-0.01725691184401512,
-0.006481867749243975,
0.07005836814641953,
-0.1239960640668869,
-0.015148865059018135,
0.06297904998064041,
-0.05033835023641586,
0.032147154211997986,
-0.11893332749605179,
-0.05839109793305397,
0.0035508563742041588,
0.04602736979722977,
-0.04908382520079613,
0.12398181855678558,
0.02204352617263794,
0.09029968827962875,
-0.02611580304801464,
-0.1103406772017479,
0.09383521229028702,
0.00245275953784585,
-0.03193211555480957,
0.17028820514678955,
-0.10446375608444214,
-0.3464675843715668,
-0.1259882003068924,
-0.12774357199668884,
0.02516629919409752,
0.04019615054130554,
0.058636318892240524,
-0.09796301275491714,
-0.07177054136991501,
0.0682084709405899,
-0.005043081007897854,
0.03562590852379799,
0.07588322460651398,
-0.03419438749551773,
0.033425360918045044,
0.005091528873890638,
-0.07121611386537552,
-0.06781496852636337,
-0.038503941148519516,
-0.04744443669915199,
0.16396194696426392,
-0.04235904663801193,
0.0809316411614418,
0.11432496458292007,
0.0012378271203488111,
0.027840884402394295,
-0.010946919210255146,
0.1599227488040924,
-0.07578068971633911,
0.005515585653483868,
0.20724400877952576,
-0.059252023696899414,
0.09972857683897018,
0.17700767517089844,
0.028870997950434685,
-0.04296499863266945,
0.018551424145698547,
-0.055706512182950974,
-0.11921961605548859,
-0.17289718985557556,
-0.11158410459756851,
-0.09002871811389923,
-0.0055516003631055355,
0.03155292570590973,
0.05334706977009773,
0.1416819989681244,
0.11530177295207977,
0.019823599606752396,
-0.06931319087743759,
-0.031015805900096893,
0.04986131563782692,
0.12003350257873535,
-0.018341857939958572,
0.14294087886810303,
-0.0445069782435894,
-0.1757907271385193,
0.049781061708927155,
-0.014339353889226913,
0.11693016439676285,
0.0880466103553772,
-0.05013655126094818,
0.05452413111925125,
0.1706349104642868,
0.15710151195526123,
0.1805204302072525,
0.03927961736917496,
-0.07411108165979385,
0.009676824323832989,
-0.01824476197361946,
-0.05436078459024429,
0.004668451379984617,
0.05767247825860977,
-0.058285366743803024,
-0.031117094680666924,
-0.09087805449962616,
0.07718160003423691,
0.11688245087862015,
0.0501069612801075,
-0.2427925169467926,
0.0074403067119419575,
0.06104592606425285,
0.010644664987921715,
-0.06283354759216309,
0.04722225293517113,
-0.018410205841064453,
-0.11335457116365433,
0.07663505524396896,
-0.06961451470851898,
0.06052969768643379,
0.05981376767158508,
0.06980801373720169,
-0.07291334867477417,
-0.03583504632115364,
0.014538745395839214,
0.05307612195611,
-0.21066462993621826,
0.27787938714027405,
-0.008731366135179996,
-0.004078966565430164,
-0.06967110186815262,
-0.004882844164967537,
0.057438503950834274,
0.11181603372097015,
0.1471492350101471,
0.019239328801631927,
-0.042640406638383865,
-0.12966156005859375,
-0.03984291851520538,
0.044542256742715836,
0.08876435458660126,
-0.005527072586119175,
0.004304515663534403,
-0.03564295917749405,
-0.04699314758181572,
-0.002112128771841526,
0.02340524084866047,
-0.0385478250682354,
-0.10352825373411179,
0.05593708157539368,
0.0583610013127327,
0.016178064048290253,
-0.04928440600633621,
-0.06313423812389374,
-0.09654681384563446,
0.17471195757389069,
-0.0484326146543026,
-0.06301755458116531,
-0.10402138531208038,
-0.12029380351305008,
0.0906924456357956,
-0.10038914531469345,
0.12365206331014633,
-0.08979000896215439,
0.0227696280926466,
-0.10237706452608109,
-0.16006512939929962,
0.13053953647613525,
-0.14751577377319336,
-0.04590624198317528,
-0.07399910688400269,
0.16047577559947968,
-0.05456206947565079,
0.013567314483225346,
0.011484915390610695,
0.03608345985412598,
-0.08742481470108032,
-0.0497378334403038,
0.025199690833687782,
-0.07429742813110352,
0.04325621947646141,
0.054578687995672226,
-0.05533887445926666,
-0.12304601818323135,
0.006209481041878462,
0.026294395327568054,
0.1943604201078415,
0.2771334946155548,
-0.0486922562122345,
0.11734589189291,
0.20023834705352783,
0.00994977168738842,
-0.34405654668807983,
-0.13089178502559662,
-0.14237752556800842,
-0.012246720492839813,
0.02961466833949089,
-0.07376152276992798,
0.10872874408960342,
0.005023897159844637,
-0.07435531169176102,
0.11933901906013489,
-0.13693319261074066,
-0.0926717072725296,
0.24763408303260803,
0.03487536311149597,
0.46856287121772766,
-0.11937931180000305,
-0.04748120903968811,
-0.03427453711628914,
-0.0915013924241066,
0.0324133038520813,
-0.04207684472203255,
0.06873893737792969,
-0.01158691756427288,
0.03162335976958275,
0.0327749140560627,
-0.09997168928384781,
0.10008327662944794,
-0.07803241908550262,
0.03347526863217354,
-0.10799343883991241,
-0.07165541499853134,
0.11102275550365448,
-0.0043715531937778,
-0.0014464337145909667,
-0.008712195791304111,
0.02369030937552452,
0.028963135555386543,
-0.029879385605454445,
-0.08703889697790146,
0.12468701601028442,
0.025782128795981407,
-0.07377615571022034,
0.04348239675164223,
-0.011057124473154545,
-0.03217460587620735,
-0.01672946661710739,
0.1929038017988205,
0.004056375473737717,
0.21974478662014008,
0.09365689009428024,
0.042503856122493744,
-0.1200207844376564,
-0.027849243953824043,
-0.03244537487626076,
-0.0955234095454216,
0.07935938984155655,
0.018465716391801834,
0.055734921246767044,
0.08323386311531067,
-0.007944685406982899,
0.03765840455889702,
0.10141787678003311,
0.007455390878021717,
-0.029081737622618675,
0.17868104577064514,
-0.23860225081443787,
0.011583268642425537,
-0.004838151857256889,
0.00384862395003438,
0.04024059697985649,
0.09525461494922638,
0.1023617833852768,
0.029329271987080574,
-0.04179186373949051,
-0.02279057912528515,
0.01067307498306036,
-0.060275450348854065,
0.0723762959241867,
0.07140426337718964,
0.06617273390293121,
-0.1139318123459816,
0.0016795189585536718,
-0.030260629951953888,
-0.16257520020008087,
-0.028657520189881325,
0.06757321208715439,
-0.12481003999710083,
-0.11039397865533829,
0.01371186226606369,
0.08227931708097458,
-0.05491776019334793,
-0.04936935752630234,
-0.09326868504285812,
-0.1333298683166504,
0.02520179934799671,
0.24295537173748016,
0.10754555463790894,
0.08063920587301254,
0.026035599410533905,
-0.00930735468864441,
-0.028910141438245773,
0.014930606819689274,
0.029536133632063866,
0.02968282252550125,
-0.11068468540906906,
0.020151183009147644,
0.0005312743596732616,
0.12716573476791382,
-0.11579538136720657,
-0.0348488874733448,
-0.1602787971496582,
0.03679540753364563,
-0.04914093017578125,
-0.07634355127811432,
-0.09182701259851456,
-0.06435422599315643,
0.0054199229925870895,
-0.06719081103801727,
-0.04067748412489891,
-0.029460789635777473,
-0.09341930598020554,
0.043434690684080124,
0.043024469166994095,
-0.03261801227927208,
-0.09196880459785461,
-0.05036372318863869,
0.12147179991006851,
-0.0518278107047081,
0.08237647265195847,
0.1296815127134323,
-0.07424261420965195,
0.08181357383728027,
-0.16238614916801453,
-0.10176516324281693,
0.11024286597967148,
-0.0010352354729548097,
0.05713340640068054,
0.052967142313718796,
0.02467094361782074,
0.05374768003821373,
0.027874615043401718,
0.04975787550210953,
0.0672139897942543,
-0.11410431563854218,
0.10611274838447571,
0.024332774803042412,
-0.17593923211097717,
-0.02476082742214203,
-0.11788693815469742,
0.07900475710630417,
-0.04151597619056702,
0.15114019811153412,
-0.05968455225229263,
0.08940088003873825,
-0.06256740540266037,
0.026371417567133904,
-0.03228489309549332,
-0.16128042340278625,
-0.04073919355869293,
-0.020588424056768417,
0.004760523326694965,
-0.009928334504365921,
0.21964116394519806,
-0.01724170334637165,
0.008178074844181538,
0.04715742915868759,
0.058323197066783905,
0.011437716893851757,
0.01226725522428751,
0.1351718306541443,
0.06987974047660828,
-0.051175717264413834,
-0.08483447879552841,
0.053636904805898666,
0.01643437147140503,
-0.14209318161010742,
0.11375506967306137,
0.08563282340765,
0.04509470611810684,
0.08045034110546112,
0.017236821353435516,
0.045946914702653885,
-0.08998570591211319,
-0.22730939090251923,
-0.06151463836431503,
0.009942760691046715,
0.044349391013383865,
-0.01188048068434,
0.18757560849189758,
0.014865344390273094,
0.03246641159057617,
-0.03184039518237114,
-0.009588652290403843,
-0.1962994635105133,
-0.10682153701782227,
-0.08970396220684052,
-0.044044189155101776,
0.03794223815202713,
-0.021908139809966087,
-0.05430839955806732,
0.0972302183508873,
0.030880749225616455,
-0.024846548214554787,
0.18594205379486084,
0.016331827268004417,
0.012787791900336742,
0.021319715306162834,
0.012808633036911488,
0.010594449937343597,
0.043500371277332306,
-0.035239532589912415,
-0.1601358950138092,
0.007052420172840357,
-0.052134472876787186,
-0.006543252617120743,
-0.09636235237121582,
0.03597145527601242,
-0.09118634462356567,
-0.1218310222029686,
-0.06243554502725601,
0.03594132512807846,
-0.03617075830698013,
0.059506457298994064,
-0.010642040520906448,
0.04763888195157051,
0.0201286431401968,
0.1349562555551529,
-0.048453353345394135,
-0.15274065732955933,
-0.05100299045443535,
0.17701692879199982,
0.022324077785015106,
0.09128759056329727,
-0.011590758338570595,
0.02056429535150528,
-0.06949435919523239,
0.2920604348182678,
0.3091326653957367,
-0.025553744286298752,
0.09547238796949387,
0.016072168946266174,
0.02998320199549198,
0.03373197466135025,
0.10352040082216263,
0.09161051362752914,
0.29790550470352173,
-0.08114448189735413,
-0.025506330654025078,
-0.04841962456703186,
-0.029307257384061813,
-0.13417719304561615,
-0.009733080863952637,
0.014419588260352612,
-0.00890540424734354,
-0.06487515568733215,
0.08193706721067429,
-0.1444881409406662,
0.112643301486969,
0.07179682701826096,
-0.19154706597328186,
-0.04973611235618591,
-0.01681424304842949,
0.17391452193260193,
0.02009991742670536,
0.09472446888685226,
-0.03980322927236557,
-0.0814620852470398,
0.010553027503192425,
0.02547893300652504,
-0.16823089122772217,
-0.06515674293041229,
0.0797896683216095,
0.005974937230348587,
0.1305229663848877,
-0.020956432446837425,
0.02883455716073513,
0.08777786046266556,
0.023830696940422058,
-0.03404496982693672,
0.07159103453159332,
0.026214104145765305,
-0.11450250446796417,
-0.042458537966012955,
0.008912011981010437,
-0.0018211461137980223,
-0.10276545584201813,
0.020372914150357246,
-0.14524897933006287,
0.034636374562978745,
-0.07989880442619324,
-0.04103260114789009,
-0.008193403482437134,
0.08703237771987915,
-0.030989376828074455,
0.04809832572937012,
0.047750964760780334,
0.009180465713143349,
-0.015037352219223976,
-0.04807599261403084,
0.019667912274599075,
0.0668729618191719,
-0.10455524176359177,
-0.13590572774410248,
-0.11254703253507614,
-0.039686936885118484,
0.04806091636419296,
-0.008270339109003544,
-0.16902868449687958,
-0.04859336093068123,
-0.13280287384986877,
0.01657620258629322,
-0.17883232235908508,
0.014348543249070644,
0.07266537100076675,
0.05465002730488777,
0.02939320169389248,
-0.02748057432472706,
0.03074190393090248,
0.04932467266917229,
-0.15602757036685944,
-0.09359626471996307
] |
null | null | transformers | DO NOT USE THIS | {} | text-classification | Raychanan/chinese-roberta-wwm-ext-FineTuned-Binary | [
"transformers",
"pytorch",
"jax",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #bert #text-classification #autotrain_compatible #endpoints_compatible #region-us
| DO NOT USE THIS | [] | [
"TAGS\n#transformers #pytorch #jax #bert #text-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
39
] | [
"passage: TAGS\n#transformers #pytorch #jax #bert #text-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.01809483766555786,
0.06673542410135269,
-0.007573992013931274,
0.02542104199528694,
0.19456367194652557,
0.04259642958641052,
0.07180475443601608,
0.11746370047330856,
0.065577931702137,
-0.0401817262172699,
0.11666674166917801,
0.23135067522525787,
-0.03061060607433319,
0.11772982031106949,
-0.10850205272436142,
-0.29288947582244873,
0.06133841350674629,
0.06536684185266495,
0.00420401943847537,
0.11130789667367935,
0.08473420888185501,
-0.09304895997047424,
0.0721178650856018,
-0.03993720933794975,
-0.13622891902923584,
0.041352782398462296,
0.05172041431069374,
-0.13071946799755096,
0.09408676624298096,
0.04389055818319321,
0.15670926868915558,
0.026289479807019234,
-0.05535130202770233,
-0.1438279002904892,
0.03692479059100151,
0.0005937846144661307,
-0.08006128668785095,
0.05240355059504509,
0.08632553368806839,
-0.10694190114736557,
0.013523316942155361,
0.043739791959524155,
0.027750981971621513,
0.04982184246182442,
-0.14666184782981873,
-0.09967022389173508,
-0.0019326872425153852,
0.029109641909599304,
0.06774714589118958,
0.05822955071926117,
0.0010195786599069834,
0.14306534826755524,
-0.1304115504026413,
0.12180308252573013,
0.11148694157600403,
-0.29323816299438477,
-0.01943984627723694,
0.09444136917591095,
0.04154133424162865,
0.05044957250356674,
-0.056781791150569916,
0.043631937354803085,
0.02468041144311428,
0.000946011976338923,
0.014533239416778088,
-0.07892842590808868,
-0.1071673110127449,
0.032310470938682556,
-0.07385499775409698,
-0.04111378639936447,
0.21185067296028137,
-0.049293242394924164,
0.06449735909700394,
-0.02123348042368889,
-0.08135563135147095,
-0.056714851409196854,
-0.03172755241394043,
0.008735861629247665,
-0.03984520584344864,
0.06747489422559738,
0.02921498380601406,
0.013714022003114223,
-0.10257585346698761,
0.021764980629086494,
-0.19535581767559052,
0.1972673535346985,
0.012189923785626888,
0.05059242621064186,
-0.1859312504529953,
0.04326876252889633,
0.02285129576921463,
-0.10319478064775467,
0.05386510491371155,
-0.09923137724399567,
0.028100797906517982,
-0.04234573617577553,
-0.06257519870996475,
-0.04020916298031807,
0.08136691898107529,
0.12868155539035797,
0.04702077805995941,
0.06588312983512878,
-0.029047340154647827,
0.08873984217643738,
0.03929751738905907,
0.12775880098342896,
0.04212596267461777,
-0.0374809093773365,
0.05176495015621185,
-0.11413440853357315,
-0.014094071462750435,
-0.07464064657688141,
-0.15375252068042755,
-0.016235198825597763,
0.08695993572473526,
0.07627255469560623,
0.011462262831628323,
0.08710302412509918,
-0.06540420651435852,
-0.03694767504930496,
0.06530382484197617,
-0.07531051337718964,
0.027230482548475266,
0.023209715262055397,
0.02529485896229744,
0.09126318246126175,
-0.0292835533618927,
0.0028054488357156515,
-0.06840378791093826,
0.13082213699817657,
-0.06151483207941055,
0.005951870232820511,
-0.04267766699194908,
-0.07654917240142822,
0.03411971405148506,
-0.11691025644540787,
0.03771146014332771,
-0.16825729608535767,
-0.09779638797044754,
0.0062947659753263,
0.03158242627978325,
-0.0005342107615433633,
-0.03975434601306915,
-0.030474286526441574,
0.004082287661731243,
0.05146385729312897,
-0.056856293231248856,
-0.06116197258234024,
-0.07278820127248764,
0.09648430347442627,
-0.033643417060375214,
0.07996653765439987,
-0.10526403039693832,
0.07087257504463196,
-0.09409809857606888,
-0.02970806322991848,
-0.1284375637769699,
0.022813236340880394,
-0.046654168516397476,
0.18474748730659485,
0.009493359364569187,
-0.037707120180130005,
-0.04483810439705849,
0.05696108564734459,
-0.07555562257766724,
0.1717727780342102,
-0.0793110728263855,
-0.11701413244009018,
0.20788724720478058,
-0.07954380661249161,
-0.13876761496067047,
0.08493359386920929,
-0.01615220494568348,
0.017958471551537514,
0.10674294829368591,
0.19496823847293854,
0.08540180325508118,
-0.003906393423676491,
0.08904492110013962,
0.11408630758523941,
-0.07959926128387451,
-0.1108965128660202,
0.005469737574458122,
0.007978282868862152,
-0.15063118934631348,
0.05894370377063751,
0.07545538246631622,
0.07129368931055069,
-0.0513894185423851,
-0.04089389741420746,
-0.003922690637409687,
-0.005997346714138985,
0.11302582919597626,
0.06324242800474167,
0.12578195333480835,
-0.08385848253965378,
-0.004017953295260668,
-0.0010093949967995286,
-0.010608415119349957,
0.025180591270327568,
0.0166823398321867,
-0.06790076941251755,
0.10298717767000198,
0.004510062281042337,
0.032306913286447525,
-0.2183763086795807,
-0.08230480551719666,
-0.0019306221511214972,
0.12755239009857178,
-0.019294047728180885,
0.11435849964618683,
0.053465988487005234,
-0.06457269191741943,
-0.013794570229947567,
-0.023813555017113686,
0.17901155352592468,
0.022746099159121513,
-0.06343037635087967,
-0.08461648225784302,
0.059753432869911194,
-0.07299339026212692,
-0.012081671506166458,
-0.08169719576835632,
0.012660477310419083,
0.06630835682153702,
0.10648472607135773,
0.019467314705252647,
0.05848253145813942,
-0.02924266830086708,
0.05626295506954193,
-0.06475984305143356,
0.027140533551573753,
0.1179659441113472,
-0.006632041186094284,
-0.056153301149606705,
0.15833188593387604,
-0.13553740084171295,
0.32089653611183167,
0.19996315240859985,
-0.29291605949401855,
-0.003792791860178113,
-0.017230169847607613,
0.002490597777068615,
0.025653522461652756,
0.034587517380714417,
0.009043003432452679,
0.09345389157533646,
-0.008162388578057289,
0.20210638642311096,
-0.030431343242526054,
-0.04563094303011894,
-0.006599363870918751,
-0.045313142240047455,
-0.035156700760126114,
0.0899621918797493,
0.0700104758143425,
-0.21636348962783813,
0.19578737020492554,
0.23056425154209137,
0.014083887450397015,
0.16511985659599304,
-0.0034332124050706625,
0.03841010108590126,
0.07709190249443054,
-0.054916925728321075,
-0.024765321984887123,
-0.06462280452251434,
-0.1797722727060318,
-0.0519074946641922,
0.07089265435934067,
0.026776624843478203,
0.05498277395963669,
-0.10467023402452469,
-0.041482336819171906,
-0.001121786772273481,
0.03263681009411812,
-0.03239976987242699,
0.07532784342765808,
0.06611774861812592,
0.10848522931337357,
0.007556584198027849,
-0.07854785025119781,
0.10835256427526474,
-0.004704562947154045,
-0.08111206442117691,
0.18040013313293457,
-0.14344702661037445,
-0.34762370586395264,
-0.13627612590789795,
-0.20350822806358337,
-0.009711001999676228,
0.04890860617160797,
0.09789098799228668,
-0.11364968121051788,
-0.041684478521347046,
0.04273238033056259,
-0.01656302809715271,
-0.07931388914585114,
0.04226658120751381,
-0.06387604027986526,
0.07653088867664337,
-0.05215940251946449,
-0.05663241818547249,
-0.07558713108301163,
-0.03065023384988308,
-0.011166092939674854,
0.1459382027387619,
-0.12642274796962738,
0.06581402570009232,
0.16313746571540833,
-0.005721473600715399,
0.0682130977511406,
-0.034285202622413635,
0.17564783990383148,
-0.08818710595369339,
-0.03183111548423767,
0.16446498036384583,
-0.0764211043715477,
0.06765677034854889,
0.15842288732528687,
0.03146690875291824,
-0.06440786272287369,
0.02619570679962635,
-0.04366680607199669,
-0.08154077082872391,
-0.2115747332572937,
-0.12536035478115082,
-0.11997322738170624,
0.06674559414386749,
0.0812050923705101,
0.06874731183052063,
0.1319999098777771,
0.054959509521722794,
0.02062775380909443,
0.013091953471302986,
0.011458709836006165,
0.08337462693452835,
0.21880970895290375,
-0.0013363112229853868,
0.14448416233062744,
-0.048022862523794174,
-0.13040171563625336,
0.0763016939163208,
0.00851297378540039,
0.08838732540607452,
0.1124393567442894,
0.02980091981589794,
-0.003721019485965371,
0.08009619265794754,
0.1710960417985916,
0.12011133134365082,
0.02604288049042225,
-0.025675294920802116,
-0.02881801873445511,
-0.00018281576922163367,
-0.07662030309438705,
0.018537059426307678,
0.07926761358976364,
-0.12852145731449127,
-0.07732073217630386,
-0.1605582982301712,
0.07910539954900742,
0.0842241495847702,
0.04793410003185272,
-0.21289688348770142,
0.011515513993799686,
0.09760864078998566,
-0.03371579200029373,
-0.09820050001144409,
0.07870060950517654,
-0.05545899644494057,
-0.14528445899486542,
0.08979272842407227,
-0.0350475050508976,
0.14132331311702728,
-0.0825844332575798,
0.08438211679458618,
-0.038387857377529144,
-0.11987438797950745,
0.03140055760741234,
0.10854005068540573,
-0.28070196509361267,
0.2154713273048401,
0.012321638874709606,
-0.06612604111433029,
-0.07430197298526764,
-0.024338167160749435,
0.04101184755563736,
0.20623712241649628,
0.07433765381574631,
0.0005549621419049799,
-0.08459877222776413,
-0.17614704370498657,
-0.014856216497719288,
0.01006323006004095,
0.11086399853229523,
-0.04531329125165939,
-0.0162801556289196,
-0.04873865842819214,
-0.032460931688547134,
-0.027329416945576668,
-0.036445118486881256,
0.02981879748404026,
-0.16195650398731232,
0.057034727185964584,
0.02994452603161335,
0.0795973539352417,
0.01832551695406437,
-0.054240815341472626,
-0.10906636714935303,
0.20394474267959595,
-0.07091160863637924,
-0.06390447169542313,
-0.11201474070549011,
-0.07614374160766602,
0.015175776556134224,
-0.08341611921787262,
0.04845348000526428,
-0.0809352919459343,
0.02468218095600605,
-0.053535107523202896,
-0.21068227291107178,
0.14016199111938477,
-0.10740610957145691,
-0.024658169597387314,
-0.07782392203807831,
0.13864511251449585,
-0.0789090171456337,
0.024532439187169075,
0.030736668035387993,
0.0298960842192173,
-0.09730913490056992,
-0.06820393353700638,
0.003594380570575595,
0.012733125127851963,
0.050164107233285904,
0.035532619804143906,
-0.09746024757623672,
-0.06438707560300827,
-0.030386028811335564,
0.013417999260127544,
0.2935657799243927,
0.16158808767795563,
-0.06732630729675293,
0.15693336725234985,
0.13342545926570892,
-0.07602952420711517,
-0.3328520953655243,
-0.0860983356833458,
-0.09572628885507584,
-0.04271673038601875,
-0.039317432790994644,
-0.16259606182575226,
0.1202382817864418,
-0.010392402298748493,
-0.02311943843960762,
0.08732824772596359,
-0.14364519715309143,
-0.08308403193950653,
0.18343095481395721,
-0.016732893884181976,
0.4004720151424408,
-0.11624830961227417,
-0.09180434048175812,
-0.05751292034983635,
-0.12759536504745483,
0.14510828256607056,
0.017240293323993683,
0.07660862803459167,
-0.009753178805112839,
0.03606290742754936,
0.04270001873373985,
-0.038564011454582214,
0.09122075885534286,
-0.01076544914394617,
0.013019079342484474,
-0.11043259501457214,
-0.11935783922672272,
0.015380576252937317,
-0.021142421290278435,
-0.02111625112593174,
-0.015980258584022522,
0.002749914303421974,
-0.17170843482017517,
-0.036437150090932846,
-0.0778099000453949,
0.05227721482515335,
0.03218749910593033,
-0.03192440792918205,
0.018695617094635963,
-0.01605111174285412,
-0.003546775784343481,
0.0006239944486878812,
0.2988184094429016,
-0.045211080461740494,
0.19050680100917816,
0.09056143462657928,
0.13118194043636322,
-0.15948843955993652,
0.012801270000636578,
-0.065861776471138,
-0.06396611779928207,
0.07773881405591965,
-0.08855278044939041,
0.07514625042676926,
0.12905162572860718,
-0.05594771355390549,
0.07226260006427765,
0.11423998326063156,
0.051702432334423065,
-0.03501610830426216,
0.159623384475708,
-0.2296721488237381,
0.03815079107880592,
-0.05544954165816307,
-0.0018353760242462158,
0.06467600911855698,
0.04922626167535782,
0.1242470070719719,
0.03997598588466644,
-0.05610830336809158,
0.005762273445725441,
-0.004814090207219124,
-0.003274239832535386,
0.05463983863592148,
0.058361686766147614,
0.04381057992577553,
-0.134842187166214,
0.04394451528787613,
0.05474484711885452,
-0.17748622596263885,
-0.01261796522885561,
0.14715729653835297,
-0.15453575551509857,
-0.12290617823600769,
0.006111426278948784,
0.15460239350795746,
-0.08248449862003326,
-0.04330691695213318,
-0.07279440015554428,
-0.12399396300315857,
0.0693398267030716,
0.19586825370788574,
0.12060099095106125,
0.0778300017118454,
-0.04378306493163109,
-0.042797669768333435,
0.00712791969999671,
0.007344595156610012,
-0.007872226648032665,
0.023717369884252548,
-0.11883671581745148,
0.01251294557005167,
-0.011050865054130554,
0.15153150260448456,
-0.09586972743272781,
-0.07557052373886108,
-0.19212299585342407,
0.041787270456552505,
-0.06242881715297699,
-0.03224949538707733,
-0.07766050845384598,
-0.022732099518179893,
0.004333932884037495,
-0.05241712927818298,
-0.04193229600787163,
-0.06532690674066544,
-0.1307416558265686,
0.032795269042253494,
-0.017392637208104134,
0.04657064378261566,
-0.05510725453495979,
-0.048384327441453934,
0.10124502331018448,
-0.03388887643814087,
0.08910121768712997,
0.10563872754573822,
-0.07282912731170654,
0.11062116175889969,
-0.1283218264579773,
-0.11185171455144882,
0.12193284928798676,
0.024926789104938507,
0.07325582951307297,
0.059187017381191254,
0.03799671307206154,
0.06631976366043091,
0.01165806408971548,
0.06664728373289108,
0.05780986323952675,
-0.12386669963598251,
0.04742882400751114,
-0.019212204962968826,
-0.1837417036294937,
-0.04135940968990326,
-0.042410314083099365,
0.09787000715732574,
-0.003808269975706935,
0.15671633183956146,
-0.05287281423807144,
0.09793701767921448,
-0.042132921516895294,
0.012803508900105953,
-0.023692073300480843,
-0.2205476611852646,
-0.07365892827510834,
-0.08585358411073685,
0.026862455531954765,
-0.004772793967276812,
0.22753120958805084,
0.07045473903417587,
0.03557093068957329,
0.05545854941010475,
0.0576339028775692,
-0.003454787889495492,
0.03151566907763481,
0.18229901790618896,
0.0989723727107048,
-0.05670816823840141,
-0.04705695062875748,
0.07534193992614746,
0.028033806011080742,
0.01864645630121231,
0.12431015819311142,
0.07321953028440475,
-0.012309289537370205,
0.07366625964641571,
-0.022612236440181732,
0.04702895134687424,
-0.10678938031196594,
-0.16222508251667023,
-0.03640706464648247,
0.08139578998088837,
0.01754310168325901,
0.06620614230632782,
0.10388407856225967,
-0.02347712777554989,
0.04392615333199501,
-0.0683072879910469,
-0.047119513154029846,
-0.19352172315120697,
-0.08100902289152145,
-0.10630819201469421,
-0.1066746711730957,
0.0025971289724111557,
-0.07839768379926682,
-0.006405447609722614,
0.07019750773906708,
0.04243552312254906,
-0.04944124445319176,
0.05090705305337906,
0.027607275173068047,
-0.058226849883794785,
0.08426842838525772,
-0.0374758280813694,
0.019005026668310165,
-0.01705215685069561,
-0.015734592452645302,
-0.1414513885974884,
-0.025720052421092987,
-0.04888613149523735,
0.03702576458454132,
-0.06654901802539825,
0.007938358001410961,
-0.1370028704404831,
-0.13525722920894623,
-0.022378528490662575,
0.045916568487882614,
-0.05562008172273636,
0.1206224113702774,
0.0006713551701977849,
0.010930516757071018,
0.049446847289800644,
0.2095327526330948,
-0.06220695748925209,
-0.029235756024718285,
-0.04284632205963135,
0.256413072347641,
0.07069311290979385,
0.1109071746468544,
-0.008587480522692204,
0.0032135164365172386,
-0.0791885107755661,
0.3118537962436676,
0.2932111918926239,
-0.04928450286388397,
0.04837615042924881,
0.01816660352051258,
0.03399038687348366,
0.1385296732187271,
0.14495183527469635,
0.08771999180316925,
0.22766800224781036,
-0.07150748372077942,
-0.021090956404805183,
-0.019412636756896973,
-0.015336482785642147,
-0.11512982845306396,
0.06193622201681137,
0.06605460494756699,
-0.04299676790833473,
-0.06651457399129868,
0.09843623638153076,
-0.19360409677028656,
0.13690705597400665,
0.0028531919233500957,
-0.2249731421470642,
-0.07498964667320251,
-0.03641308471560478,
0.14264386892318726,
-0.0050855111330747604,
0.080597423017025,
-0.0021270429715514183,
-0.10633035004138947,
0.020437195897102356,
0.014396386221051216,
-0.2136639505624771,
-0.04259442165493965,
0.07654968649148941,
-0.05085624009370804,
0.014857068657875061,
-0.025858590379357338,
0.028786558657884598,
0.07043676823377609,
0.051624055951833725,
-0.02011995017528534,
0.024657901376485825,
-0.000578725419472903,
-0.04388389736413956,
0.00410191947594285,
0.026280947029590607,
0.001854399568401277,
-0.06594856828451157,
0.0698620155453682,
-0.1482185274362564,
0.0447433777153492,
-0.11199703812599182,
-0.06083088740706444,
-0.007630004547536373,
0.05529272183775902,
-0.049834296107292175,
0.05740377679467201,
0.10056748241186142,
0.005429130047559738,
-0.041779059916734695,
-0.05086144804954529,
-0.03192179277539253,
0.010067981667816639,
-0.11767444759607315,
-0.15252991020679474,
-0.09751367568969727,
-0.094491146504879,
0.09518249332904816,
0.004092264920473099,
-0.16197417676448822,
0.00041187918395735323,
-0.10356339812278748,
0.055325187742710114,
-0.16911841928958893,
0.08710888773202896,
0.04155683144927025,
0.016016297042369843,
-0.014515646733343601,
-0.07223077863454819,
0.057068344205617905,
0.07699248194694519,
-0.12468259036540985,
-0.09319192171096802
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# QAIDeptModel
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv2](https://huggingface.co/aubmindlab/bert-base-arabertv2) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
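
The following is a rough sketch, not the original training script, of how these settings map onto the `TrainingArguments` API from the `transformers` library; the `output_dir` value is an assumption.

```python
from transformers import TrainingArguments

# Illustrative mapping of the reported hyperparameters onto TrainingArguments.
# The output_dir is assumed; the Adam betas/epsilon shown are the library defaults.
training_args = TrainingArguments(
    output_dir="QAIDeptModel",        # assumed
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```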
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 105 | 2.6675 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "model-index": [{"name": "QAIDeptModel", "results": []}]} | fill-mask | Razan/QAIDeptModel | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #bert #fill-mask #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us
| QAIDeptModel
============
This model is a fine-tuned version of aubmindlab/bert-base-arabertv2 on the None dataset.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.9.0+cu111
* Datasets 1.13.3
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #bert #fill-mask #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
47,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #fill-mask #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.13.3\n* Tokenizers 0.10.3"
] | [
-0.10317882150411606,
0.022239046171307564,
-0.0017435980262234807,
0.11754067987203598,
0.20162378251552582,
0.038211289793252945,
0.11892721801996231,
0.08896594494581223,
-0.10868004709482193,
0.03761880844831467,
0.12871047854423523,
0.14885737001895905,
0.0008000953821465373,
0.10742561519145966,
-0.039918821305036545,
-0.264445424079895,
-0.02950453944504261,
0.03148648887872696,
-0.10379514843225479,
0.12269559502601624,
0.07040615379810333,
-0.1594192534685135,
0.07331300526857376,
-0.005209606606513262,
-0.2437342256307602,
0.017794493585824966,
0.03372946381568909,
-0.05747678503394127,
0.14731089770793915,
0.005095785949379206,
0.17730098962783813,
-0.007122774608433247,
0.10701018571853638,
-0.14504075050354004,
0.017475958913564682,
0.06398153305053711,
0.009278852492570877,
0.077909916639328,
0.04712569713592529,
-0.000457330810604617,
0.0989937037229538,
-0.10649377852678299,
0.07024277001619339,
0.006162276025861502,
-0.13432519137859344,
-0.20662926137447357,
-0.0695531815290451,
-0.011909103021025658,
0.04803026095032692,
0.10017850995063782,
-0.005828937981277704,
0.15889166295528412,
-0.10995274782180786,
0.09765250980854034,
0.24112431704998016,
-0.2684791088104248,
-0.09245483577251434,
0.033659014850854874,
0.006947540678083897,
0.07246804982423782,
-0.11698209494352341,
-0.008799049071967602,
0.051830142736434937,
0.05781159922480583,
0.12953761219978333,
-0.028748685494065285,
-0.09428371489048004,
0.020044885575771332,
-0.14702266454696655,
0.005844178609549999,
0.032747190445661545,
0.018838023766875267,
-0.017600292339920998,
-0.004249947611242533,
-0.07334916293621063,
-0.16116079688072205,
-0.049554914236068726,
-0.028946056962013245,
0.04396919161081314,
-0.06911729276180267,
-0.12000755965709686,
0.011954596266150475,
-0.09601078182458878,
-0.06982466578483582,
-0.07595428824424744,
0.18632251024246216,
0.035698022693395615,
0.0203503780066967,
-0.04496336355805397,
0.0976797565817833,
-0.02306164987385273,
-0.1478622704744339,
0.052125878632068634,
0.03902437537908554,
-0.03918188065290451,
-0.05582892894744873,
-0.07940533757209778,
-0.11025617271661758,
0.005158726125955582,
0.0888589471578598,
-0.03687724098563194,
0.04723774641752243,
0.04621817544102669,
0.038437195122241974,
-0.10173475742340088,
0.19673138856887817,
-0.05006857588887215,
-0.03256263583898544,
-0.00022938412439543754,
0.0492217130959034,
-0.0019060983322560787,
-0.017157411202788353,
-0.1022895947098732,
0.01225265022367239,
0.09797035157680511,
-0.001773762865923345,
-0.06887093186378479,
0.057007357478141785,
-0.03788359463214874,
-0.005337336100637913,
-0.03342136740684509,
-0.0954795554280281,
0.05239998921751976,
-0.00810555461794138,
-0.08087896555662155,
0.0075643728487193584,
0.019513225182890892,
0.009988507255911827,
-0.0016937231412157416,
0.18228113651275635,
-0.10334166139364243,
0.04903026297688484,
-0.1221267357468605,
-0.12895631790161133,
0.0031701764091849327,
-0.07888808101415634,
0.014602811075747013,
-0.09226682782173157,
-0.1269168108701706,
-0.012501058168709278,
0.06587579101324081,
-0.030316481366753578,
-0.019619982689619064,
-0.03896636515855789,
-0.08097860217094421,
0.010954211466014385,
-0.006662057712674141,
0.167181134223938,
-0.047865383327007294,
0.11552990972995758,
0.056333027780056,
0.09230691194534302,
-0.060412485152482986,
0.047209419310092926,
-0.08372119814157486,
0.0018666404066607356,
-0.2330751121044159,
0.02543661557137966,
-0.05207216367125511,
0.06488080322742462,
-0.058435264974832535,
-0.11793100833892822,
0.0019179218215867877,
-0.0040091886185109615,
0.10593221336603165,
0.08934272825717926,
-0.1882433146238327,
-0.09295228123664856,
0.16013288497924805,
-0.05749143660068512,
-0.07934343814849854,
0.12414435297250748,
-0.06405976414680481,
0.013607277534902096,
0.0736267939209938,
0.12930820882320404,
0.04002488777041435,
-0.11717549711465836,
0.020859481766819954,
-0.031725745648145676,
0.055789198726415634,
-0.05263780802488327,
0.03920429199934006,
0.014636789448559284,
0.01436692290008068,
0.02575857751071453,
-0.01560581848025322,
0.062408555299043655,
-0.12715555727481842,
-0.09116113930940628,
-0.03443602845072746,
-0.10721521079540253,
0.07538500428199768,
0.08223703503608704,
0.09238673746585846,
-0.1035018265247345,
-0.08146178722381592,
0.080267034471035,
0.04588065668940544,
-0.04115653038024902,
0.023860951885581017,
-0.05926484242081642,
0.058808255940675735,
-0.06338009238243103,
-0.027545839548110962,
-0.19292575120925903,
-0.0576009601354599,
-0.0032050719019025564,
0.029167236760258675,
0.03260212391614914,
0.016339072957634926,
0.09839852154254913,
0.07975931465625763,
-0.06177356839179993,
0.003118674736469984,
-0.046584319323301315,
-0.00925357360392809,
-0.15658710896968842,
-0.20999692380428314,
-0.03707577660679817,
-0.01745428703725338,
0.08958669006824493,
-0.19744455814361572,
0.019806908443570137,
-0.06371957808732986,
0.08407668769359589,
0.011243407614529133,
-0.01838091015815735,
-0.07375124096870422,
0.11329417675733566,
-0.009680884890258312,
-0.046726156026124954,
0.057937223464250565,
-0.024497009813785553,
-0.07986567914485931,
-0.08560934662818909,
-0.10829086601734161,
0.20010022819042206,
0.13599133491516113,
-0.15839225053787231,
-0.10241542756557465,
0.043152932077646255,
-0.061191707849502563,
-0.020518016070127487,
-0.06332548707723618,
0.05029129981994629,
0.18568313121795654,
0.002719037001952529,
0.1359536051750183,
-0.04866404086351395,
-0.031161168590188026,
0.031073760241270065,
-0.03083384409546852,
0.046689681708812714,
0.09529309719800949,
0.15026912093162537,
-0.03531224653124809,
0.12042082846164703,
0.14699862897396088,
-0.13497817516326904,
0.13305731117725372,
-0.01212221384048462,
-0.08286573737859726,
-0.020291205495595932,
-0.04569277912378311,
0.011697188951075077,
0.12697026133537292,
-0.12265946716070175,
-0.018922429531812668,
-0.001465588342398405,
0.006721006706357002,
0.020936576649546623,
-0.23788881301879883,
-0.05530029162764549,
0.03261100500822067,
-0.009556656703352928,
-0.027608217671513557,
-0.018808305263519287,
0.020749663934111595,
0.11767369508743286,
0.0008302085334435105,
-0.07188265770673752,
0.015809332951903343,
0.0031566787511110306,
-0.060112956911325455,
0.21554920077323914,
-0.06595135480165482,
-0.10588302463293076,
-0.09754503518342972,
-0.08884673565626144,
-0.037608031183481216,
0.013088258914649487,
0.0381486751139164,
-0.12716569006443024,
-0.022853953763842583,
-0.025500651448965073,
0.03060084767639637,
0.02029694989323616,
0.07062256336212158,
0.008488200604915619,
-0.020259160548448563,
0.07137517631053925,
-0.10242126882076263,
-0.0038570333272218704,
-0.07697782665491104,
-0.08470913022756577,
0.059029243886470795,
0.07791474461555481,
0.1333930790424347,
0.16890598833560944,
-0.04256826266646385,
0.0031075291335582733,
-0.016570141538977623,
0.24605420231819153,
-0.07842182368040085,
-0.04036274552345276,
0.0942867174744606,
-0.02720288746058941,
0.05465121939778328,
0.09846951067447662,
0.08910936862230301,
-0.0948871374130249,
0.009913988411426544,
0.046022795140743256,
-0.050131313502788544,
-0.19286002218723297,
-0.03041207045316696,
-0.054043062031269073,
-0.05989258363842964,
0.08187496662139893,
0.015852132812142372,
0.026176709681749344,
0.06532768905162811,
0.06498532742261887,
0.09606348723173141,
-0.07750000059604645,
0.035512544214725494,
0.06148006394505501,
0.04812704399228096,
0.1304578185081482,
-0.028835371136665344,
-0.09902887791395187,
0.011253558099269867,
-0.04860587418079376,
0.22904475033283234,
0.0007121928501874208,
0.06963925808668137,
0.050861358642578125,
0.18554262816905975,
0.0007118091452866793,
0.0971166342496872,
0.007881979458034039,
-0.07750874012708664,
0.0041081709787249565,
-0.04500085860490799,
-0.027743836864829063,
0.009770382195711136,
-0.02252907119691372,
0.0687708854675293,
-0.12140421569347382,
-0.0013284791493788362,
0.049317073076963425,
0.23543477058410645,
0.02845277264714241,
-0.3217484951019287,
-0.07374406605958939,
-0.013907968997955322,
-0.022557534277439117,
-0.00023911305470392108,
-0.001685302471742034,
0.12039417773485184,
-0.0837680846452713,
0.025921553373336792,
-0.07063721120357513,
0.08697622269392014,
0.004134449642151594,
0.04974023997783661,
0.06699445098638535,
0.13485512137413025,
-0.009627135470509529,
0.047996364533901215,
-0.3104017972946167,
0.29150390625,
0.007092882413417101,
0.09731420874595642,
-0.08028781414031982,
-0.010378280654549599,
0.04057520627975464,
0.029744457453489304,
0.038246747106313705,
-0.019143924117088318,
-0.0031268983148038387,
-0.22042900323867798,
-0.03541822358965874,
0.04256240651011467,
0.13066637516021729,
-0.0059580253437161446,
0.10779181122779846,
-0.005252852104604244,
-0.008411449380218983,
0.08425477147102356,
0.011415938846766949,
-0.06896746158599854,
-0.06787362694740295,
-0.024077296257019043,
0.009455898776650429,
-0.09440325945615768,
-0.056530095636844635,
-0.1341955065727234,
-0.14073902368545532,
0.13781210780143738,
0.027000822126865387,
-0.0007995239575393498,
-0.12301095575094223,
0.12036184221506119,
0.08876489847898483,
-0.08287592232227325,
0.04487641155719757,
0.018446246162056923,
0.05112279951572418,
0.023580465465784073,
-0.0653337612748146,
0.11837247759103775,
-0.06605301052331924,
-0.1482972502708435,
-0.07919967919588089,
0.07576441019773483,
0.05532168969511986,
0.07810001075267792,
-0.026408761739730835,
0.028095286339521408,
-0.015749355778098106,
-0.08383543789386749,
0.04857147857546806,
-0.039298154413700104,
0.044481560587882996,
0.03180669620633125,
-0.04612421244382858,
-0.003954660147428513,
-0.05081920698285103,
0.00020373471488710493,
0.17286057770252228,
0.2440778762102127,
-0.09364580363035202,
-0.013799574226140976,
0.03158261254429817,
-0.05200548842549324,
-0.20452654361724854,
0.09535954147577286,
0.08311999589204788,
0.01900179497897625,
0.055797602981328964,
-0.16223521530628204,
0.16316203773021698,
0.09770233929157257,
0.0010211080079898238,
0.13422352075576782,
-0.31417855620384216,
-0.13543477654457092,
0.1057622879743576,
0.1747683584690094,
0.15840153396129608,
-0.1448042392730713,
-0.005281070247292519,
-0.023328835144639015,
-0.10853815823793411,
0.07639297097921371,
-0.08606474101543427,
0.12148343771696091,
-0.01804014854133129,
0.1017015129327774,
0.007426653057336807,
-0.07484612613916397,
0.09653422236442566,
0.004445928148925304,
0.10534447431564331,
-0.06076863780617714,
-0.05440260469913483,
0.047594524919986725,
-0.02229667827486992,
-0.029768338426947594,
-0.03190359100699425,
0.0060850451700389385,
-0.034810785204172134,
-0.013311265967786312,
-0.09682337939739227,
0.04476141929626465,
-0.03076857514679432,
-0.06202223151922226,
-0.017372336238622665,
0.02260972186923027,
0.03886747732758522,
-0.023367606103420258,
0.09875201433897018,
0.01285624410957098,
0.19018107652664185,
0.05580143257975578,
0.05269848555326462,
-0.07245693355798721,
-0.05625801533460617,
0.012040965259075165,
-0.009479571133852005,
0.06977946311235428,
-0.11050362139940262,
0.013184220530092716,
0.15022028982639313,
0.03772936388850212,
0.11132071912288666,
0.0985412523150444,
-0.021504288539290428,
0.02223137952387333,
0.0808112770318985,
-0.16062062978744507,
-0.06702525913715363,
0.013195040635764599,
-0.0916469395160675,
-0.1135701984167099,
0.05073514208197594,
0.07388562709093094,
-0.0742504820227623,
-0.0020059244707226753,
-0.01794801838696003,
-0.02011794224381447,
-0.07995408773422241,
0.23036102950572968,
0.0723431184887886,
0.04630887880921364,
-0.10079071670770645,
0.036230478435754776,
0.04608272761106491,
-0.0896017849445343,
-0.00297720218077302,
0.0896797925233841,
-0.06110437214374542,
-0.01130933128297329,
0.11977910250425339,
0.21129445731639862,
-0.045060135424137115,
-0.007610255386680365,
-0.15228033065795898,
-0.10608469694852829,
0.06368440389633179,
0.21079877018928528,
0.0995422750711441,
-0.015629176050424576,
-0.05873400345444679,
0.04917476698756218,
-0.141943097114563,
0.06634622812271118,
0.04820358753204346,
0.08119940012693405,
-0.12062744051218033,
0.21025912463665009,
0.00425409059971571,
0.048832111060619354,
-0.03570867329835892,
0.039141248911619186,
-0.11919841915369034,
0.02282840944826603,
-0.11919454485177994,
-0.06336008012294769,
-0.005045934114605188,
-0.020354704931378365,
-0.003676649648696184,
-0.07435091584920883,
-0.06769239157438278,
0.0014397124759852886,
-0.12678173184394836,
-0.02795868180692196,
0.04724682867527008,
0.00914476066827774,
-0.12290220707654953,
-0.0360843800008297,
0.026676686480641365,
-0.053154945373535156,
0.04231621325016022,
0.060179147869348526,
0.01946808025240898,
0.08003032952547073,
-0.18201935291290283,
-0.03186623752117157,
0.06111884117126465,
-0.0032441422808915377,
0.10396470874547958,
-0.03210296481847763,
-0.010900996625423431,
-0.0188616756349802,
0.11367948353290558,
0.02068059705197811,
0.07746144384145737,
-0.13645043969154358,
0.012424502521753311,
-0.02959069423377514,
-0.1068752259016037,
-0.05600642040371895,
0.012299523688852787,
0.07452917098999023,
0.006460803095251322,
0.17804740369319916,
-0.09897582232952118,
0.07277751713991165,
-0.2176782339811325,
-0.010306712239980698,
-0.016229961067438126,
-0.09101950377225876,
-0.096713125705719,
-0.05354468524456024,
0.08601633459329605,
-0.05543137714266777,
0.11571680009365082,
0.029979022219777107,
0.07536240667104721,
0.03093292936682701,
-0.026491433382034302,
-0.006403474602848291,
0.027822105213999748,
0.19682732224464417,
0.05198096111416817,
-0.04879259690642357,
0.0621815025806427,
0.08591072261333466,
0.10769727826118469,
0.11752551048994064,
0.2474772334098816,
0.14084988832473755,
0.004422810394316912,
0.09710770845413208,
0.03386479243636131,
-0.056727759540081024,
-0.15033315122127533,
0.00590034993365407,
-0.07773900032043457,
0.08844459801912308,
-0.0259699709713459,
0.18276280164718628,
0.05385029315948486,
-0.15326406061649323,
0.04661209508776665,
-0.06430399417877197,
-0.1000874936580658,
-0.1082688644528389,
0.005311961285769939,
-0.08003679662942886,
-0.12995585799217224,
0.01981426402926445,
-0.09622354060411453,
0.02207113243639469,
0.1211504265666008,
0.014434096403419971,
-0.013192066922783852,
0.2245664894580841,
0.036889299750328064,
0.05982901155948639,
0.0536060631275177,
0.014266595244407654,
-0.012321428395807743,
-0.07333490997552872,
-0.061530351638793945,
-0.04298529401421547,
-0.013879305683076382,
0.03381374105811119,
-0.07627656310796738,
-0.10728041082620621,
0.048079103231430054,
-0.00048579645226709545,
-0.11632400006055832,
0.024998167529702187,
0.015263902954757214,
0.07423905283212662,
0.028782399371266365,
0.00010465137893334031,
0.028134748339653015,
-0.03970131650567055,
0.1962917298078537,
-0.09204915910959244,
-0.08978528529405594,
-0.09653875976800919,
0.2787761688232422,
0.036590736359357834,
0.00015802255074959248,
0.0071416436694562435,
-0.0649292916059494,
-0.004643406718969345,
0.24210432171821594,
0.21257920563220978,
-0.12653811275959015,
0.0005200518062338233,
0.009351294487714767,
-0.013159563764929771,
-0.04825478792190552,
0.1354488730430603,
0.1254747211933136,
0.0738757848739624,
-0.10824579000473022,
-0.046954840421676636,
-0.061801787465810776,
-0.017437726259231567,
-0.04377744346857071,
0.027723737061023712,
0.06880401074886322,
0.024497469887137413,
-0.05751516669988632,
0.06374193727970123,
-0.0703032910823822,
-0.119310162961483,
0.09249762445688248,
-0.22293132543563843,
-0.1709025353193283,
-0.012514535337686539,
0.1207345724105835,
-0.0011366965482011437,
0.08327541500329971,
-0.026452330872416496,
-0.005270794965326786,
0.05121000111103058,
-0.02324812114238739,
-0.05923210829496384,
-0.1007603108882904,
0.10242504626512527,
-0.11098392307758331,
0.19610624015331268,
-0.046547457575798035,
0.07160020619630814,
0.12365472316741943,
0.07815048843622208,
-0.041731178760528564,
0.05102992057800293,
0.04269464313983917,
-0.13034947216510773,
0.004135515075176954,
0.13078437745571136,
-0.03906821832060814,
0.034594230353832245,
0.038576893508434296,
-0.14017702639102936,
0.04150834307074547,
-0.09358241409063339,
-0.03824705258011818,
-0.03532037138938904,
-0.043610911816358566,
-0.055586591362953186,
0.11274053156375885,
0.24083301424980164,
-0.003958006389439106,
0.03108949214220047,
-0.0800229087471962,
0.006115821190178394,
0.056999605149030685,
0.06534532457590103,
-0.11843547970056534,
-0.2575320303440094,
0.017832985147833824,
0.05306575447320938,
-0.04239926114678383,
-0.23744486272335052,
-0.09442552179098129,
0.011234031058847904,
-0.07703082263469696,
-0.08467228710651398,
0.07136452943086624,
0.07837991416454315,
0.06524506211280823,
-0.047147586941719055,
-0.12439220398664474,
-0.06800869852304459,
0.16153071820735931,
-0.16602759063243866,
-0.09214223921298981
] |
null | null | transformers |
# bert-base-spanish-wwm-cased-xnli
**UPDATE, 15.10.2021: Check out our new zero-shot classifiers, much more lightweight and even outperforming this one: [zero-shot SELECTRA small](https://huggingface.co/Recognai/zeroshot_selectra_small) and [zero-shot SELECTRA medium](https://huggingface.co/Recognai/zeroshot_selectra_medium).**
## Model description
This model is a fine-tuned version of the [Spanish BERT model](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased), trained on the Spanish portion of the XNLI dataset. You can have a look at the [training script](https://huggingface.co/Recognai/bert-base-spanish-wwm-cased-xnli/blob/main/zeroshot_training_script.py) for details of the training.
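For illustration, below is a minimal sketch of what such a fine-tuning setup could look like with the `datasets` and `transformers` libraries. The hyperparameters and output directory are illustrative placeholders, not the values we used; the linked training script remains the reference.
```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          DataCollatorWithPadding, Trainer, TrainingArguments)

# Spanish portion of XNLI: premise/hypothesis pairs with 3 labels
# (entailment, neutral, contradiction).
xnli_es = load_dataset("xnli", "es")
tokenizer = AutoTokenizer.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")

def tokenize(batch):
    return tokenizer(batch["premise"], batch["hypothesis"], truncation=True)

encoded = xnli_es.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "dccuchile/bert-base-spanish-wwm-cased", num_labels=3
)

# Illustrative hyperparameters only; see the training script linked above for the real setup.
args = TrainingArguments(output_dir="bert-base-spanish-xnli", learning_rate=2e-5,
                         per_device_train_batch_size=32, num_train_epochs=3)
trainer = Trainer(model=model, args=args,
                  data_collator=DataCollatorWithPadding(tokenizer),
                  train_dataset=encoded["train"], eval_dataset=encoded["validation"])
trainer.train()
```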
### How to use
You can use this model with Hugging Face's [zero-shot-classification pipeline](https://discuss.huggingface.co/t/new-pipeline-for-zero-shot-text-classification/681):
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
model="Recognai/bert-base-spanish-wwm-cased-xnli")
classifier(
"El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
hypothesis_template="Este ejemplo es {}."
)
"""output
{'sequence': 'El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo',
'labels': ['cultura', 'sociedad', 'economia', 'salud', 'deportes'],
'scores': [0.38897448778152466,
0.22997373342514038,
0.1658431738615036,
0.1205764189362526,
0.09463217109441757]}
"""
```
## Eval results
Accuracy for the test set:
| | XNLI-es |
|-----------------------------|---------|
|bert-base-spanish-wwm-cased-xnli | 79.9% | | {"language": "es", "license": "mit", "tags": ["zero-shot-classification", "nli", "pytorch"], "datasets": ["xnli"], "pipeline_tag": "zero-shot-classification", "widget": [{"text": "El autor se perfila, a los 50 a\u00f1os de su muerte, como uno de los grandes de su siglo", "candidate_labels": "cultura, sociedad, economia, salud, deportes"}]} | zero-shot-classification | Recognai/bert-base-spanish-wwm-cased-xnli | [
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"text-classification",
"zero-shot-classification",
"nli",
"es",
"dataset:xnli",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #jax #safetensors #bert #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
| bert-base-spanish-wwm-cased-xnli
================================
UPDATE, 15.10.2021: Check out our new zero-shot classifiers, much more lightweight and even outperforming this one: zero-shot SELECTRA small and zero-shot SELECTRA medium.
Model description
-----------------
This model is a fine-tuned version of the spanish BERT model with the Spanish portion of the XNLI dataset. You can have a look at the training script for details of the training.
### How to use
You can use this model with Hugging Face's zero-shot-classification pipeline:
Eval results
------------
Accuracy for the test set:
| [
"### How to use\n\n\nYou can use this model with Hugging Face's zero-shot-classification pipeline:\n\n\nEval results\n------------\n\n\nAccuracy for the test set:"
] | [
"TAGS\n#transformers #pytorch #jax #safetensors #bert #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### How to use\n\n\nYou can use this model with Hugging Face's zero-shot-classification pipeline:\n\n\nEval results\n------------\n\n\nAccuracy for the test set:"
] | [
72,
38
] | [
"passage: TAGS\n#transformers #pytorch #jax #safetensors #bert #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nYou can use this model with Hugging Face's zero-shot-classification pipeline:\n\n\nEval results\n------------\n\n\nAccuracy for the test set:"
] | [
-0.09726748615503311,
0.12975622713565826,
0.0004552343743853271,
0.07352800667285919,
0.07128740847110748,
-0.0024541441816836596,
0.11879702657461166,
0.12359850853681564,
0.0966503769159317,
0.03268026188015938,
0.15592876076698303,
0.14678628742694855,
0.044673480093479156,
0.21282048523426056,
-0.1053251251578331,
-0.11545498669147491,
0.04519766941666603,
0.030921930447220802,
0.0607934296131134,
0.13524045050144196,
0.08309119939804077,
-0.10390801727771759,
0.09521318972110748,
-0.0030636817682534456,
-0.1526952087879181,
0.022311130538582802,
0.0052598319016397,
-0.092973992228508,
0.10644354671239853,
0.018487248569726944,
0.06251437962055206,
0.043198660016059875,
0.03228102996945381,
-0.1409456729888916,
0.00811215490102768,
-0.002607541624456644,
-0.04137938842177391,
0.0442691333591938,
0.014752399176359177,
-0.05991826951503754,
0.08250594884157181,
-0.03895338624715805,
0.03289685398340225,
-0.002080230275169015,
-0.06680578738451004,
-0.10960046201944351,
-0.008836462162435055,
-0.0056289080530405045,
0.1378430724143982,
0.0706176683306694,
-0.028201164677739143,
0.2537549138069153,
-0.10043115168809891,
0.08815689384937286,
0.12782318890094757,
-0.25339245796203613,
-0.042467035353183746,
0.08290249854326248,
0.045815955847501755,
-0.016009455546736717,
-0.0627034530043602,
0.05956500023603439,
0.1190357357263565,
-0.00437153410166502,
0.04252350702881813,
-0.013851934112608433,
-0.05617576837539673,
0.05011363327503204,
-0.09432408958673477,
-0.045504380017519,
0.19277368485927582,
0.019541947171092033,
-0.056395962834358215,
-0.12581007182598114,
-0.05122658610343933,
-0.11226283758878708,
-0.02689174935221672,
-0.056283000856637955,
0.02703496627509594,
-0.0547187365591526,
-0.11477538198232651,
0.07844314724206924,
-0.08791900426149368,
-0.09870339930057526,
-0.12647636234760284,
0.13869363069534302,
0.04238199442625046,
0.07011909037828445,
-0.09694132953882217,
0.07171989232301712,
-0.05493316426873207,
-0.0897645503282547,
-0.03345240652561188,
-0.07353436201810837,
-0.022881753742694855,
-0.008062449283897877,
-0.0641188696026802,
0.11350762844085693,
0.1363314986228943,
0.1859208345413208,
-0.00495662447065115,
0.008156482130289078,
0.01709621027112007,
0.012019164860248566,
0.027658911421895027,
0.17588858306407928,
0.0040896059945225716,
-0.11544056981801987,
-0.0004946929402649403,
-0.013246320188045502,
0.027006320655345917,
-0.01455180998891592,
-0.06828857958316803,
-0.05516784265637398,
0.0998975858092308,
0.07028032839298248,
-0.025979261845350266,
0.033563967794179916,
-0.022444020956754684,
0.020582852885127068,
0.0008026327705010772,
-0.06279100477695465,
0.056021302938461304,
-0.008921213448047638,
-0.06430526077747345,
-0.04733757674694061,
-0.005258291959762573,
-0.024802595376968384,
-0.02446712553501129,
0.004446649923920631,
-0.03359971195459366,
0.005371120758354664,
-0.058997076004743576,
-0.12407825142145157,
0.013844382017850876,
-0.12301617860794067,
0.04300455376505852,
-0.16753658652305603,
-0.12652048468589783,
-0.01976245827972889,
0.013262332417070866,
-0.03709990531206131,
-0.004287118557840586,
-0.07576490193605423,
-0.011627103202044964,
0.01945791393518448,
-0.03446200489997864,
-0.029204808175563812,
-0.08325619995594025,
0.029134636744856834,
0.0507039949297905,
0.13178329169750214,
-0.003988139797002077,
0.014056689105927944,
-0.1595250368118286,
-0.008638466708362103,
0.022677334025502205,
0.01513733807951212,
-0.07084206491708755,
0.08527502417564392,
-0.047115497291088104,
-0.05652822554111481,
0.04443882778286934,
0.031196625903248787,
0.031615182757377625,
0.17210808396339417,
-0.1634458303451538,
-0.04616125673055649,
0.12838101387023926,
-0.11902929842472076,
-0.21045845746994019,
0.15597355365753174,
-0.024617157876491547,
-0.002841213485226035,
0.05792213976383209,
0.1870289146900177,
0.013487075455486774,
-0.1338217407464981,
0.012234531342983246,
0.06526758521795273,
-0.10982602089643478,
-0.04627528041601181,
0.053402144461870193,
0.06549975275993347,
-0.19321642816066742,
0.04482207074761391,
-0.0458223782479763,
0.01583552174270153,
-0.062439773231744766,
-0.07888434082269669,
-0.07145438343286514,
-0.054440513253211975,
0.04198702052235603,
0.03702589496970177,
0.038562338799238205,
-0.14844809472560883,
-0.043367769569158554,
-0.1187853142619133,
0.07598865032196045,
-0.04239586368203163,
-0.019727183505892754,
-0.10566627979278564,
0.16261433064937592,
-0.09873778373003006,
-0.02714836411178112,
-0.08157921582460403,
-0.04018745943903923,
-0.0057836733758449554,
0.07424348592758179,
-0.0023676857817918062,
0.02665129117667675,
0.03767067566514015,
0.04202587530016899,
0.03296699374914169,
-0.04531029984354973,
0.1603800505399704,
0.015918511897325516,
-0.07265171408653259,
-0.10460234433412552,
0.02161656692624092,
-0.0540396012365818,
0.12090419232845306,
-0.12424708902835846,
0.0030693477019667625,
-0.12724879384040833,
0.016511619091033936,
-0.03913944214582443,
0.07682675123214722,
-0.026037875562906265,
0.016782980412244797,
-0.07365160435438156,
-0.03410332277417183,
0.0804348960518837,
-0.015408352948725224,
-0.08753053843975067,
0.08205000311136246,
-0.16119781136512756,
0.2088446319103241,
0.17530004680156708,
-0.08632154762744904,
-0.056657012552022934,
-0.026657041162252426,
-0.04205192252993584,
0.024334706366062164,
-0.062163855880498886,
0.10095943510532379,
0.006731071509420872,
-0.07386881858110428,
0.11098101735115051,
-0.08580074459314346,
-0.018668072298169136,
0.02828027866780758,
-0.09471213072538376,
-0.0721551775932312,
0.14240309596061707,
0.004202742129564285,
-0.160684272646904,
0.10863381624221802,
0.15312576293945312,
-0.057288963347673416,
0.06415413320064545,
0.01907149888575077,
-0.030320042744278908,
-0.027643244713544846,
-0.03386753797531128,
-0.018980013206601143,
0.09976638108491898,
-0.15628500282764435,
-0.01655050925910473,
0.08719033002853394,
-0.024870170280337334,
0.008690187707543373,
-0.13612648844718933,
-0.04285488277673721,
0.04256758466362953,
-0.017376873642206192,
-0.04558805376291275,
0.08155618607997894,
-0.0030576454009860754,
0.12301595509052277,
-0.04866626113653183,
-0.17937587201595306,
0.029610851779580116,
-0.019900230690836906,
-0.10699822008609772,
0.1790885478258133,
-0.04742796719074249,
-0.23742523789405823,
-0.08340999484062195,
-0.1001739427447319,
-0.08899731934070587,
0.03108169138431549,
0.06732048094272614,
-0.07504002004861832,
-0.04648270830512047,
-0.10210316628217697,
-0.10643680393695831,
-0.010295231826603413,
0.025323819369077682,
-0.07919902354478836,
-0.008118518628180027,
0.03413200005888939,
-0.09559889882802963,
-0.08585111796855927,
-0.03405320644378662,
-0.039765190333127975,
0.14456191658973694,
-0.07967603951692581,
0.11016184836626053,
0.10049454122781754,
-0.09611117839813232,
0.055049821734428406,
-0.03691258653998375,
0.16529449820518494,
-0.05312225595116615,
-0.06749221682548523,
0.14396163821220398,
0.05060506612062454,
0.00838747713714838,
0.1115146279335022,
-0.03195535019040108,
-0.07608624547719955,
0.021051643416285515,
-0.025114212185144424,
-0.07722655683755875,
-0.12430746853351593,
-0.10327290743589401,
-0.0038301541935652494,
0.07760370522737503,
0.10007576644420624,
0.0413995087146759,
0.12141042202711105,
0.10101818293333054,
0.04729963466525078,
-0.07708601653575897,
0.00206032395362854,
0.10049644857645035,
0.05944688618183136,
0.020477909594774246,
0.14175215363502502,
-0.047624360769987106,
-0.10721025615930557,
0.05278978496789932,
-0.03461926802992821,
0.0969032347202301,
0.03777259960770607,
-0.13910523056983948,
0.06367897242307663,
0.15822170674800873,
0.05409640446305275,
0.07931341230869293,
0.0380980484187603,
-0.06050195172429085,
-0.004000114742666483,
-0.03390183299779892,
-0.033361081033945084,
0.04885697737336159,
0.01252337358891964,
0.020627273246645927,
-0.10405994951725006,
-0.059911374002695084,
0.0886627584695816,
0.018368393182754517,
0.1854662299156189,
-0.35615235567092896,
-0.010059695690870285,
0.04043402895331383,
-0.0019277865067124367,
-0.0758948102593422,
0.07675226777791977,
-0.005503031425178051,
-0.07118666172027588,
0.08612919598817825,
-0.02485385537147522,
0.10026547312736511,
0.08109214156866074,
0.0537283681333065,
-0.07202036678791046,
-0.038208167999982834,
-0.010073351673781872,
0.1249232292175293,
-0.25016021728515625,
0.20198838412761688,
-0.02227921597659588,
0.05371525138616562,
-0.0619225837290287,
-0.059507016092538834,
0.08426378667354584,
0.10963104665279388,
0.20995713770389557,
-0.02558761276304722,
-0.09412996470928192,
-0.15366141498088837,
-0.0712989792227745,
0.07244514673948288,
0.0034299450926482677,
0.01668861322104931,
0.07414636015892029,
-0.024887753650546074,
-0.00407807482406497,
0.014806290157139301,
0.17239898443222046,
-0.05520571768283844,
-0.05167238041758537,
-0.05816248431801796,
0.057522185146808624,
-0.049866676330566406,
0.01100221462547779,
-0.07912366092205048,
-0.21111729741096497,
0.13079848885536194,
0.05222896486520767,
-0.007950213737785816,
-0.09155840426683426,
0.011047491803765297,
0.05918461084365845,
-0.0321115106344223,
-0.022099927067756653,
-0.0450243279337883,
0.09771396219730377,
-0.01048806868493557,
-0.12707915902137756,
0.06322028487920761,
-0.08846033364534378,
-0.05090989172458649,
-0.03017314337193966,
0.06905010342597961,
-0.010552182793617249,
0.019351545721292496,
0.08667073398828506,
0.053402725607156754,
-0.11195620894432068,
-0.07716533541679382,
0.058822061866521835,
0.034274086356163025,
0.03689412400126457,
0.07522690296173096,
-0.009888502769172192,
-0.09849447011947632,
-0.07836513221263885,
0.04119815304875374,
0.1712176501750946,
0.28753358125686646,
-0.08896162360906601,
0.03726363554596901,
0.06835608184337616,
-0.0627167671918869,
-0.2585739195346832,
-0.07303488254547119,
-0.10449949651956558,
0.06753981113433838,
0.03235087916254997,
0.09971099346876144,
0.06397116184234619,
0.004287440329790115,
-0.03923897445201874,
0.005036409944295883,
-0.1873297244310379,
-0.11252880096435547,
0.23197950422763824,
0.07507144659757614,
0.276071161031723,
-0.059189531952142715,
0.00032903143437579274,
-0.07139360159635544,
-0.011868095025420189,
0.13030162453651428,
-0.053239285945892334,
0.08657003194093704,
-0.0734434649348259,
0.02730398066341877,
0.034156251698732376,
-0.04871385917067528,
0.16886986792087555,
-0.04034484550356865,
0.11986140161752701,
-0.05544751510024071,
-0.03132012113928795,
-0.03714504465460777,
-0.09218301624059677,
0.16081787645816803,
-0.07266725599765778,
0.05699191614985466,
-0.1269497275352478,
-0.03814353048801422,
-0.03287232294678688,
0.08899068832397461,
0.006457113195210695,
-0.012026211246848106,
-0.07620180398225784,
0.009291131049394608,
0.04228007420897484,
-0.05426082760095596,
0.0956503376364708,
-0.06386521458625793,
0.15134799480438232,
0.08853702992200851,
0.04648484289646149,
-0.039208993315696716,
0.03363099694252014,
0.03759164735674858,
-0.03706471621990204,
0.07817603647708893,
-0.11160636693239212,
0.0628216490149498,
0.12528765201568604,
-0.01759173348546028,
0.09828614443540573,
0.08885061740875244,
0.011333133094012737,
-0.029348233714699745,
0.0916362926363945,
-0.1349371075630188,
-0.00791340321302414,
-0.0230490081012249,
-0.10865308344364166,
0.031964465975761414,
0.07493884861469269,
0.05802784860134125,
-0.07432662695646286,
0.0017469673184677958,
-0.0281208548694849,
0.04081128537654877,
-0.056146081537008286,
0.1037917360663414,
0.053819067776203156,
0.025973275303840637,
-0.1397257298231125,
0.07455110549926758,
-0.030365101993083954,
-0.04874006658792496,
0.03855031728744507,
-0.01927688531577587,
-0.05911426246166229,
-0.10854075849056244,
0.06636872887611389,
0.2511184513568878,
-0.11363805830478668,
-0.1210600733757019,
-0.09539753198623657,
-0.12098407745361328,
0.031053708866238594,
0.10237784683704376,
0.13891316950321198,
0.04456532746553421,
0.007596317678689957,
-0.08207827806472778,
-0.10369434207677841,
0.11035628616809845,
0.074930340051651,
0.022240284830331802,
-0.10556287318468094,
0.08645612001419067,
-0.01107010431587696,
0.10279013961553574,
-0.08918864279985428,
-0.05292610824108124,
-0.14081977307796478,
0.01704324223101139,
-0.139547660946846,
0.027946611866354942,
-0.061145614832639694,
0.004684366751462221,
0.059835921972990036,
-0.008506580255925655,
-0.08021141588687897,
-0.026492036879062653,
-0.07077034562826157,
0.02626829780638218,
0.01992296613752842,
0.06369709223508835,
-0.09030790627002716,
-0.060309458523988724,
0.030292438343167305,
-0.031071169301867485,
0.02982492186129093,
0.06737790256738663,
-0.03589919954538345,
-0.00271774223074317,
-0.20095834136009216,
-0.043459512293338776,
0.04689962416887283,
0.012722231447696686,
0.0808202251791954,
-0.13222065567970276,
0.07208655774593353,
0.10492338240146637,
-0.058745428919792175,
0.055801328271627426,
0.03893113508820534,
-0.0564684122800827,
-0.04463719204068184,
-0.08441026508808136,
-0.10812130570411682,
-0.03651975467801094,
-0.019186701625585556,
0.07469264417886734,
0.05419577658176422,
0.15007561445236206,
-0.0973799079656601,
0.021740633994340897,
-0.12383899837732315,
-0.018987605348229408,
-0.035292815417051315,
-0.10150529444217682,
-0.09356546401977539,
-0.054028939455747604,
0.05642560124397278,
0.0013087426777929068,
0.23936203122138977,
0.10683663934469223,
0.060795918107032776,
0.014520359225571156,
0.03426169231534004,
0.09102019667625427,
-0.016574839130043983,
0.16934622824192047,
-0.005509168840944767,
0.00385128241032362,
-0.0012432036455720663,
0.05293339490890503,
0.06514446437358856,
0.015371656976640224,
0.00835501216351986,
0.06381439417600632,
-0.043178800493478775,
0.12620435655117035,
-0.01085725985467434,
-0.03474438190460205,
-0.017039306461811066,
-0.10533838719129562,
-0.06915687769651413,
0.05447188764810562,
0.05194941163063049,
0.05919817090034485,
0.16654354333877563,
-0.12521138787269592,
0.018745271489024162,
-0.0693451464176178,
-0.055600881576538086,
-0.17232811450958252,
-0.16601286828517914,
-0.12176883220672607,
-0.11976084858179092,
0.031918466091156006,
-0.08574868738651276,
-0.027626685798168182,
0.07063883543014526,
0.05630602315068245,
-0.06758622080087662,
0.04086460545659065,
-0.04982925206422806,
0.016690827906131744,
0.07685147970914841,
-0.02132726088166237,
0.0073601375333964825,
0.03976210206747055,
0.02355422079563141,
0.017507491633296013,
-0.0622393973171711,
0.02752210758626461,
-0.03618587926030159,
0.036140576004981995,
0.06458701938390732,
-0.06542070209980011,
-0.08376242220401764,
-0.0269600972533226,
0.007203824818134308,
0.06078115105628967,
0.10168672353029251,
0.050314631313085556,
0.021749766543507576,
0.01814156211912632,
0.20580393075942993,
-0.07795383036136627,
-0.05033954232931137,
-0.08725666254758835,
0.29312947392463684,
0.0256218109279871,
0.054945945739746094,
-0.0035776877775788307,
-0.020970657467842102,
-0.023280926048755646,
0.2025042474269867,
0.21988900005817413,
-0.03429512679576874,
-0.006975834723562002,
-0.025850240141153336,
0.0009930472588166595,
-0.016598530113697052,
0.0505513995885849,
0.05947002023458481,
0.13938212394714355,
-0.08838341385126114,
0.04431089758872986,
-0.09719591587781906,
-0.012971409596502781,
-0.04750536382198334,
0.055015262216329575,
0.03529830276966095,
-0.06154823303222656,
-0.044666826725006104,
0.08310390263795853,
-0.06441154330968857,
0.043457984924316406,
0.16214017570018768,
-0.10155884176492691,
-0.10056556016206741,
0.04877679422497749,
0.10899728536605835,
0.005012399982661009,
0.0646950975060463,
-0.08171212673187256,
0.08098917454481125,
-0.011690251529216766,
-0.04297429695725441,
-0.10094384104013443,
-0.04228188097476959,
0.028606999665498734,
0.02982502430677414,
0.1575344055891037,
0.01893053948879242,
0.14751818776130676,
0.10800274461507797,
0.01689215376973152,
-0.06973472237586975,
0.16409732401371002,
-0.007681814953684807,
-0.04518890008330345,
0.10358619689941406,
0.02078612893819809,
0.029905298724770546,
-0.032361939549446106,
0.08653922379016876,
-0.05915558710694313,
0.02686355821788311,
-0.12106233835220337,
-0.05683909356594086,
-0.09910595417022705,
0.08673899620771408,
-0.03467744216322899,
0.09739069640636444,
0.1037224605679512,
-0.047800224274396896,
0.0043831379152834415,
-0.04064447432756424,
0.07346710562705994,
-0.01248140912503004,
-0.09129754453897476,
0.031086813658475876,
-0.09782949090003967,
0.033434975892305374,
0.06763019412755966,
-0.013946975581347942,
-0.2513843774795532,
-0.02098201960325241,
-0.08640986680984497,
-0.08374463766813278,
-0.030207214877009392,
0.07861284911632538,
0.08625650405883789,
0.04092032462358475,
-0.04212259501218796,
-0.09185482561588287,
0.04903613403439522,
0.09815365821123123,
-0.052032798528671265,
-0.13574370741844177
] |
null | null | transformers |
# DistilBERT base multilingual model Spanish subset (cased)
This model is the Spanish extract of `distilbert-base-multilingual-cased` (https://huggingface.co/distilbert-base-multilingual-cased), a distilled version of the [BERT base multilingual model](bert-base-multilingual-cased). This model is cased: it does make a difference between english and English.
It uses the extraction method proposed by Geotrend described in https://github.com/Geotrend-research/smaller-transformers.
The resulting model has the same architecture as DistilmBERT: 6 layers, a hidden size of 768, and 12 attention heads, with a total of **63M parameters** (compared to 134M parameters for DistilmBERT).
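As a minimal usage sketch (assuming the standard `fill-mask` pipeline of the `transformers` library; the input sentence reuses this repository's widget example):
```python
from transformers import pipeline

# Standard fill-mask pipeline; the input sentence is this repository's widget example.
unmasker = pipeline("fill-mask", model="Recognai/distilbert-base-es-multilingual-cased")
for prediction in unmasker("Mi nombre es Juan y vivo en [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```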
The goal of this model is to further reduce the size of the `distilbert-base-multilingual` multilingual model by selecting only the most frequent tokens for Spanish, thereby reducing the size of the embedding layer. For more details, see the paper from the Geotrend team: Load What You Need: Smaller Versions of Multilingual BERT. | {"language": "es", "license": "apache-2.0", "datasets": ["wikipedia"], "widget": [{"text": "Mi nombre es Juan y vivo en [MASK]."}]} | fill-mask | Recognai/distilbert-base-es-multilingual-cased | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"fill-mask",
"es",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #safetensors #distilbert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# DistilBERT base multilingual model Spanish subset (cased)
This model is the Spanish extract of 'distilbert-base-multilingual-cased' (URL a distilled version of the BERT base multilingual model. This model is cased: it does make a difference between english and English.
It uses the extraction method proposed by Geotrend described in URL
The resulting model has the same architecture as DistilmBERT: 6 layers, 768 dimension and 12 heads, with a total of 63M parameters (compared to 134M parameters for DistilmBERT).
The goal of this model is to reduce even further the size of the 'distilbert-base-multilingual' multilingual model by selecting only most frequent tokens for Spanish, reducing the size of the embedding layer. For more details visit the paper from the Geotrend team: Load What You Need: Smaller Versions of Multilingual BERT. | [
"# DistilBERT base multilingual model Spanish subset (cased)\n\nThis model is the Spanish extract of 'distilbert-base-multilingual-cased' (URL a distilled version of the BERT base multilingual model. This model is cased: it does make a difference between english and English.\n\nIt uses the extraction method proposed by Geotrend described in URL \n\nThe resulting model has the same architecture as DistilmBERT: 6 layers, 768 dimension and 12 heads, with a total of 63M parameters (compared to 134M parameters for DistilmBERT).\n\nThe goal of this model is to reduce even further the size of the 'distilbert-base-multilingual' multilingual model by selecting only most frequent tokens for Spanish, reducing the size of the embedding layer. For more details visit the paper from the Geotrend team: Load What You Need: Smaller Versions of Multilingual BERT."
] | [
"TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# DistilBERT base multilingual model Spanish subset (cased)\n\nThis model is the Spanish extract of 'distilbert-base-multilingual-cased' (URL a distilled version of the BERT base multilingual model. This model is cased: it does make a difference between english and English.\n\nIt uses the extraction method proposed by Geotrend described in URL \n\nThe resulting model has the same architecture as DistilmBERT: 6 layers, 768 dimension and 12 heads, with a total of 63M parameters (compared to 134M parameters for DistilmBERT).\n\nThe goal of this model is to reduce even further the size of the 'distilbert-base-multilingual' multilingual model by selecting only most frequent tokens for Spanish, reducing the size of the embedding layer. For more details visit the paper from the Geotrend team: Load What You Need: Smaller Versions of Multilingual BERT."
] | [
58,
219
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #distilbert #fill-mask #es #dataset-wikipedia #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# DistilBERT base multilingual model Spanish subset (cased)\n\nThis model is the Spanish extract of 'distilbert-base-multilingual-cased' (URL a distilled version of the BERT base multilingual model. This model is cased: it does make a difference between english and English.\n\nIt uses the extraction method proposed by Geotrend described in URL \n\nThe resulting model has the same architecture as DistilmBERT: 6 layers, 768 dimension and 12 heads, with a total of 63M parameters (compared to 134M parameters for DistilmBERT).\n\nThe goal of this model is to reduce even further the size of the 'distilbert-base-multilingual' multilingual model by selecting only most frequent tokens for Spanish, reducing the size of the embedding layer. For more details visit the paper from the Geotrend team: Load What You Need: Smaller Versions of Multilingual BERT."
] | [
-0.05211328715085983,
0.018428238108754158,
-0.0058491346426308155,
0.1066426932811737,
0.07980499416589737,
-0.004022512119263411,
0.09841731935739517,
0.027846215292811394,
-0.012615399435162544,
0.06918701529502869,
-0.021576207131147385,
0.030934952199459076,
0.017621338367462158,
-0.005053722765296698,
0.017480812966823578,
-0.2818472385406494,
0.04269145429134369,
-0.011493785306811333,
0.0679139718413353,
0.021699383854866028,
0.09218441694974899,
-0.08153665065765381,
0.04950128495693207,
0.02612633816897869,
-0.014774153009057045,
0.066563680768013,
0.007478237152099609,
-0.027038291096687317,
0.07922375947237015,
0.02128998562693596,
0.10743468254804611,
0.014071539975702763,
0.058824289590120316,
-0.07003537565469742,
0.000876786420121789,
0.017486900091171265,
0.023195762187242508,
0.04212755709886551,
0.1150083839893341,
-0.06872791796922684,
-0.00099682598374784,
-0.10487527400255203,
0.01856255903840065,
0.022854290902614594,
-0.06346685439348221,
-0.03566106781363487,
-0.09428831934928894,
0.050393279641866684,
0.06627026200294495,
-0.020980607718229294,
0.015158934518694878,
0.024859905242919922,
-0.072653628885746,
0.039033450186252594,
0.11334601789712906,
-0.27574726939201355,
-0.01698988303542137,
0.1209060400724411,
0.03146800026297569,
0.053260210901498795,
-0.024349309504032135,
0.04948069900274277,
0.03361465781927109,
0.06174862012267113,
0.0000289907657133881,
-0.05227060243487358,
-0.015061595477163792,
-0.07160479575395584,
-0.06278226524591446,
-0.004819326102733612,
0.11687036603689194,
-0.01164727658033371,
-0.062444280833005905,
-0.042287085205316544,
-0.09898413717746735,
0.202412948012352,
-0.06698836386203766,
0.01591331698000431,
0.04456449672579765,
0.018384626135230064,
0.06237218901515007,
-0.11441870778799057,
-0.07582524418830872,
-0.005130601581186056,
-0.1631716936826706,
0.1694253534078598,
0.04725762456655502,
0.030568845570087433,
-0.0010710578644648194,
0.0636187270283699,
-0.13443690538406372,
-0.09265483915805817,
0.002999250777065754,
-0.07122351229190826,
-0.014985193498432636,
0.0367998406291008,
-0.0628807470202446,
-0.09482316672801971,
0.0691271722316742,
0.04269096627831459,
-0.07041922956705093,
0.011005213484168053,
-0.03199899569153786,
0.00010942055087070912,
0.07403133064508438,
0.08730177581310272,
-0.07978886365890503,
-0.03908272087574005,
0.05516286566853523,
-0.032991427928209305,
0.05028287693858147,
0.016203146427869797,
-0.10034793615341187,
-0.05610055848956108,
0.04884699359536171,
0.055747196078300476,
0.027160050347447395,
0.0399048812687397,
-0.02370544895529747,
-0.04336049035191536,
0.09658680111169815,
-0.12224532663822174,
0.064494788646698,
-0.017928173765540123,
0.059227973222732544,
0.21015284955501556,
-0.03466503322124481,
-0.033755168318748474,
-0.07085627317428589,
0.0027676913887262344,
-0.06987237185239792,
-0.015715349465608597,
-0.07615388929843903,
-0.14298197627067566,
0.012796320952475071,
-0.031475845724344254,
-0.039000652730464935,
-0.1164124384522438,
-0.21727882325649261,
-0.023408902809023857,
0.028130361810326576,
-0.06939142942428589,
0.033251095563173294,
-0.0039236340671777725,
0.03522878140211105,
-0.008882471360266209,
-0.025518810376524925,
-0.07165226340293884,
-0.03140370175242424,
0.02250317484140396,
-0.0249464251101017,
0.05071520060300827,
-0.09168623387813568,
0.015568836592137814,
-0.04152838885784149,
0.059703800827264786,
-0.22442829608917236,
0.15848512947559357,
-0.0243060402572155,
0.003647945588454604,
-0.06695272773504257,
-0.05762259662151337,
-0.0560750886797905,
0.05058096721768379,
0.03244619444012642,
0.14545194804668427,
-0.13881157338619232,
-0.04879513755440712,
0.19260123372077942,
-0.1068771481513977,
0.08423318713903427,
0.1224348396062851,
-0.027030328288674355,
-0.015606820583343506,
0.18981987237930298,
0.10240127146244049,
0.14290928840637207,
-0.044858936220407486,
-0.01945088617503643,
0.05905048921704292,
0.006810745224356651,
0.018820099532604218,
0.051362477242946625,
-0.051705989986658096,
-0.08102939277887344,
0.06592564284801483,
-0.04077671840786934,
0.0253582913428545,
-0.03564329072833061,
0.030807482078671455,
-0.04457142949104309,
-0.033487070351839066,
0.09275855869054794,
-0.02610946260392666,
0.04099593684077263,
-0.03948201984167099,
-0.0015478231944143772,
0.18767696619033813,
0.059755392372608185,
-0.0579787977039814,
0.023721277713775635,
-0.03725723549723625,
0.10463828593492508,
-0.17688651382923126,
-0.00618324801325798,
-0.15820622444152832,
-0.13128013908863068,
0.05707106739282608,
0.01121513731777668,
0.005146780516952276,
0.13129888474941254,
0.0437452457845211,
0.07123634219169617,
-0.07340282201766968,
0.051729340106248856,
0.06038111820816994,
-0.03061213530600071,
-0.028549689799547195,
-0.041844744235277176,
-0.03877536579966545,
-0.06441985815763474,
-0.06925380975008011,
0.0073807984590530396,
0.012187480926513672,
-0.10356999933719635,
0.06202803552150726,
-0.04174217954277992,
0.019912568852305412,
0.008434519171714783,
0.03239618241786957,
-0.03825731575489044,
-0.04546840488910675,
0.050855960696935654,
-0.0014948572497814894,
0.04368024691939354,
0.07780160754919052,
-0.03803454339504242,
-0.003391401842236519,
0.0728999525308609,
0.09161971509456635,
0.02175302430987358,
-0.03861422836780548,
0.015483856201171875,
0.007605938706547022,
-0.0561755932867527,
-0.12229286879301071,
0.17197071015834808,
0.016217393800616264,
0.10483266413211823,
-0.13339507579803467,
0.005992016289383173,
-0.0030227394308894873,
-0.050575658679008484,
-0.06752606481313705,
0.030035609379410744,
-0.01965048536658287,
-0.09993480890989304,
0.06220412999391556,
0.030898936092853546,
0.005463931709527969,
0.1306130588054657,
-0.004179688170552254,
-0.055571798235177994,
0.01592794805765152,
0.03226006403565407,
-0.007801470812410116,
-0.02118399553000927,
0.006255891174077988,
-0.007785719819366932,
0.05268317088484764,
0.0714867040514946,
0.04561500996351242,
-0.04971006512641907,
0.06049616262316704,
0.021044135093688965,
-0.043689385056495667,
-0.04294519126415253,
-0.009889911860227585,
0.030404023826122284,
0.08307333290576935,
-0.008124962449073792,
0.0051658558659255505,
-0.01676774024963379,
0.010794405825436115,
-0.031599562615156174,
0.1618562936782837,
-0.14287997782230377,
-0.2192929983139038,
-0.17797064781188965,
-0.12912902235984802,
-0.13766594231128693,
0.0809752494096756,
0.016222231090068817,
-0.0713367834687233,
-0.06697310507297516,
-0.05041614919900894,
0.13718104362487793,
0.023466991260647774,
-0.03637387230992317,
0.01508100051432848,
-0.017302436754107475,
-0.0015821622218936682,
-0.1687323898077011,
-0.0314820297062397,
-0.007170145865529776,
-0.07729624956846237,
-0.007330758031457663,
-0.07066188007593155,
0.04271163046360016,
0.08890434354543686,
-0.01107991673052311,
-0.06405961513519287,
0.021332228556275368,
0.12310726940631866,
0.050979990512132645,
0.08026806265115738,
0.1880107969045639,
0.08050123602151871,
0.07187771052122116,
0.10144507884979248,
0.06649008393287659,
-0.04620286822319031,
-0.015912465751171112,
0.02415148727595806,
-0.041058216243982315,
-0.20107147097587585,
-0.12346287816762924,
-0.051344018429517746,
-0.05003811791539192,
0.0492282472550869,
0.06411754339933395,
-0.10631483793258667,
0.09366178512573242,
-0.052069585770368576,
0.014749820344150066,
0.020175708457827568,
0.04340550675988197,
0.16379418969154358,
-0.034560251981019974,
0.09753595292568207,
-0.02987842634320259,
-0.09213297069072723,
0.13011586666107178,
0.06901483237743378,
0.10557945817708969,
-0.030353179201483727,
-0.004643429070711136,
0.054121267050504684,
-0.007850540801882744,
0.04208273068070412,
0.22864991426467896,
-0.06873676925897598,
0.00351524050347507,
-0.035863492637872696,
-0.024983204901218414,
0.012150966562330723,
-0.0347987562417984,
0.0794687271118164,
-0.008015383034944534,
-0.04351933300495148,
-0.05269491299986839,
0.042511533945798874,
0.21787221729755402,
0.09854045510292053,
-0.1771949976682663,
-0.04481225833296776,
0.06254323571920395,
-0.046784982085227966,
-0.10664904117584229,
0.004463174846023321,
0.12836965918540955,
-0.06406724452972412,
0.10058742761611938,
0.02100110799074173,
0.07611405849456787,
-0.07839196175336838,
0.012019003741443157,
-0.11111222207546234,
0.10026208311319351,
-0.0018615121953189373,
0.07007049769163132,
-0.06815582513809204,
0.09458011388778687,
0.04358827322721481,
0.03242659196257591,
-0.03817972540855408,
0.04978722333908081,
0.011902350932359695,
0.10903045535087585,
0.07427371293306351,
0.02104656584560871,
0.11569103598594666,
-0.058760013431310654,
-0.15523548424243927,
0.04542471468448639,
0.020548086613416672,
-0.05263727158308029,
0.08320749551057816,
-0.018222447484731674,
-0.021740686148405075,
-0.034359171986579895,
-0.04299318790435791,
-0.14842446148395538,
-0.16632875800132751,
-0.03492724150419235,
0.002400572644546628,
-0.054778363555669785,
-0.03983507305383682,
-0.08123132586479187,
-0.09681893140077591,
0.18949201703071594,
0.058469031006097794,
-0.13429437577724457,
-0.10298701375722885,
-0.03252701461315155,
0.1725022792816162,
-0.03558901324868202,
0.06825139373540878,
-0.06482242047786713,
0.11483320593833923,
-0.05444280803203583,
-0.12588223814964294,
-0.006564270239323378,
-0.09465768933296204,
-0.019047444686293602,
-0.023028243333101273,
0.157699316740036,
0.004791030660271645,
-0.0003322509292047471,
0.05939819663763046,
-0.008332365192472935,
0.030029235407710075,
-0.10846000164747238,
-0.11207173019647598,
0.1278516799211502,
0.11750644445419312,
0.14089426398277283,
-0.19419322907924652,
-0.1420586109161377,
0.04109513387084007,
0.07434030622243881,
0.12933339178562164,
0.16099140048027039,
-0.021522510796785355,
0.11169930547475815,
0.24974235892295837,
-0.04012390598654747,
-0.20241348445415497,
-0.13374871015548706,
0.048337727785110474,
0.03932958468794823,
0.030234340578317642,
-0.13170801103115082,
0.10371358692646027,
0.08039242029190063,
0.001985743874683976,
-0.05304502695798874,
-0.22986136376857758,
-0.0698583796620369,
0.04011970013380051,
0.043442294001579285,
0.2745928466320038,
-0.05505537614226341,
-0.06549311429262161,
-0.09280818700790405,
-0.09657223522663116,
0.01734883151948452,
0.004499757196754217,
0.12383399158716202,
-0.026028525084257126,
-0.004901675507426262,
0.049698490649461746,
-0.02374958246946335,
0.1566995084285736,
0.03191094473004341,
0.019262272864580154,
-0.04603244364261627,
0.05304715409874916,
0.15185536444187164,
-0.03313130512833595,
0.14756430685520172,
-0.017313385382294655,
0.06891483068466187,
0.013986477628350258,
-0.09991937130689621,
-0.08229180425405502,
0.06307294964790344,
0.0005086445016786456,
-0.03475674241781235,
-0.007666660938411951,
0.00032569325412623584,
0.02539730817079544,
-0.0012825116282328963,
-0.04077833145856857,
-0.023009082302451134,
-0.015884656459093094,
0.06356348097324371,
0.11404083669185638,
-0.06526404619216919,
-0.1002875491976738,
-0.04843507707118988,
-0.030542265623807907,
0.026919826865196228,
-0.023682396858930588,
0.060618650168180466,
0.04379856213927269,
-0.020678197965025902,
0.014321288093924522,
0.04323727265000343,
-0.05954764038324356,
0.02295069769024849,
0.10216373205184937,
-0.051129259169101715,
-0.18973276019096375,
0.022129183635115623,
-0.04688012972474098,
-0.06636593490839005,
0.05410238355398178,
0.14329202473163605,
0.03746171295642853,
-0.06370021402835846,
-0.02192271314561367,
0.00033618314773775637,
-0.025408966466784477,
0.10952357202768326,
0.002758858958259225,
-0.00018075415573548526,
-0.08351831883192062,
0.07657598704099655,
0.07171221077442169,
0.002234151354059577,
-0.041005440056324005,
0.01794513687491417,
-0.07973972707986832,
-0.05481969937682152,
0.044940099120140076,
0.07739818096160889,
-0.03870789334177971,
-0.06349045783281326,
-0.05233606696128845,
-0.11081865429878235,
0.020053183659911156,
0.01831580139696598,
0.03011700138449669,
0.05074429139494896,
-0.09047819674015045,
-0.04505985602736473,
0.0021015331149101257,
0.02584211528301239,
0.06103311851620674,
0.04901159182190895,
-0.11730334907770157,
-0.026285268366336823,
-0.017760172486305237,
0.05713547766208649,
-0.03609437867999077,
-0.017640208825469017,
-0.06691073626279831,
-0.030016399919986725,
-0.24092653393745422,
0.027992507442831993,
-0.10545902699232101,
-0.023222673684358597,
-0.025426503270864487,
-0.012858979403972626,
-0.006603002082556486,
0.005886361468583345,
-0.045714154839515686,
-0.005832770373672247,
-0.05008509010076523,
0.09044456481933594,
-0.06634809076786041,
0.02075176313519478,
0.019121598452329636,
-0.05982119217514992,
0.10402248054742813,
-0.022596804425120354,
-0.03132136911153793,
0.04304684326052666,
-0.19335120916366577,
-0.07988142967224121,
0.053336311131715775,
0.042660973966121674,
0.01016486156731844,
-0.017682649195194244,
0.06636184453964233,
0.0835287868976593,
0.032117463648319244,
-0.02263164520263672,
0.060567911714315414,
-0.07545670121908188,
0.05931900441646576,
-0.020907262340188026,
-0.08736331760883331,
-0.025254791602492332,
-0.002529405988752842,
0.11495146155357361,
0.027017107233405113,
0.09724422544240952,
-0.08925486356019974,
-0.007047280669212341,
-0.11237815767526627,
-0.012588050216436386,
-0.02883063070476055,
-0.0656188353896141,
0.005175027064979076,
-0.07246435433626175,
0.027023987844586372,
0.0901826024055481,
0.17887166142463684,
0.039707209914922714,
0.04248173534870148,
-0.057095903903245926,
0.07391377538442612,
0.07732833921909332,
0.010175095871090889,
0.15144121646881104,
0.003397693857550621,
-0.014380568638443947,
-0.05472281575202942,
0.06376827508211136,
0.054712217301130295,
0.07084023207426071,
0.08089105784893036,
0.1425580531358719,
0.0018007471226155758,
0.05221524462103844,
0.05928811430931091,
0.001975234132260084,
-0.08860307186841965,
-0.08875006437301636,
-0.029347915202379227,
0.02502024732530117,
-0.04863014817237854,
0.03892885521054268,
0.03507579490542412,
-0.1109001561999321,
0.10287939757108688,
0.0790552943944931,
-0.03959359973669052,
-0.10122505575418472,
-0.15433113276958466,
-0.04475895315408707,
-0.0667974054813385,
-0.01576385274529457,
-0.11648343503475189,
-0.021908648312091827,
0.06694803386926651,
0.022792745381593704,
-0.041216086596250534,
0.0989866852760315,
-0.12025435268878937,
-0.09784364700317383,
0.030156301334500313,
0.0005282814963720739,
0.09396864473819733,
0.10587688535451889,
-0.03988835960626602,
-0.034337710589170456,
0.09383828938007355,
0.03087327815592289,
0.08037598431110382,
0.08897115290164948,
0.02465197816491127,
-0.043816156685352325,
-0.040672555565834045,
-0.03312524035573006,
-0.01917128823697567,
0.05266492813825607,
0.1619413197040558,
0.02864096686244011,
-0.09997805953025818,
-0.006732651498168707,
0.10281803458929062,
0.013571714982390404,
-0.13779211044311523,
-0.0950360894203186,
0.2068605273962021,
0.010158968158066273,
0.06276685744524002,
0.021854054182767868,
-0.11182230710983276,
-0.06263180077075958,
0.2051801085472107,
0.1560889482498169,
0.008281203918159008,
-0.018485985696315765,
-0.03657860308885574,
-0.012408645823597908,
0.05233350396156311,
0.12925763428211212,
-0.022693369537591934,
0.2669235169887543,
-0.039408423006534576,
0.035382937639951706,
-0.01552062202244997,
0.03127172961831093,
-0.1706346720457077,
0.08863327652215958,
-0.07798533141613007,
-0.023628367111086845,
-0.0301275085657835,
0.09769080579280853,
0.027686376124620438,
-0.06562083214521408,
0.029130127280950546,
-0.029820788651704788,
-0.08541392534971237,
-0.013437056913971901,
0.02109934203326702,
-0.010548155754804611,
0.08696010708808899,
-0.0535295344889164,
0.012117679230868816,
0.0795724019408226,
-0.04640742763876915,
-0.09742068499326706,
-0.03434872254729271,
0.07673653215169907,
0.05785627290606499,
0.13620130717754364,
0.04119280353188515,
-0.005709795746952295,
0.09858028590679169,
-0.009203721769154072,
-0.10637451708316803,
0.04921386390924454,
-0.04656801372766495,
-0.031662870198488235,
0.091423898935318,
0.008225547149777412,
-0.016222961246967316,
-0.06540721654891968,
-0.01818527840077877,
-0.041441213339567184,
0.03088601492345333,
0.0876520648598671,
0.008750016801059246,
-0.11360453814268112,
0.06834106892347336,
-0.11832872033119202,
0.08692516386508942,
0.11464950442314148,
0.027841683477163315,
-0.02811765857040882,
-0.043732840567827225,
0.10535610467195511,
0.011152276769280434,
0.08023614436388016,
-0.04814162105321884,
-0.19883029162883759,
-0.006907765753567219,
-0.043451979756355286,
0.045968685299158096,
-0.2171078473329544,
-0.04664074629545212,
-0.0182565338909626,
0.002547819633036852,
-0.10768815129995346,
0.05800532177090645,
0.06140253320336342,
0.030693478882312775,
0.022274164482951164,
-0.13908728957176208,
-0.03355415165424347,
0.03142355754971504,
-0.10100129246711731,
-0.05954520404338837
] |
null | null | transformers |
# SELECTRA: A Spanish ELECTRA
SELECTRA is a Spanish pre-trained language model based on [ELECTRA](https://github.com/google-research/electra).
We release a `small` and `medium` version with the following configuration:
| Model | Layers | Embedding/Hidden Size | Params | Vocab Size | Max Sequence Length | Cased |
| --- | --- | --- | --- | --- | --- | --- |
| [SELECTRA small](https://huggingface.co/Recognai/selectra_small) | 12 | 256 | 22M | 50k | 512 | True |
| **SELECTRA medium** | **12** | **384** | **41M** | **50k** | **512** | **True** |
**SELECTRA small (medium) is about 5 (3) times smaller than BETO but achieves comparable results** (see Metrics section below).
## Usage
From the original [ELECTRA model card](https://huggingface.co/google/electra-small-discriminator): "ELECTRA models are trained to distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to the discriminator of a GAN."
The discriminator should therefore output a high (positive) logit for the fake input token, as the following example demonstrates:
```python
from transformers import ElectraForPreTraining, ElectraTokenizerFast

discriminator = ElectraForPreTraining.from_pretrained("Recognai/selectra_small")
tokenizer = ElectraTokenizerFast.from_pretrained("Recognai/selectra_small")

# "rosa" plays the role of the fake token here; note its positive logit in the output below.
sentence_with_fake_token = "Estamos desayunando pan rosa con tomate y aceite de oliva."

inputs = tokenizer.encode(sentence_with_fake_token, return_tensors="pt")
logits = discriminator(inputs).logits.tolist()[0]

# Drop the logits of the special [CLS]/[SEP] tokens added by the tokenizer.
print("\t".join(tokenizer.tokenize(sentence_with_fake_token)))
print("\t".join(map(lambda x: str(x)[:4], logits[1:-1])))
"""Output:
Estamos desayun ##ando pan rosa con tomate y aceite de oliva .
-3.1 -3.6 -6.9 -3.0 0.19 -4.5 -3.3 -5.1 -5.7 -7.7 -4.4 -4.2
"""
```
However, you probably want to use this model to fine-tune it on a downstream task.
We provide models fine-tuned on the [XNLI dataset](https://huggingface.co/datasets/xnli), which can be used together with the zero-shot classification pipeline:
- [Zero-shot SELECTRA small](https://huggingface.co/Recognai/zeroshot_selectra_small)
- [Zero-shot SELECTRA medium](https://huggingface.co/Recognai/zeroshot_selectra_medium)
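A minimal usage sketch for the medium zero-shot checkpoint listed above; the input sentence and candidate labels are illustrative, and the hypothesis template is one common choice for Spanish:
```python
from transformers import pipeline

# Sketch: zero-shot classification with the fine-tuned medium checkpoint linked above.
classifier = pipeline("zero-shot-classification", model="Recognai/zeroshot_selectra_medium")
result = classifier(
    "El equipo gana la liga en la última jornada",  # illustrative input
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
    hypothesis_template="Este ejemplo es {}.",
)
print(result["labels"][0], result["scores"][0])
```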
## Metrics
We fine-tune our models on 3 different down-stream tasks:
- [XNLI](https://huggingface.co/datasets/xnli)
- [PAWS-X](https://huggingface.co/datasets/paws-x)
- [CoNLL2002 - NER](https://huggingface.co/datasets/conll2002)
For each task, we conduct 5 trials and state the mean and standard deviation of the metrics in the table below.
To compare our results to other Spanish language models, we provide the same metrics taken from the [evaluation table](https://github.com/PlanTL-SANIDAD/lm-spanish#evaluation-) of the [Spanish Language Model](https://github.com/PlanTL-SANIDAD/lm-spanish) repo.
| Model | CoNLL2002 - NER (f1) | PAWS-X (acc) | XNLI (acc) | Params |
| --- | --- | --- | --- | --- |
| SELECTRA small | 0.865 +- 0.004 | 0.896 +- 0.002 | 0.784 +- 0.002 | 22M |
| SELECTRA medium | 0.873 +- 0.003 | 0.896 +- 0.002 | 0.804 +- 0.002 | 41M |
| | | | | |
| [mBERT](https://huggingface.co/bert-base-multilingual-cased) | 0.8691 | 0.8955 | 0.7876 | 178M |
| [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) | 0.8759 | 0.9000 | 0.8130 | 110M |
| [RoBERTa-b](https://huggingface.co/BSC-TeMU/roberta-base-bne) | 0.8851 | 0.9000 | 0.8016 | 125M |
| [RoBERTa-l](https://huggingface.co/BSC-TeMU/roberta-large-bne) | 0.8772 | 0.9060 | 0.7958 | 355M |
| [Bertin](https://huggingface.co/bertin-project/bertin-roberta-base-spanish/tree/v1-512) | 0.8835 | 0.8990 | 0.7890 | 125M |
| [ELECTRICIDAD](https://huggingface.co/mrm8488/electricidad-base-discriminator) | 0.7954 | 0.9025 | 0.7878 | 109M |
Some details of our fine-tuning runs:
- epochs: 5
- batch-size: 32
- learning rate: 1e-4
- warmup proportion: 0.1
- linear learning rate decay
- layerwise learning rate decay
For all the details, check out our [selectra repo](https://github.com/recognai/selectra).
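As a rough sketch, these settings could be expressed with `transformers.TrainingArguments` as shown below; the layerwise learning-rate decay is not covered by `TrainingArguments` and would require a custom optimizer, and the output directory is a placeholder:
```python
from transformers import TrainingArguments

# Illustrative mapping of the listed fine-tuning hyperparameters.
training_args = TrainingArguments(
    output_dir="selectra-finetuned",      # placeholder path
    num_train_epochs=5,
    per_device_train_batch_size=32,
    learning_rate=1e-4,
    warmup_ratio=0.1,
    lr_scheduler_type="linear",           # linear learning-rate decay
)
```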
## Training
We pre-trained our SELECTRA models on the Spanish portion of the [Oscar](https://huggingface.co/datasets/oscar) dataset, which is about 150GB in size.
Each model version is trained for 300k steps, with a warm restart of the learning rate after the first 150k steps.
Some details of the training:
- steps: 300k
- batch-size: 128
- learning rate: 5e-4
- warmup steps: 10k
- linear learning rate decay
- TPU cores: 8 (v2-8)
For all details, check out our [selectra repo](https://github.com/recognai/selectra).
**Note:** Due to a misconfiguration in the pre-training scripts, the embeddings of vocabulary tokens containing accents were not optimized. If you fine-tune this model on a downstream task, you might consider using a tokenizer that does not strip the accents:
```python
tokenizer = ElectraTokenizerFast.from_pretrained("Recognai/selectra_small", strip_accents=False)
```
## Motivation
Despite the abundance of excellent Spanish language models (BETO, BSC-BNE, Bertin, ELECTRICIDAD, etc.), we felt there was still a lack of distilled or compact Spanish language models, as well as a lack of systematic comparisons between such models and their bigger siblings.
## Acknowledgment
This research was supported by the Google TPU Research Cloud (TRC) program.
## Authors
- David Fidalgo ([GitHub](https://github.com/dcfidalgo))
- Javier Lopez ([GitHub](https://github.com/javispp))
- Daniel Vila ([GitHub](https://github.com/dvsrepo))
- Francisco Aranda ([GitHub](https://github.com/frascuchon)) | {"language": ["es"], "license": "apache-2.0", "datasets": ["oscar"], "thumbnail": "url to a thumbnail used in social sharing"} | null | Recognai/selectra_medium | [
"transformers",
"pytorch",
"electra",
"pretraining",
"es",
"dataset:oscar",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #electra #pretraining #es #dataset-oscar #license-apache-2.0 #endpoints_compatible #region-us
| SELECTRA: A Spanish ELECTRA
===========================
SELECTRA is a Spanish pre-trained language model based on ELECTRA.
We release a 'small' and 'medium' version with the following configuration:
SELECTRA small (medium) is about 5 (3) times smaller than BETO but achieves comparable results (see Metrics section below).
Usage
-----
From the original ELECTRA model card: "ELECTRA models are trained to distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to the discriminator of a GAN."
The discriminator should therefore activate the logit corresponding to the fake input token, as the following example demonstrates:
However, you probably want to use this model to fine-tune it on a downstream task.
We provide models fine-tuned on the XNLI dataset, which can be used together with the zero-shot classification pipeline:
* Zero-shot SELECTRA small
* Zero-shot SELECTRA medium
Metrics
-------
We fine-tune our models on 3 different down-stream tasks:
* XNLI
* PAWS-X
* CoNLL2002 - NER
For each task, we conduct 5 trials and state the mean and standard deviation of the metrics in the table below.
To compare our results to other Spanish language models, we provide the same metrics taken from the evaluation table of the Spanish Language Model repo.
Some details of our fine-tuning runs:
* epochs: 5
* batch-size: 32
* learning rate: 1e-4
* warmup proportion: 0.1
* linear learning rate decay
* layerwise learning rate decay
For all the details, check out our selectra repo.
Training
--------
We pre-trained our SELECTRA models on the Spanish portion of the Oscar dataset, which is about 150GB in size.
Each model version is trained for 300k steps, with a warm restart of the learning rate after the first 150k steps.
Some details of the training:
* steps: 300k
* batch-size: 128
* learning rate: 5e-4
* warmup steps: 10k
* linear learning rate decay
* TPU cores: 8 (v2-8)
For all details, check out our selectra repo.
Note: Due to a misconfiguration in the pre-training scripts, the embeddings of vocabulary tokens containing accents were not optimized. If you fine-tune this model on a downstream task, you might consider using a tokenizer that does not strip accents:
Motivation
----------
Despite the abundance of excellent Spanish language models (BETO, BSC-BNE, Bertin, ELECTRICIDAD, etc.), we felt there was still a lack of distilled or compact Spanish language models, as well as a lack of systematic comparisons between such models and their bigger siblings.
Acknowledgment
--------------
This research was supported by the Google TPU Research Cloud (TRC) program.
Authors
-------
* David Fidalgo (GitHub)
* Javier Lopez (GitHub)
* Daniel Vila (GitHub)
* Francisco Aranda (GitHub)
| [] | [
"TAGS\n#transformers #pytorch #electra #pretraining #es #dataset-oscar #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
43
] | [
"passage: TAGS\n#transformers #pytorch #electra #pretraining #es #dataset-oscar #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
-0.07683845609426498,
0.18955309689044952,
-0.00659080408513546,
0.030211126431822777,
0.09840341657400131,
0.018398506566882133,
0.05997311696410179,
0.11080925166606903,
0.021579211577773094,
-0.07366665452718735,
0.17904803156852722,
0.1963929682970047,
-0.012490801513195038,
0.06203972175717354,
-0.04274727404117584,
-0.2311093807220459,
0.08776770532131195,
0.05098164454102516,
-0.0660795122385025,
0.07429376989603043,
0.12509115040302277,
-0.0908072218298912,
0.03433637693524361,
-0.001981661655008793,
-0.04280845820903778,
0.029203612357378006,
0.0027932517696172,
-0.11046809703111649,
0.09528407454490662,
0.0433439202606678,
0.05660942941904068,
0.04187637194991112,
-0.003945271484553814,
-0.18933100998401642,
0.024477165192365646,
-0.023405157029628754,
-0.06838838756084442,
0.0481049008667469,
0.005227357614785433,
-0.06773369014263153,
0.07942680269479752,
0.07944031804800034,
0.007553336676210165,
0.03688516840338707,
-0.11076807230710983,
-0.227263405919075,
-0.08183244615793228,
0.06484026461839676,
0.03908763453364372,
0.10368122160434723,
0.017301246523857117,
0.12588448822498322,
-0.12457743287086487,
0.006704565137624741,
0.09981586784124374,
-0.3235836327075958,
-0.006895713973790407,
-0.00009184568625641987,
0.10064049810171127,
-0.040118273347616196,
0.014907030388712883,
0.039756011217832565,
0.07982202619314194,
0.012895988300442696,
0.051977209746837616,
-0.03437446057796478,
-0.115831159055233,
0.05333786830306053,
-0.05436762422323227,
-0.09276608377695084,
0.30669647455215454,
0.05915834382176399,
0.018835509195923805,
0.013405196368694305,
-0.026932386681437492,
-0.04537971317768097,
0.009641101583838463,
0.054854776710271835,
0.0600060410797596,
0.07845240086317062,
0.06095879152417183,
-0.006552563048899174,
-0.14847026765346527,
0.03572157770395279,
-0.1556803286075592,
0.10158172249794006,
0.038446176797151566,
0.09712142497301102,
-0.1331988126039505,
0.0663803294301033,
0.08458492904901505,
-0.09499779343605042,
-0.03272280842065811,
-0.09111962467432022,
0.07864515483379364,
0.043135304003953934,
-0.06845463812351227,
0.1323084980249405,
0.10315385460853577,
0.26082223653793335,
0.05503556504845619,
-0.035516344010829926,
0.08372398465871811,
0.13872717320919037,
0.027084210887551308,
0.03790615499019623,
-0.03928164392709732,
0.013039368204772472,
0.06708614528179169,
-0.10779742151498795,
0.07985562831163406,
-0.0014892334584146738,
-0.07478918135166168,
-0.041202038526535034,
-0.033591412007808685,
0.08783018589019775,
0.06948698312044144,
-0.01712065190076828,
-0.135088711977005,
0.0203086007386446,
0.08553238958120346,
-0.03558002784848213,
-0.03569525107741356,
-0.039561834186315536,
-0.0492158867418766,
0.09275542944669724,
0.04126298427581787,
0.056275349110364914,
-0.004177860915660858,
0.08156299591064453,
-0.08381997793912888,
-0.029664522036910057,
-0.036404192447662354,
0.0332149937748909,
0.11960088461637497,
-0.15177099406719208,
0.13456487655639648,
-0.12393051385879517,
-0.13795650005340576,
0.019349103793501854,
0.11472675949335098,
0.016697145998477936,
-0.08534852415323257,
0.020835570991039276,
0.006596914492547512,
-0.010913372039794922,
-0.0998326987028122,
-0.01683359034359455,
-0.10849346220493317,
0.05591447651386261,
-0.059524282813072205,
0.028101978823542595,
-0.1290932446718216,
0.03932229056954384,
-0.10448966920375824,
0.005622376222163439,
-0.01565554365515709,
-0.01768525131046772,
-0.08226142078638077,
0.20294217765331268,
-0.02058333531022072,
-0.06742178648710251,
-0.007004959974437952,
0.043303828686475754,
-0.04595400020480156,
0.15691711008548737,
-0.09736622869968414,
-0.03653354197740555,
0.19120758771896362,
-0.11523900926113129,
-0.1933595985174179,
0.06033594161272049,
-0.0184506643563509,
0.007397811394184828,
0.052724890410900116,
0.1307481974363327,
-0.05366017296910286,
-0.1042306199669838,
0.012393989600241184,
0.07811156660318375,
-0.08172012120485306,
-0.2300691157579422,
0.10923616588115692,
0.012491805478930473,
-0.04045041278004646,
0.02175549976527691,
-0.04057437181472778,
0.10328621417284012,
-0.05836952105164528,
-0.07137701660394669,
-0.033044006675481796,
-0.06102779135107994,
-0.052568309009075165,
0.04279453679919243,
0.026784313842654228,
-0.08594734221696854,
-0.05489518120884895,
-0.017800452187657356,
0.07400728017091751,
0.04104806110262871,
0.051676809787750244,
-0.13983629643917084,
0.029784638434648514,
0.03345511481165886,
-0.03187424689531326,
-0.12510646879673004,
0.025350071489810944,
-0.0433889739215374,
0.03954453021287918,
-0.023609254509210587,
0.13350386917591095,
0.023977432399988174,
-0.0974145457148552,
0.0022276989184319973,
-0.020353006199002266,
0.059764500707387924,
0.04148663207888603,
0.021746844053268433,
-0.1397637128829956,
0.011161454021930695,
-0.05857183039188385,
0.09135235100984573,
-0.02320033498108387,
0.013861481100320816,
0.03070036880671978,
0.1276272088289261,
-0.024713875725865364,
0.03992987796664238,
-0.014570309780538082,
-0.012186498381197453,
-0.05046781525015831,
-0.008326034992933273,
0.10448021441698074,
0.06690534949302673,
-0.09983803331851959,
0.12486027926206589,
0.01606069505214691,
0.2610210180282593,
0.16110078990459442,
-0.17241153120994568,
0.05737519636750221,
0.05270462855696678,
-0.050580933690071106,
0.025338858366012573,
0.04054780676960945,
0.05823718383908272,
0.04895817115902901,
0.0027748364955186844,
0.10972925275564194,
-0.02006663754582405,
-0.03325860947370529,
-0.016844142228364944,
-0.06572195887565613,
-0.012233463115990162,
0.04774356260895729,
0.19120846688747406,
-0.12210902571678162,
0.1692216992378235,
0.21270105242729187,
-0.019643964245915413,
0.08926520496606827,
-0.09068010002374649,
-0.008307944983243942,
0.045310668647289276,
-0.06480050832033157,
-0.046015556901693344,
0.09806573390960693,
-0.1624995321035385,
0.029955437406897545,
0.08063597232103348,
-0.030126234516501427,
0.04755592346191406,
-0.16667503118515015,
-0.1244606077671051,
0.021399881690740585,
-0.0017728526145219803,
-0.05820788815617561,
0.0540565624833107,
-0.06119184195995331,
0.06534013897180557,
-0.003912101499736309,
-0.09408536553382874,
0.11042118072509766,
-0.008553623221814632,
-0.07439586520195007,
0.1604304164648056,
-0.11787083745002747,
-0.20045000314712524,
-0.07261933386325836,
0.006280918139964342,
0.07298969477415085,
0.01999838463962078,
0.13620144128799438,
-0.00705955782905221,
-0.059916164726018906,
0.037398967891931534,
-0.10300920158624649,
-0.061136797070503235,
-0.0016949373530223966,
0.08097533881664276,
0.0326981358230114,
-0.03202525153756142,
-0.09505083411931992,
-0.04376409575343132,
-0.006692842114716768,
-0.014321565628051758,
0.09362632781267166,
-0.05509103462100029,
0.06938546895980835,
0.04436882585287094,
0.06021706014871597,
0.010703685693442822,
-0.026097144931554794,
0.14716221392154694,
-0.0628543272614479,
-0.05504655838012695,
0.20610104501247406,
0.003484223736450076,
0.05392543226480484,
0.12415309995412827,
0.03752589598298073,
-0.06601539254188538,
-0.008333723992109299,
-0.07004017382860184,
-0.0794602707028389,
-0.31762072443962097,
-0.103192038834095,
-0.07249736040830612,
0.042249079793691635,
0.018198125064373016,
0.07160263508558273,
0.06617629528045654,
0.1264808177947998,
-0.028306210413575172,
-0.1084650531411171,
-0.04376774653792381,
0.0253796074539423,
0.2205074578523636,
0.0002846337447408587,
0.0784611627459526,
-0.12917876243591309,
-0.017169533297419548,
0.12144012749195099,
0.056546032428741455,
0.17200814187526703,
0.1349295824766159,
-0.009811299853026867,
0.10839825123548508,
0.15554849803447723,
0.057749032974243164,
0.08291179686784744,
0.062318913638591766,
-0.020134776830673218,
-0.054722584784030914,
0.03058907575905323,
-0.06002378463745117,
0.09529902786016464,
-0.023830724880099297,
-0.12730635702610016,
-0.038288578391075134,
-0.10376398265361786,
0.030408570542931557,
0.2514234781265259,
0.015217569656670094,
-0.20484112203121185,
-0.0011771729914471507,
0.0925958901643753,
-0.019468454644083977,
-0.01118850614875555,
0.08879145234823227,
-0.040287233889102936,
-0.11680659651756287,
0.16708558797836304,
-0.00663050776347518,
0.08847757428884506,
-0.01810126192867756,
0.01984694041311741,
0.015132827684283257,
-0.15392804145812988,
0.10034053772687912,
0.11651712656021118,
-0.2822793126106262,
0.20575527846813202,
-0.04206530377268791,
-0.010670584626495838,
-0.0522872656583786,
-0.013797717168927193,
0.010988368652760983,
0.1823185384273529,
0.19412373006343842,
0.01118172612041235,
-0.0842834860086441,
0.04742049053311348,
-0.03279787674546242,
0.06286755949258804,
-0.055477578192949295,
-0.037176862359046936,
-0.046608082950115204,
-0.02160341665148735,
0.013750583864748478,
0.011115637607872486,
0.038130540400743484,
-0.06014873459935188,
-0.1267845630645752,
-0.007080703508108854,
0.0964130237698555,
0.11598934233188629,
-0.06270117312669754,
-0.06513774394989014,
-0.05746573582291603,
0.09513984620571136,
-0.14058910310268402,
-0.06985274702310562,
-0.0750175192952156,
-0.09175931662321091,
0.0518355667591095,
-0.052809521555900574,
0.11050676554441452,
-0.04460519552230835,
-0.037203237414360046,
-0.05680049583315849,
-0.14012745022773743,
0.14721499383449554,
-0.16627924144268036,
-0.013141771778464317,
-0.0440291129052639,
0.07015644758939743,
-0.03463279828429222,
0.05533347278833389,
0.008744316175580025,
0.013796636834740639,
-0.10917270183563232,
-0.09701994806528091,
-0.00454053794965148,
0.0712493509054184,
0.12470178306102753,
-0.044578395783901215,
-0.06366043537855148,
-0.030982619151473045,
0.03732270374894142,
-0.10516539961099625,
0.20240430533885956,
0.23342175781726837,
-0.10632002353668213,
0.11443380266427994,
0.14097005128860474,
-0.04885879531502724,
-0.2800050377845764,
-0.1288079470396042,
-0.10078492015600204,
-0.07400091737508774,
0.04574798420071602,
-0.2317659854888916,
0.10025875270366669,
0.1311509907245636,
-0.10330228507518768,
0.09569743275642395,
-0.22451922297477722,
-0.03333798795938492,
0.13373823463916779,
-0.003032251261174679,
0.36633893847465515,
-0.12809540331363678,
-0.03478893265128136,
-0.0336286723613739,
-0.21982096135616302,
0.1593703031539917,
0.032971106469631195,
0.04534896835684776,
0.003967553377151489,
-0.0778537467122078,
-0.04370960220694542,
-0.051381129771471024,
0.1452057957649231,
0.028033500537276268,
0.020046968013048172,
-0.06463898718357086,
0.03970823436975479,
0.005194030702114105,
-0.010823575779795647,
0.04354329779744148,
-0.0820116475224495,
0.02162904478609562,
-0.1878456026315689,
-0.026462065055966377,
-0.06448547542095184,
0.0961386188864708,
0.03504554182291031,
-0.040451351553201675,
-0.05900039151310921,
-0.008855590596795082,
0.026086945086717606,
0.0042853932827711105,
0.2542562186717987,
0.01933862268924713,
0.027421994134783745,
0.015319939702749252,
0.10616664588451385,
-0.17623215913772583,
-0.044952016323804855,
-0.12816685438156128,
-0.05068143829703331,
0.0676659345626831,
-0.1503879725933075,
0.002201367635279894,
0.13845059275627136,
-0.05515425279736519,
0.04683864489197731,
0.0494481660425663,
-0.012392787262797356,
0.0189041830599308,
0.15581437945365906,
-0.17241349816322327,
-0.020971275866031647,
-0.008551088161766529,
0.10398152470588684,
0.12375864386558533,
0.06887616217136383,
0.057986535131931305,
-0.01901349611580372,
-0.037876494228839874,
-0.0031160307116806507,
0.03581018000841141,
-0.07012946903705597,
0.024805475026369095,
0.05577515438199043,
0.008908101357519627,
-0.12422115355730057,
0.14138251543045044,
-0.01373280119150877,
-0.25174200534820557,
-0.007031968329101801,
0.04799224063754082,
-0.10588926076889038,
-0.09816496074199677,
-0.0664064809679985,
0.049093637615442276,
-0.2578333914279938,
-0.13206321001052856,
-0.06540492177009583,
-0.12128692865371704,
0.08826112002134323,
0.1134592592716217,
0.10618220269680023,
0.07911776751279831,
0.038353607058525085,
-0.02543691359460354,
0.0637282282114029,
-0.03060556761920452,
0.00750554446130991,
-0.005376668646931648,
-0.13288043439388275,
-0.1293914020061493,
0.03574774041771889,
0.1036253347992897,
-0.05032486468553543,
-0.013096512295305729,
-0.019756736233830452,
0.07323550432920456,
-0.11734418570995331,
-0.006862774956971407,
-0.09565560519695282,
-0.02595396712422371,
0.014868584461510181,
-0.12657959759235382,
-0.03536748141050339,
-0.009476209990680218,
-0.11669839918613434,
0.03583639860153198,
-0.0022651832550764084,
0.039427824318408966,
-0.09730164706707001,
-0.04772692546248436,
0.0922694131731987,
-0.03228858858346939,
0.12272964417934418,
0.08069181442260742,
-0.051768649369478226,
0.049250684678554535,
-0.08356716483831406,
-0.09318286925554276,
0.12463074177503586,
0.024099959060549736,
0.014949703589081764,
0.02491569332778454,
0.008805260062217712,
0.10662486404180527,
-0.03012217953801155,
-0.0022689674515277147,
-0.0892576053738594,
-0.15810437500476837,
-0.09775681048631668,
0.06389960646629333,
-0.07993653416633606,
-0.021994182839989662,
-0.12384723126888275,
0.181371808052063,
0.023204956203699112,
0.16885200142860413,
0.019053325057029724,
0.01815900392830372,
-0.026604201644659042,
0.03488294780254364,
-0.03651953861117363,
-0.09828165918588638,
-0.10430032759904861,
-0.018875649198889732,
-0.04203801229596138,
-0.025373676791787148,
0.27510735392570496,
-0.015977445989847183,
-0.12534897029399872,
0.08476509153842926,
0.041475895792245865,
0.026775456964969635,
0.024937231093645096,
0.2548011541366577,
0.06751658767461777,
0.005739097483456135,
-0.09423471242189407,
0.029435254633426666,
0.06435926258563995,
-0.07774854451417923,
0.08165944367647171,
0.15751929581165314,
0.08924635499715805,
0.09536637365818024,
-0.0028937198221683502,
0.041933804750442505,
-0.06548627465963364,
-0.07930850237607956,
0.035198647528886795,
0.052264343947172165,
0.04695205017924309,
0.12343577295541763,
0.15581828355789185,
-0.05377048999071121,
-0.0013967191334813833,
-0.04403802007436752,
0.03092820569872856,
-0.14404296875,
-0.11552020162343979,
-0.052140966057777405,
-0.12552975118160248,
0.03514053300023079,
-0.06156488507986069,
0.02320687659084797,
0.2429381012916565,
0.07160583883523941,
-0.0795685350894928,
-0.037974294275045395,
0.01659414917230606,
-0.014848267659544945,
0.04818182811141014,
0.02783849462866783,
-0.010389527305960655,
-0.03334972262382507,
-0.05690468102693558,
-0.05508106201887131,
0.01740362122654915,
-0.04411116987466812,
0.02001343108713627,
-0.04002311825752258,
0.07747011631727219,
-0.057979412376880646,
-0.05622478574514389,
-0.06657877564430237,
0.04227034002542496,
0.028153162449598312,
0.11362532526254654,
-0.02782123163342476,
0.0743708536028862,
0.11561092734336853,
0.2012292742729187,
-0.06425110995769501,
-0.081939198076725,
-0.08938087522983551,
0.036480724811553955,
0.030782196670770645,
0.06588292866945267,
0.03861319646239281,
-0.0008434547344222665,
-0.06628057360649109,
0.23006118834018707,
0.1968867927789688,
-0.05670779198408127,
0.003969068173319101,
-0.03699784725904465,
0.019483160227537155,
0.032656047493219376,
0.05721087008714676,
0.10964179784059525,
0.18044790625572205,
-0.14002865552902222,
-0.04857723414897919,
-0.08680815249681473,
0.05005316063761711,
-0.11211282759904861,
0.014327911660075188,
-0.0359933115541935,
-0.07458072155714035,
-0.0314447283744812,
0.11334214359521866,
-0.11928743124008179,
0.015610789880156517,
0.060788072645664215,
-0.09133467078208923,
-0.06239784508943558,
-0.032435499131679535,
0.18985740840435028,
0.08959393948316574,
0.05762568861246109,
-0.030245976522564888,
-0.10573684424161911,
0.09920911490917206,
0.033283308148384094,
-0.23948287963867188,
-0.07908772677183151,
0.11945568025112152,
0.032933756709098816,
0.13772284984588623,
-0.02695920318365097,
0.08603065460920334,
0.05459687113761902,
0.056818101555109024,
-0.10155890136957169,
0.05578337237238884,
0.03505956009030342,
-0.014669299125671387,
-0.027735263109207153,
-0.1222544014453888,
-0.027547717094421387,
-0.03486831858754158,
0.03252943232655525,
-0.04688536375761032,
0.021057846024632454,
-0.0417972207069397,
-0.0480581633746624,
-0.05411363020539284,
0.0702691599726677,
-0.0758950486779213,
0.05821794271469116,
-0.039036188274621964,
-0.0578802190721035,
-0.0725521445274353,
-0.05929379165172577,
-0.007331951055675745,
0.08088807016611099,
-0.14339745044708252,
-0.022131260484457016,
-0.015301241539418697,
0.001923867384903133,
0.007079050876200199,
0.05078783258795738,
-0.022348886355757713,
0.01272299699485302,
-0.1223883405327797,
-0.009405471384525299,
-0.08989009261131287,
-0.010455460287630558,
0.06923522055149078,
0.005880489479750395,
-0.0397980734705925,
0.028421051800251007,
0.02411198616027832,
0.0011080991243943572,
-0.07293609529733658,
-0.10089243948459625
] |
null | null | transformers |
# SELECTRA: A Spanish ELECTRA
SELECTRA is a Spanish pre-trained language model based on [ELECTRA](https://github.com/google-research/electra).
We release a `small` and `medium` version with the following configuration:
| Model | Layers | Embedding/Hidden Size | Params | Vocab Size | Max Sequence Length | Cased |
| --- | --- | --- | --- | --- | --- | --- |
| **SELECTRA small** | **12** | **256** | **22M** | **50k** | **512** | **True** |
| [SELECTRA medium](https://huggingface.co/Recognai/selectra_medium) | 12 | 384 | 41M | 50k | 512 | True |
**SELECTRA small (medium) is about 5 (3) times smaller than BETO but achieves comparable results** (see Metrics section below).
## Usage
From the original [ELECTRA model card](https://huggingface.co/google/electra-small-discriminator): "ELECTRA models are trained to distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to the discriminator of a GAN."
The discriminator should therefore activate the logit corresponding to the fake input token, as the following example demonstrates:
```python
from transformers import ElectraForPreTraining, ElectraTokenizerFast
discriminator = ElectraForPreTraining.from_pretrained("Recognai/selectra_small")
tokenizer = ElectraTokenizerFast.from_pretrained("Recognai/selectra_small")
sentence_with_fake_token = "Estamos desayunando pan rosa con tomate y aceite de oliva."
inputs = tokenizer.encode(sentence_with_fake_token, return_tensors="pt")
logits = discriminator(inputs).logits.tolist()[0]
print("\t".join(tokenizer.tokenize(sentence_with_fake_token)))
print("\t".join(map(lambda x: str(x)[:4], logits[1:-1])))
"""Output:
Estamos desayun ##ando pan rosa con tomate y aceite de oliva .
-3.1 -3.6 -6.9 -3.0 0.19 -4.5 -3.3 -5.1 -5.7 -7.7 -4.4 -4.2
"""
```
However, you will probably want to fine-tune this model on a downstream task.
We provide models fine-tuned on the [XNLI dataset](https://huggingface.co/datasets/xnli), which can be used together with the zero-shot classification pipeline:
- [Zero-shot SELECTRA small](https://huggingface.co/Recognai/zeroshot_selectra_small)
- [Zero-shot SELECTRA medium](https://huggingface.co/Recognai/zeroshot_selectra_medium)
## Metrics
We fine-tune our models on 3 different downstream tasks:
- [XNLI](https://huggingface.co/datasets/xnli)
- [PAWS-X](https://huggingface.co/datasets/paws-x)
- [CoNLL2002 - NER](https://huggingface.co/datasets/conll2002)
For each task, we conduct 5 trials and state the mean and standard deviation of the metrics in the table below.
To compare our results to other Spanish language models, we provide the same metrics taken from the [evaluation table](https://github.com/PlanTL-SANIDAD/lm-spanish#evaluation-) of the [Spanish Language Model](https://github.com/PlanTL-SANIDAD/lm-spanish) repo.
| Model | CoNLL2002 - NER (f1) | PAWS-X (acc) | XNLI (acc) | Params |
| --- | --- | --- | --- | --- |
| SELECTRA small | 0.865 ± 0.004 | 0.896 ± 0.002 | 0.784 ± 0.002 | 22M |
| SELECTRA medium | 0.873 ± 0.003 | 0.896 ± 0.002 | 0.804 ± 0.002 | 41M |
| | | | | |
| [mBERT](https://huggingface.co/bert-base-multilingual-cased) | 0.8691 | 0.8955 | 0.7876 | 178M |
| [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) | 0.8759 | 0.9000 | 0.8130 | 110M |
| [RoBERTa-b](https://huggingface.co/BSC-TeMU/roberta-base-bne) | 0.8851 | 0.9000 | 0.8016 | 125M |
| [RoBERTa-l](https://huggingface.co/BSC-TeMU/roberta-large-bne) | 0.8772 | 0.9060 | 0.7958 | 355M |
| [Bertin](https://huggingface.co/bertin-project/bertin-roberta-base-spanish/tree/v1-512) | 0.8835 | 0.8990 | 0.7890 | 125M |
| [ELECTRICIDAD](https://huggingface.co/mrm8488/electricidad-base-discriminator) | 0.7954 | 0.9025 | 0.7878 | 109M |
Some details of our fine-tuning runs:
- epochs: 5
- batch-size: 32
- learning rate: 1e-4
- warmup proportion: 0.1
- linear learning rate decay
- layerwise learning rate decay
For all the details, check out our [selectra repo](https://github.com/recognai/selectra).
## Training
We pre-trained our SELECTRA models on the Spanish portion of the [Oscar](https://huggingface.co/datasets/oscar) dataset, which is about 150GB in size.
Each model version is trained for 300k steps, with a warm restart of the learning rate after the first 150k steps.
Some details of the training:
- steps: 300k
- batch-size: 128
- learning rate: 5e-4
- warmup steps: 10k
- linear learning rate decay
- TPU cores: 8 (v2-8)
For all details, check out our [selectra repo](https://github.com/recognai/selectra).
**Note:** Due to a misconfiguration in the pre-training scripts, the embeddings of vocabulary tokens containing accents were not optimized. If you fine-tune this model on a downstream task, you might consider using a tokenizer that does not strip accents:
```python
tokenizer = ElectraTokenizerFast.from_pretrained("Recognai/selectra_small", strip_accents=False)
```
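As a quick sanity check, one can compare how the default tokenizer and the accent-preserving one split an accented word. This is a minimal sketch; the exact subword pieces depend on the vocabulary, so the printed tokens are not guaranteed to match any particular output:
```python
from transformers import ElectraTokenizerFast

default_tokenizer = ElectraTokenizerFast.from_pretrained("Recognai/selectra_small")
accent_tokenizer = ElectraTokenizerFast.from_pretrained(
    "Recognai/selectra_small", strip_accents=False
)

word = "camión"
print("default:            ", default_tokenizer.tokenize(word))
print("strip_accents=False:", accent_tokenizer.tokenize(word))
```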
## Motivation
Despite the abundance of excellent Spanish language models (BETO, BSC-BNE, Bertin, ELECTRICIDAD, etc.), we felt there was still a lack of distilled or compact Spanish language models, as well as a lack of systematic comparisons between such models and their bigger siblings.
## Acknowledgment
This research was supported by the Google TPU Research Cloud (TRC) program.
## Authors
- David Fidalgo ([GitHub](https://github.com/dcfidalgo))
- Javier Lopez ([GitHub](https://github.com/javispp))
- Daniel Vila ([GitHub](https://github.com/dvsrepo))
- Francisco Aranda ([GitHub](https://github.com/frascuchon)) | {"language": ["es"], "license": "apache-2.0", "datasets": ["oscar"], "thumbnail": "url to a thumbnail used in social sharing"} | null | Recognai/selectra_small | [
"transformers",
"pytorch",
"electra",
"pretraining",
"es",
"dataset:oscar",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #electra #pretraining #es #dataset-oscar #license-apache-2.0 #endpoints_compatible #region-us
| SELECTRA: A Spanish ELECTRA
===========================
SELECTRA is a Spanish pre-trained language model based on ELECTRA.
We release a 'small' and 'medium' version with the following configuration:
SELECTRA small (medium) is about 5 (3) times smaller than BETO but achieves comparable results (see Metrics section below).
Usage
-----
From the original ELECTRA model card: "ELECTRA models are trained to distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to the discriminator of a GAN."
The discriminator should therefore activate the logit corresponding to the fake input token, as the following example demonstrates:
However, you probably want to use this model to fine-tune it on a downstream task.
We provide models fine-tuned on the XNLI dataset, which can be used together with the zero-shot classification pipeline:
* Zero-shot SELECTRA small
* Zero-shot SELECTRA medium
Metrics
-------
We fine-tune our models on 3 different down-stream tasks:
* XNLI
* PAWS-X
* CoNLL2002 - NER
For each task, we conduct 5 trials and state the mean and standard deviation of the metrics in the table below.
To compare our results to other Spanish language models, we provide the same metrics taken from the evaluation table of the Spanish Language Model repo.
Some details of our fine-tuning runs:
* epochs: 5
* batch-size: 32
* learning rate: 1e-4
* warmup proportion: 0.1
* linear learning rate decay
* layerwise learning rate decay
For all the details, check out our selectra repo.
Training
--------
We pre-trained our SELECTRA models on the Spanish portion of the Oscar dataset, which is about 150GB in size.
Each model version is trained for 300k steps, with a warm restart of the learning rate after the first 150k steps.
Some details of the training:
* steps: 300k
* batch-size: 128
* learning rate: 5e-4
* warmup steps: 10k
* linear learning rate decay
* TPU cores: 8 (v2-8)
For all details, check out our selectra repo.
Note: Due to a misconfiguration in the pre-training scripts, the embeddings of vocabulary tokens containing accents were not optimized. If you fine-tune this model on a downstream task, you might consider using a tokenizer that does not strip accents:
Motivation
----------
Despite the abundance of excellent Spanish language models (BETO, BSC-BNE, Bertin, ELECTRICIDAD, etc.), we felt there was still a lack of distilled or compact Spanish language models, as well as a lack of systematic comparisons between such models and their bigger siblings.
Acknowledgment
--------------
This research was supported by the Google TPU Research Cloud (TRC) program.
Authors
-------
* David Fidalgo (GitHub)
* Javier Lopez (GitHub)
* Daniel Vila (GitHub)
* Francisco Aranda (GitHub)
| [] | [
"TAGS\n#transformers #pytorch #electra #pretraining #es #dataset-oscar #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
43
] | [
"passage: TAGS\n#transformers #pytorch #electra #pretraining #es #dataset-oscar #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
-0.07683845609426498,
0.18955309689044952,
-0.00659080408513546,
0.030211126431822777,
0.09840341657400131,
0.018398506566882133,
0.05997311696410179,
0.11080925166606903,
0.021579211577773094,
-0.07366665452718735,
0.17904803156852722,
0.1963929682970047,
-0.012490801513195038,
0.06203972175717354,
-0.04274727404117584,
-0.2311093807220459,
0.08776770532131195,
0.05098164454102516,
-0.0660795122385025,
0.07429376989603043,
0.12509115040302277,
-0.0908072218298912,
0.03433637693524361,
-0.001981661655008793,
-0.04280845820903778,
0.029203612357378006,
0.0027932517696172,
-0.11046809703111649,
0.09528407454490662,
0.0433439202606678,
0.05660942941904068,
0.04187637194991112,
-0.003945271484553814,
-0.18933100998401642,
0.024477165192365646,
-0.023405157029628754,
-0.06838838756084442,
0.0481049008667469,
0.005227357614785433,
-0.06773369014263153,
0.07942680269479752,
0.07944031804800034,
0.007553336676210165,
0.03688516840338707,
-0.11076807230710983,
-0.227263405919075,
-0.08183244615793228,
0.06484026461839676,
0.03908763453364372,
0.10368122160434723,
0.017301246523857117,
0.12588448822498322,
-0.12457743287086487,
0.006704565137624741,
0.09981586784124374,
-0.3235836327075958,
-0.006895713973790407,
-0.00009184568625641987,
0.10064049810171127,
-0.040118273347616196,
0.014907030388712883,
0.039756011217832565,
0.07982202619314194,
0.012895988300442696,
0.051977209746837616,
-0.03437446057796478,
-0.115831159055233,
0.05333786830306053,
-0.05436762422323227,
-0.09276608377695084,
0.30669647455215454,
0.05915834382176399,
0.018835509195923805,
0.013405196368694305,
-0.026932386681437492,
-0.04537971317768097,
0.009641101583838463,
0.054854776710271835,
0.0600060410797596,
0.07845240086317062,
0.06095879152417183,
-0.006552563048899174,
-0.14847026765346527,
0.03572157770395279,
-0.1556803286075592,
0.10158172249794006,
0.038446176797151566,
0.09712142497301102,
-0.1331988126039505,
0.0663803294301033,
0.08458492904901505,
-0.09499779343605042,
-0.03272280842065811,
-0.09111962467432022,
0.07864515483379364,
0.043135304003953934,
-0.06845463812351227,
0.1323084980249405,
0.10315385460853577,
0.26082223653793335,
0.05503556504845619,
-0.035516344010829926,
0.08372398465871811,
0.13872717320919037,
0.027084210887551308,
0.03790615499019623,
-0.03928164392709732,
0.013039368204772472,
0.06708614528179169,
-0.10779742151498795,
0.07985562831163406,
-0.0014892334584146738,
-0.07478918135166168,
-0.041202038526535034,
-0.033591412007808685,
0.08783018589019775,
0.06948698312044144,
-0.01712065190076828,
-0.135088711977005,
0.0203086007386446,
0.08553238958120346,
-0.03558002784848213,
-0.03569525107741356,
-0.039561834186315536,
-0.0492158867418766,
0.09275542944669724,
0.04126298427581787,
0.056275349110364914,
-0.004177860915660858,
0.08156299591064453,
-0.08381997793912888,
-0.029664522036910057,
-0.036404192447662354,
0.0332149937748909,
0.11960088461637497,
-0.15177099406719208,
0.13456487655639648,
-0.12393051385879517,
-0.13795650005340576,
0.019349103793501854,
0.11472675949335098,
0.016697145998477936,
-0.08534852415323257,
0.020835570991039276,
0.006596914492547512,
-0.010913372039794922,
-0.0998326987028122,
-0.01683359034359455,
-0.10849346220493317,
0.05591447651386261,
-0.059524282813072205,
0.028101978823542595,
-0.1290932446718216,
0.03932229056954384,
-0.10448966920375824,
0.005622376222163439,
-0.01565554365515709,
-0.01768525131046772,
-0.08226142078638077,
0.20294217765331268,
-0.02058333531022072,
-0.06742178648710251,
-0.007004959974437952,
0.043303828686475754,
-0.04595400020480156,
0.15691711008548737,
-0.09736622869968414,
-0.03653354197740555,
0.19120758771896362,
-0.11523900926113129,
-0.1933595985174179,
0.06033594161272049,
-0.0184506643563509,
0.007397811394184828,
0.052724890410900116,
0.1307481974363327,
-0.05366017296910286,
-0.1042306199669838,
0.012393989600241184,
0.07811156660318375,
-0.08172012120485306,
-0.2300691157579422,
0.10923616588115692,
0.012491805478930473,
-0.04045041278004646,
0.02175549976527691,
-0.04057437181472778,
0.10328621417284012,
-0.05836952105164528,
-0.07137701660394669,
-0.033044006675481796,
-0.06102779135107994,
-0.052568309009075165,
0.04279453679919243,
0.026784313842654228,
-0.08594734221696854,
-0.05489518120884895,
-0.017800452187657356,
0.07400728017091751,
0.04104806110262871,
0.051676809787750244,
-0.13983629643917084,
0.029784638434648514,
0.03345511481165886,
-0.03187424689531326,
-0.12510646879673004,
0.025350071489810944,
-0.0433889739215374,
0.03954453021287918,
-0.023609254509210587,
0.13350386917591095,
0.023977432399988174,
-0.0974145457148552,
0.0022276989184319973,
-0.020353006199002266,
0.059764500707387924,
0.04148663207888603,
0.021746844053268433,
-0.1397637128829956,
0.011161454021930695,
-0.05857183039188385,
0.09135235100984573,
-0.02320033498108387,
0.013861481100320816,
0.03070036880671978,
0.1276272088289261,
-0.024713875725865364,
0.03992987796664238,
-0.014570309780538082,
-0.012186498381197453,
-0.05046781525015831,
-0.008326034992933273,
0.10448021441698074,
0.06690534949302673,
-0.09983803331851959,
0.12486027926206589,
0.01606069505214691,
0.2610210180282593,
0.16110078990459442,
-0.17241153120994568,
0.05737519636750221,
0.05270462855696678,
-0.050580933690071106,
0.025338858366012573,
0.04054780676960945,
0.05823718383908272,
0.04895817115902901,
0.0027748364955186844,
0.10972925275564194,
-0.02006663754582405,
-0.03325860947370529,
-0.016844142228364944,
-0.06572195887565613,
-0.012233463115990162,
0.04774356260895729,
0.19120846688747406,
-0.12210902571678162,
0.1692216992378235,
0.21270105242729187,
-0.019643964245915413,
0.08926520496606827,
-0.09068010002374649,
-0.008307944983243942,
0.045310668647289276,
-0.06480050832033157,
-0.046015556901693344,
0.09806573390960693,
-0.1624995321035385,
0.029955437406897545,
0.08063597232103348,
-0.030126234516501427,
0.04755592346191406,
-0.16667503118515015,
-0.1244606077671051,
0.021399881690740585,
-0.0017728526145219803,
-0.05820788815617561,
0.0540565624833107,
-0.06119184195995331,
0.06534013897180557,
-0.003912101499736309,
-0.09408536553382874,
0.11042118072509766,
-0.008553623221814632,
-0.07439586520195007,
0.1604304164648056,
-0.11787083745002747,
-0.20045000314712524,
-0.07261933386325836,
0.006280918139964342,
0.07298969477415085,
0.01999838463962078,
0.13620144128799438,
-0.00705955782905221,
-0.059916164726018906,
0.037398967891931534,
-0.10300920158624649,
-0.061136797070503235,
-0.0016949373530223966,
0.08097533881664276,
0.0326981358230114,
-0.03202525153756142,
-0.09505083411931992,
-0.04376409575343132,
-0.006692842114716768,
-0.014321565628051758,
0.09362632781267166,
-0.05509103462100029,
0.06938546895980835,
0.04436882585287094,
0.06021706014871597,
0.010703685693442822,
-0.026097144931554794,
0.14716221392154694,
-0.0628543272614479,
-0.05504655838012695,
0.20610104501247406,
0.003484223736450076,
0.05392543226480484,
0.12415309995412827,
0.03752589598298073,
-0.06601539254188538,
-0.008333723992109299,
-0.07004017382860184,
-0.0794602707028389,
-0.31762072443962097,
-0.103192038834095,
-0.07249736040830612,
0.042249079793691635,
0.018198125064373016,
0.07160263508558273,
0.06617629528045654,
0.1264808177947998,
-0.028306210413575172,
-0.1084650531411171,
-0.04376774653792381,
0.0253796074539423,
0.2205074578523636,
0.0002846337447408587,
0.0784611627459526,
-0.12917876243591309,
-0.017169533297419548,
0.12144012749195099,
0.056546032428741455,
0.17200814187526703,
0.1349295824766159,
-0.009811299853026867,
0.10839825123548508,
0.15554849803447723,
0.057749032974243164,
0.08291179686784744,
0.062318913638591766,
-0.020134776830673218,
-0.054722584784030914,
0.03058907575905323,
-0.06002378463745117,
0.09529902786016464,
-0.023830724880099297,
-0.12730635702610016,
-0.038288578391075134,
-0.10376398265361786,
0.030408570542931557,
0.2514234781265259,
0.015217569656670094,
-0.20484112203121185,
-0.0011771729914471507,
0.0925958901643753,
-0.019468454644083977,
-0.01118850614875555,
0.08879145234823227,
-0.040287233889102936,
-0.11680659651756287,
0.16708558797836304,
-0.00663050776347518,
0.08847757428884506,
-0.01810126192867756,
0.01984694041311741,
0.015132827684283257,
-0.15392804145812988,
0.10034053772687912,
0.11651712656021118,
-0.2822793126106262,
0.20575527846813202,
-0.04206530377268791,
-0.010670584626495838,
-0.0522872656583786,
-0.013797717168927193,
0.010988368652760983,
0.1823185384273529,
0.19412373006343842,
0.01118172612041235,
-0.0842834860086441,
0.04742049053311348,
-0.03279787674546242,
0.06286755949258804,
-0.055477578192949295,
-0.037176862359046936,
-0.046608082950115204,
-0.02160341665148735,
0.013750583864748478,
0.011115637607872486,
0.038130540400743484,
-0.06014873459935188,
-0.1267845630645752,
-0.007080703508108854,
0.0964130237698555,
0.11598934233188629,
-0.06270117312669754,
-0.06513774394989014,
-0.05746573582291603,
0.09513984620571136,
-0.14058910310268402,
-0.06985274702310562,
-0.0750175192952156,
-0.09175931662321091,
0.0518355667591095,
-0.052809521555900574,
0.11050676554441452,
-0.04460519552230835,
-0.037203237414360046,
-0.05680049583315849,
-0.14012745022773743,
0.14721499383449554,
-0.16627924144268036,
-0.013141771778464317,
-0.0440291129052639,
0.07015644758939743,
-0.03463279828429222,
0.05533347278833389,
0.008744316175580025,
0.013796636834740639,
-0.10917270183563232,
-0.09701994806528091,
-0.00454053794965148,
0.0712493509054184,
0.12470178306102753,
-0.044578395783901215,
-0.06366043537855148,
-0.030982619151473045,
0.03732270374894142,
-0.10516539961099625,
0.20240430533885956,
0.23342175781726837,
-0.10632002353668213,
0.11443380266427994,
0.14097005128860474,
-0.04885879531502724,
-0.2800050377845764,
-0.1288079470396042,
-0.10078492015600204,
-0.07400091737508774,
0.04574798420071602,
-0.2317659854888916,
0.10025875270366669,
0.1311509907245636,
-0.10330228507518768,
0.09569743275642395,
-0.22451922297477722,
-0.03333798795938492,
0.13373823463916779,
-0.003032251261174679,
0.36633893847465515,
-0.12809540331363678,
-0.03478893265128136,
-0.0336286723613739,
-0.21982096135616302,
0.1593703031539917,
0.032971106469631195,
0.04534896835684776,
0.003967553377151489,
-0.0778537467122078,
-0.04370960220694542,
-0.051381129771471024,
0.1452057957649231,
0.028033500537276268,
0.020046968013048172,
-0.06463898718357086,
0.03970823436975479,
0.005194030702114105,
-0.010823575779795647,
0.04354329779744148,
-0.0820116475224495,
0.02162904478609562,
-0.1878456026315689,
-0.026462065055966377,
-0.06448547542095184,
0.0961386188864708,
0.03504554182291031,
-0.040451351553201675,
-0.05900039151310921,
-0.008855590596795082,
0.026086945086717606,
0.0042853932827711105,
0.2542562186717987,
0.01933862268924713,
0.027421994134783745,
0.015319939702749252,
0.10616664588451385,
-0.17623215913772583,
-0.044952016323804855,
-0.12816685438156128,
-0.05068143829703331,
0.0676659345626831,
-0.1503879725933075,
0.002201367635279894,
0.13845059275627136,
-0.05515425279736519,
0.04683864489197731,
0.0494481660425663,
-0.012392787262797356,
0.0189041830599308,
0.15581437945365906,
-0.17241349816322327,
-0.020971275866031647,
-0.008551088161766529,
0.10398152470588684,
0.12375864386558533,
0.06887616217136383,
0.057986535131931305,
-0.01901349611580372,
-0.037876494228839874,
-0.0031160307116806507,
0.03581018000841141,
-0.07012946903705597,
0.024805475026369095,
0.05577515438199043,
0.008908101357519627,
-0.12422115355730057,
0.14138251543045044,
-0.01373280119150877,
-0.25174200534820557,
-0.007031968329101801,
0.04799224063754082,
-0.10588926076889038,
-0.09816496074199677,
-0.0664064809679985,
0.049093637615442276,
-0.2578333914279938,
-0.13206321001052856,
-0.06540492177009583,
-0.12128692865371704,
0.08826112002134323,
0.1134592592716217,
0.10618220269680023,
0.07911776751279831,
0.038353607058525085,
-0.02543691359460354,
0.0637282282114029,
-0.03060556761920452,
0.00750554446130991,
-0.005376668646931648,
-0.13288043439388275,
-0.1293914020061493,
0.03574774041771889,
0.1036253347992897,
-0.05032486468553543,
-0.013096512295305729,
-0.019756736233830452,
0.07323550432920456,
-0.11734418570995331,
-0.006862774956971407,
-0.09565560519695282,
-0.02595396712422371,
0.014868584461510181,
-0.12657959759235382,
-0.03536748141050339,
-0.009476209990680218,
-0.11669839918613434,
0.03583639860153198,
-0.0022651832550764084,
0.039427824318408966,
-0.09730164706707001,
-0.04772692546248436,
0.0922694131731987,
-0.03228858858346939,
0.12272964417934418,
0.08069181442260742,
-0.051768649369478226,
0.049250684678554535,
-0.08356716483831406,
-0.09318286925554276,
0.12463074177503586,
0.024099959060549736,
0.014949703589081764,
0.02491569332778454,
0.008805260062217712,
0.10662486404180527,
-0.03012217953801155,
-0.0022689674515277147,
-0.0892576053738594,
-0.15810437500476837,
-0.09775681048631668,
0.06389960646629333,
-0.07993653416633606,
-0.021994182839989662,
-0.12384723126888275,
0.181371808052063,
0.023204956203699112,
0.16885200142860413,
0.019053325057029724,
0.01815900392830372,
-0.026604201644659042,
0.03488294780254364,
-0.03651953861117363,
-0.09828165918588638,
-0.10430032759904861,
-0.018875649198889732,
-0.04203801229596138,
-0.025373676791787148,
0.27510735392570496,
-0.015977445989847183,
-0.12534897029399872,
0.08476509153842926,
0.041475895792245865,
0.026775456964969635,
0.024937231093645096,
0.2548011541366577,
0.06751658767461777,
0.005739097483456135,
-0.09423471242189407,
0.029435254633426666,
0.06435926258563995,
-0.07774854451417923,
0.08165944367647171,
0.15751929581165314,
0.08924635499715805,
0.09536637365818024,
-0.0028937198221683502,
0.041933804750442505,
-0.06548627465963364,
-0.07930850237607956,
0.035198647528886795,
0.052264343947172165,
0.04695205017924309,
0.12343577295541763,
0.15581828355789185,
-0.05377048999071121,
-0.0013967191334813833,
-0.04403802007436752,
0.03092820569872856,
-0.14404296875,
-0.11552020162343979,
-0.052140966057777405,
-0.12552975118160248,
0.03514053300023079,
-0.06156488507986069,
0.02320687659084797,
0.2429381012916565,
0.07160583883523941,
-0.0795685350894928,
-0.037974294275045395,
0.01659414917230606,
-0.014848267659544945,
0.04818182811141014,
0.02783849462866783,
-0.010389527305960655,
-0.03334972262382507,
-0.05690468102693558,
-0.05508106201887131,
0.01740362122654915,
-0.04411116987466812,
0.02001343108713627,
-0.04002311825752258,
0.07747011631727219,
-0.057979412376880646,
-0.05622478574514389,
-0.06657877564430237,
0.04227034002542496,
0.028153162449598312,
0.11362532526254654,
-0.02782123163342476,
0.0743708536028862,
0.11561092734336853,
0.2012292742729187,
-0.06425110995769501,
-0.081939198076725,
-0.08938087522983551,
0.036480724811553955,
0.030782196670770645,
0.06588292866945267,
0.03861319646239281,
-0.0008434547344222665,
-0.06628057360649109,
0.23006118834018707,
0.1968867927789688,
-0.05670779198408127,
0.003969068173319101,
-0.03699784725904465,
0.019483160227537155,
0.032656047493219376,
0.05721087008714676,
0.10964179784059525,
0.18044790625572205,
-0.14002865552902222,
-0.04857723414897919,
-0.08680815249681473,
0.05005316063761711,
-0.11211282759904861,
0.014327911660075188,
-0.0359933115541935,
-0.07458072155714035,
-0.0314447283744812,
0.11334214359521866,
-0.11928743124008179,
0.015610789880156517,
0.060788072645664215,
-0.09133467078208923,
-0.06239784508943558,
-0.032435499131679535,
0.18985740840435028,
0.08959393948316574,
0.05762568861246109,
-0.030245976522564888,
-0.10573684424161911,
0.09920911490917206,
0.033283308148384094,
-0.23948287963867188,
-0.07908772677183151,
0.11945568025112152,
0.032933756709098816,
0.13772284984588623,
-0.02695920318365097,
0.08603065460920334,
0.05459687113761902,
0.056818101555109024,
-0.10155890136957169,
0.05578337237238884,
0.03505956009030342,
-0.014669299125671387,
-0.027735263109207153,
-0.1222544014453888,
-0.027547717094421387,
-0.03486831858754158,
0.03252943232655525,
-0.04688536375761032,
0.021057846024632454,
-0.0417972207069397,
-0.0480581633746624,
-0.05411363020539284,
0.0702691599726677,
-0.0758950486779213,
0.05821794271469116,
-0.039036188274621964,
-0.0578802190721035,
-0.0725521445274353,
-0.05929379165172577,
-0.007331951055675745,
0.08088807016611099,
-0.14339745044708252,
-0.022131260484457016,
-0.015301241539418697,
0.001923867384903133,
0.007079050876200199,
0.05078783258795738,
-0.022348886355757713,
0.01272299699485302,
-0.1223883405327797,
-0.009405471384525299,
-0.08989009261131287,
-0.010455460287630558,
0.06923522055149078,
0.005880489479750395,
-0.0397980734705925,
0.028421051800251007,
0.02411198616027832,
0.0011080991243943572,
-0.07293609529733658,
-0.10089243948459625
] |
null | null | transformers | # Zero-shot SELECTRA: A zero-shot classifier based on SELECTRA
*Zero-shot SELECTRA* is a [SELECTRA model](https://huggingface.co/Recognai/selectra_small) fine-tuned on the Spanish portion of the [XNLI dataset](https://huggingface.co/datasets/xnli). You can use it with Hugging Face's [Zero-shot pipeline](https://huggingface.co/transformers/master/main_classes/pipelines.html#transformers.ZeroShotClassificationPipeline) to make [zero-shot classifications](https://joeddav.github.io/blog/2020/05/29/ZSL.html).
In comparison to our previous zero-shot classifier [based on BETO](https://huggingface.co/Recognai/bert-base-spanish-wwm-cased-xnli), zero-shot SELECTRA is **much more lightweight**. As shown in the *Metrics* section, the *small* version (5 times fewer parameters) performs slightly worse, while the *medium* version (3 times fewer parameters) **outperforms** the BETO-based zero-shot classifier.
## Usage
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
model="Recognai/zeroshot_selectra_medium")
classifier(
"El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
hypothesis_template="Este ejemplo es {}."
)
"""Output
{'sequence': 'El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo',
'labels': ['sociedad', 'cultura', 'economia', 'salud', 'deportes'],
'scores': [0.6450043320655823,
0.16710571944713593,
0.08507631719112396,
0.0759836807847023,
0.026829993352293968]}
"""
```
The `hypothesis_template` parameter is important and should be in Spanish. **In the widget on the right, this parameter is set to its default value: "This example is {}.", so different results are expected.**
## Demo and tutorial
If you want to see this model in action, we have created a basic tutorial using [Rubrix](https://www.rubrix.ml/), a free and open-source tool to *explore, annotate, and monitor data for NLP*.
The tutorial shows you how to evaluate this classifier for news categorization in Spanish, and how it could be used to build a training set for a supervised classifier (which might be useful if you want to obtain more precise results or improve the model over time).
You can [find the tutorial here](https://rubrix.readthedocs.io/en/master/tutorials/zeroshot_data_annotation.html).
The video below shows the predictions made during the annotation process (note that the predictions are almost correct for every example).
<video width="100%" controls><source src="https://github.com/recognai/rubrix-materials/raw/main/tutorials/videos/zeroshot_selectra_news_data_annotation.mp4" type="video/mp4"></video>
## Metrics
| Model | Params | XNLI (acc) | \*MLSUM (acc) |
| --- | --- | --- | --- |
| [zs BETO](https://huggingface.co/Recognai/bert-base-spanish-wwm-cased-xnli) | 110M | 0.799 | 0.530 |
| zs SELECTRA medium | 41M | **0.807** | **0.589** |
| [zs SELECTRA small](https://huggingface.co/Recognai/zeroshot_selectra_small) | **22M** | 0.795 | 0.446 |
\*evaluated with zero-shot learning (ZSL)
- **XNLI**: The stated accuracy refers to the test portion of the [XNLI dataset](https://huggingface.co/datasets/xnli), after finetuning the model on the training portion.
- **MLSUM**: For this accuracy we take the test set of the [MLSUM dataset](https://huggingface.co/datasets/mlsum) and classify the summaries of 5 selected labels. For details, check out our [evaluation notebook](https://github.com/recognai/selectra/blob/main/zero-shot_classifier/evaluation.ipynb)
## Training
Check out our [training notebook](https://github.com/recognai/selectra/blob/main/zero-shot_classifier/training.ipynb) for all the details.
## Authors
- David Fidalgo ([GitHub](https://github.com/dcfidalgo))
- Daniel Vila ([GitHub](https://github.com/dvsrepo))
- Francisco Aranda ([GitHub](https://github.com/frascuchon))
- Javier Lopez ([GitHub](https://github.com/javispp)) | {"language": "es", "license": "apache-2.0", "tags": ["zero-shot-classification", "nli", "pytorch"], "datasets": ["xnli"], "pipeline_tag": "zero-shot-classification", "widget": [{"text": "El autor se perfila, a los 50 a\u00f1os de su muerte, como uno de los grandes de su siglo", "candidate_labels": "cultura, sociedad, economia, salud, deportes"}]} | zero-shot-classification | Recognai/zeroshot_selectra_medium | [
"transformers",
"pytorch",
"safetensors",
"electra",
"text-classification",
"zero-shot-classification",
"nli",
"es",
"dataset:xnli",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #safetensors #electra #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
| Zero-shot SELECTRA: A zero-shot classifier based on SELECTRA
============================================================
*Zero-shot SELECTRA* is a SELECTRA model fine-tuned on the Spanish portion of the XNLI dataset. You can use it with Hugging Face's Zero-shot pipeline to make zero-shot classifications.
In comparison to our previous zero-shot classifier based on BETO, zero-shot SELECTRA is much more lightweight. As shown in the *Metrics* section, the *small* version (5 times fewer parameters) performs slightly worse, while the *medium* version (3 times fewer parameters) outperforms the BETO based zero-shot classifier.
Usage
-----
The 'hypothesis\_template' parameter is important and should be in Spanish. In the widget on the right, this parameter is set to its default value: "This example is {}.", so different results are expected.
Demo and tutorial
-----------------
If you want to see this model in action, we have created a basic tutorial using Rubrix, a free and open-source tool to *explore, annotate, and monitor data for NLP*.
The tutorial shows you how to evaluate this classifier for news categorization in Spanish, and how it could be used to build a training set for a supervised classifier (which might be useful if you want to obtain more precise results or improve the model over time).
You can find the tutorial here.
The video below shows the predictions made during the annotation process (note that the predictions are almost correct for every example).
<source src="URL type="video/mp4">
Metrics
-------
\*evaluated with zero-shot learning (ZSL)
* XNLI: The stated accuracy refers to the test portion of the XNLI dataset, after finetuning the model on the training portion.
* MLSUM: For this accuracy we take the test set of the MLSUM dataset and classify the summaries of 5 selected labels. For details, check out our evaluation notebook
Training
--------
Check out our training notebook for all the details.
Authors
-------
* David Fidalgo (GitHub)
* Daniel Vila (GitHub)
* Francisco Aranda (GitHub)
* Javier Lopez (GitHub)
| [] | [
"TAGS\n#transformers #pytorch #safetensors #electra #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] | [
73
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #electra #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] | [
-0.05387501418590546,
0.2051544040441513,
-0.00483085447922349,
0.055047307163476944,
0.0808182954788208,
-0.0030656105373054743,
0.12892752885818481,
0.13347908854484558,
0.012299665249884129,
-0.009982259944081306,
0.12316092103719711,
0.2141873687505722,
-0.0033817640505731106,
0.14095810055732727,
-0.10167360305786133,
-0.15776467323303223,
0.10856467485427856,
0.00896915327757597,
-0.03308935463428497,
0.08221783488988876,
0.10039664059877396,
-0.058199193328619,
0.04815131798386574,
-0.036145515739917755,
-0.023452669382095337,
0.01341644860804081,
0.032168544828891754,
-0.13793887197971344,
0.09271339327096939,
0.02490883693099022,
0.06932948529720306,
0.053730934858322144,
-0.020417731255292892,
-0.18917424976825714,
0.027695879340171814,
0.05399112403392792,
-0.09920374304056168,
0.05038870871067047,
0.01667352393269539,
-0.0640612319111824,
0.06642314046621323,
-0.044843923300504684,
-0.0026113081257790327,
0.01924026384949684,
-0.06858736276626587,
-0.17290236055850983,
-0.05036090686917305,
0.08223102241754532,
0.04099070653319359,
0.07224003970623016,
0.01786341518163681,
0.1943810135126114,
-0.10901793837547302,
0.08501283079385757,
0.1289275735616684,
-0.2923934757709503,
-0.013372303918004036,
0.06725780665874481,
0.044805023819208145,
0.056548360735177994,
-0.030449993908405304,
0.053749989718198776,
0.06660572439432144,
-0.018491702154278755,
0.04921789839863777,
-0.0017087490996345878,
-0.11134959012269974,
0.041352562606334686,
-0.0724756196141243,
-0.058890201151371,
0.2199925035238266,
0.0023912840988487005,
0.04938957467675209,
-0.04807514324784279,
-0.06092192977666855,
-0.056066278368234634,
-0.0018757217330858111,
0.05511344596743584,
0.005622342228889465,
0.03821394219994545,
0.0818798616528511,
0.021563513204455376,
-0.11444800347089767,
0.02444734424352646,
-0.1575709730386734,
0.15398111939430237,
0.03317846357822418,
0.06753256916999817,
-0.1042047068476677,
0.02882174775004387,
0.06939657777547836,
-0.12458769977092743,
0.006445017177611589,
-0.05514828488230705,
0.0639912337064743,
0.03092057630419731,
-0.05416446551680565,
0.07382047921419144,
0.18225522339344025,
0.22654764354228973,
0.019290413707494736,
0.00334250763989985,
-0.02580098621547222,
0.10144972801208496,
0.018562957644462585,
0.04411749169230461,
0.0052710035815835,
-0.05835171043872833,
0.09607745707035065,
-0.04268626496195793,
0.10231954604387283,
-0.058556970208883286,
-0.12192071229219437,
-0.005641259718686342,
0.057711582630872726,
0.11254018545150757,
0.09472858160734177,
0.05062602832913399,
-0.0319637693464756,
0.026483852416276932,
0.10984866321086884,
-0.07183212041854858,
0.046744223684072495,
0.0007889235857874155,
0.021698549389839172,
0.017528098076581955,
0.012825333513319492,
0.00901935063302517,
-0.06432293355464935,
0.020031893625855446,
-0.04571821913123131,
-0.024803346022963524,
-0.023765072226524353,
-0.07294146716594696,
0.1014041155576706,
-0.12662312388420105,
0.04035454988479614,
-0.1848539113998413,
-0.11756844818592072,
0.028942139819264412,
0.05616402253508568,
0.0014613077510148287,
-0.0924699455499649,
0.025535734370350838,
-0.04568909853696823,
0.049996741116046906,
-0.07446977496147156,
-0.07167315483093262,
-0.1176653802394867,
0.06027376651763916,
-0.0724615529179573,
0.06779051572084427,
-0.13961559534072876,
0.032292384654283524,
-0.12227360904216766,
-0.016004731878638268,
0.010701027698814869,
-0.022095652297139168,
-0.10532747209072113,
0.11410234123468399,
-0.019208461046218872,
-0.0384284071624279,
0.04333748668432236,
0.01984732411801815,
-0.04034924507141113,
0.1028117686510086,
-0.11667707562446594,
-0.03750584274530411,
0.12774565815925598,
-0.16616831719875336,
-0.17083901166915894,
0.09031616896390915,
0.019282888621091843,
-0.08155328780412674,
0.057523299008607864,
0.182608500123024,
0.0673249363899231,
-0.05210547521710396,
-0.043045997619628906,
0.10624352097511292,
-0.06770379841327667,
-0.1568426489830017,
0.04081230238080025,
0.06824634224176407,
-0.08787418901920319,
0.04247799888253212,
-0.017534326761960983,
0.05380826070904732,
-0.015127927996218204,
-0.09590858966112137,
-0.07935740053653717,
-0.05607103928923607,
0.06279781460762024,
0.016514385119080544,
0.015683377161622047,
-0.09948640316724777,
-0.060357045382261276,
-0.04269873723387718,
0.06664444506168365,
0.016311267390847206,
0.0323026143014431,
-0.0879238173365593,
0.12641385197639465,
-0.024620329961180687,
0.018019529059529305,
-0.1083228588104248,
-0.08622710406780243,
-0.01929285377264023,
0.002695694798603654,
-0.0028995790053159,
0.0672355443239212,
0.007694492116570473,
-0.010154965333640575,
0.006573069375008345,
-0.046860337257385254,
0.11624444276094437,
0.06691790372133255,
-0.03951007500290871,
-0.1685899794101715,
0.02699747309088707,
-0.0571700744330883,
0.04928869381546974,
-0.11543795466423035,
0.03630289062857628,
0.007931409403681755,
0.08599365502595901,
-0.03358053043484688,
0.09148512035608292,
-0.03091312013566494,
0.025296859443187714,
-0.10211871564388275,
-0.0110582634806633,
0.0714760348200798,
0.03384307399392128,
-0.08029713481664658,
0.1262313574552536,
-0.12453003972768784,
0.2985159754753113,
0.19230958819389343,
-0.1471753567457199,
0.06125999242067337,
0.04392791539430618,
-0.007478398270905018,
0.0018695896724238992,
-0.024207070469856262,
0.06031055003404617,
-0.06907819211483002,
-0.0162312351167202,
0.1258750706911087,
-0.06186245009303093,
-0.02258562110364437,
0.0015311110764741898,
-0.07571520656347275,
-0.05332536995410919,
0.07874606549739838,
0.11612428724765778,
-0.16173937916755676,
0.21250735223293304,
0.26776841282844543,
-0.05603073164820671,
0.08982688188552856,
-0.04758753255009651,
0.0291100163012743,
0.0262384545058012,
-0.052832312881946564,
-0.01789047010242939,
0.040254320949316025,
-0.07470431178808212,
0.03735392913222313,
0.09488163888454437,
0.028160033747553825,
0.017549453303217888,
-0.13708096742630005,
-0.057931430637836456,
0.010609772987663746,
-0.01829387992620468,
-0.04916159436106682,
0.05666384845972061,
0.006660635583102703,
0.1361021101474762,
-0.05844726786017418,
-0.1452195942401886,
0.09168752282857895,
-0.002654516138136387,
-0.0467582605779171,
0.1553085893392563,
-0.14346827566623688,
-0.29105618596076965,
-0.07698105275630951,
-0.06900990754365921,
-0.05984288454055786,
-0.00024470494827255607,
0.112838014960289,
-0.06466604769229889,
-0.05072299391031265,
-0.06922569125890732,
-0.13689568638801575,
-0.004197167698293924,
0.04042985662817955,
-0.047158196568489075,
0.06484851986169815,
0.023823056370019913,
-0.11938446760177612,
-0.058206502348184586,
0.015729449689388275,
-0.049773700535297394,
0.12259557843208313,
-0.003964363597333431,
0.09029324352741241,
0.13738524913787842,
-0.024704497307538986,
0.010054362937808037,
-0.02367834560573101,
0.11882800608873367,
-0.055961720645427704,
0.016395581886172295,
0.20846116542816162,
0.0006962534389458597,
0.06841018050909042,
0.14461348950862885,
0.010537425056099892,
-0.0304256659001112,
0.0026574889197945595,
-0.01729617267847061,
-0.0860641673207283,
-0.22546535730361938,
-0.14449766278266907,
-0.07737710326910019,
0.06419537216424942,
0.05571671947836876,
0.1083080843091011,
0.13751158118247986,
0.06533849239349365,
-0.019661730155348778,
-0.034994855523109436,
-0.003309518564492464,
0.0688367709517479,
0.1852417290210724,
0.01890689879655838,
0.13807645440101624,
-0.10032256692647934,
-0.05574826896190643,
0.09095770865678787,
0.06585220247507095,
0.10916781425476074,
0.06871739029884338,
-0.008694874122738838,
0.07819855213165283,
0.17322000861167908,
0.08995424211025238,
0.07923958450555801,
0.04452433064579964,
-0.018316127359867096,
-0.02253008261322975,
-0.01801964081823826,
-0.02222607657313347,
0.031101251021027565,
-0.06272503733634949,
-0.07910194247961044,
-0.027635999023914337,
-0.08368758112192154,
0.10655859857797623,
0.11347353458404541,
0.08368255943059921,
-0.20087243616580963,
0.019113756716251373,
0.09309538453817368,
-0.015161210671067238,
-0.04017527028918266,
0.08674801886081696,
-0.007291694637387991,
-0.03091236762702465,
0.131361186504364,
-0.017362205311655998,
0.09785826504230499,
-0.02618219330906868,
0.034995365887880325,
-0.0454828254878521,
-0.09960324317216873,
0.034809816628694534,
0.12268029153347015,
-0.235741525888443,
0.18217355012893677,
-0.025826547294855118,
-0.015934353694319725,
-0.08642257004976273,
-0.021518051624298096,
0.055558085441589355,
0.1874791383743286,
0.09109900891780853,
0.02498580515384674,
-0.19098924100399017,
-0.09260353446006775,
-0.08350583165884018,
0.04432785511016846,
-0.00562624866142869,
0.021818099543452263,
0.0007767795468680561,
-0.05086824670433998,
0.008038898929953575,
0.016549181193113327,
0.07860379666090012,
-0.043320558965206146,
-0.1553129404783249,
0.03439081832766533,
0.1670696884393692,
0.0032107264269143343,
-0.046329304575920105,
-0.06387095898389816,
-0.14618614315986633,
0.16749754548072815,
-0.10119529068470001,
-0.06433525681495667,
-0.08636875450611115,
-0.04856099933385849,
0.024965336546301842,
-0.044319745153188705,
0.050167545676231384,
-0.04883979260921478,
0.06459218263626099,
-0.04930935055017471,
-0.1851077675819397,
0.066643126308918,
-0.13950300216674805,
-0.07410252839326859,
-0.032240498811006546,
0.06916991621255875,
-0.06168915331363678,
0.01984417252242565,
0.04832376912236214,
0.041968103498220444,
-0.08729823678731918,
-0.10266570001840591,
-0.0022441167384386063,
0.015978362411260605,
0.058933358639478683,
0.05156755447387695,
-0.04839467257261276,
-0.12781864404678345,
0.0334433950483799,
-0.04484596475958824,
0.20835748314857483,
0.22835122048854828,
-0.10353544354438782,
0.1154387816786766,
0.11864061653614044,
-0.08920987695455551,
-0.29834702610969543,
-0.15628403425216675,
-0.15621572732925415,
-0.06602215766906738,
0.0343952476978302,
-0.07979012280702591,
0.12380561232566833,
0.06675191223621368,
-0.09258049726486206,
0.09138654917478561,
-0.20647558569908142,
-0.044284380972385406,
0.16853079199790955,
-0.009301352314651012,
0.3033722937107086,
-0.1483723372220993,
-0.0298710148781538,
-0.07552536576986313,
-0.03345399722456932,
0.2037406712770462,
-0.10611547529697418,
0.05365181714296341,
-0.028156889602541924,
-0.03550909087061882,
0.010034219361841679,
-0.0451323539018631,
0.13084156811237335,
-0.04854574427008629,
0.07448498159646988,
-0.1264340579509735,
-0.019313571974635124,
0.07003936171531677,
-0.061566270887851715,
0.050436049699783325,
-0.20545895397663116,
0.015690166503190994,
-0.08741392940282822,
-0.009655201807618141,
-0.052993252873420715,
0.0838758572936058,
0.0038910876028239727,
-0.01598351262509823,
-0.0648893490433693,
0.008468953892588615,
0.06007709726691246,
0.0029417762998491526,
0.2838088572025299,
0.02947373501956463,
0.1580941081047058,
0.12177993357181549,
0.04391666129231453,
-0.10350888967514038,
-0.005896913353353739,
-0.06329900026321411,
-0.05653301998972893,
0.07355337589979172,
-0.13937260210514069,
0.07323857396841049,
0.09087736904621124,
-0.06082434579730034,
0.0598389096558094,
0.08211461454629898,
0.04226662963628769,
-0.04012467339634895,
0.1584198921918869,
-0.14504966139793396,
-0.020235873758792877,
0.02568059228360653,
0.09394381195306778,
0.05786997824907303,
0.10740111768245697,
0.13586153090000153,
-0.01764673925936222,
-0.025294138118624687,
-0.0019393697148188949,
0.05780666694045067,
-0.029445679858326912,
0.045567259192466736,
0.07889535278081894,
0.03715920448303223,
-0.11646364629268646,
0.09717410057783127,
0.05709799379110336,
-0.08910477161407471,
0.02386900596320629,
0.052645206451416016,
-0.09787368774414062,
-0.1578790247440338,
0.05873388424515724,
0.04727870970964432,
-0.09396649152040482,
-0.1339034140110016,
-0.05807822570204735,
-0.1250123679637909,
0.06465853005647659,
0.12107788771390915,
0.10292351990938187,
0.03910181298851967,
0.016958793625235558,
-0.07283122837543488,
-0.02085622400045395,
0.045391689985990524,
-0.04683440551161766,
0.02641572616994381,
-0.15923210978507996,
-0.016119934618473053,
0.019793907180428505,
0.07899875193834305,
-0.05902518704533577,
-0.012667568400502205,
-0.10925661027431488,
0.020630601793527603,
-0.10741657763719559,
0.0402078703045845,
-0.07595133781433105,
0.0029473265167325735,
0.011087726801633835,
-0.06423210352659225,
-0.04458608850836754,
-0.009449009783565998,
-0.09043131023645401,
-0.0037798669654875994,
-0.010284851305186749,
0.08090619742870331,
-0.12626534700393677,
-0.0763154923915863,
0.03194527328014374,
-0.017425505444407463,
0.09669745713472366,
0.021387342363595963,
-0.06762682646512985,
0.05094264820218086,
-0.17100462317466736,
-0.0682716891169548,
0.10416458547115326,
0.06115340068936348,
0.01763000898063183,
-0.026096293702721596,
0.03134908527135849,
0.1251322478055954,
-0.05542096868157387,
0.048317525535821915,
-0.02497178502380848,
-0.11660414934158325,
-0.017253993079066277,
-0.05446786433458328,
-0.11242019385099411,
0.003453245386481285,
-0.07465243339538574,
0.12468556314706802,
-0.032235924154520035,
0.20202061533927917,
-0.0421365462243557,
0.006434870418161154,
-0.08277270942926407,
0.012129372917115688,
-0.0353839285671711,
-0.1719907522201538,
-0.14834780991077423,
-0.02489444985985756,
-0.006926980335265398,
-0.02003660425543785,
0.24895113706588745,
0.06722486764192581,
-0.0641251876950264,
0.05169932171702385,
0.04294581711292267,
-0.006369367241859436,
0.024425704032182693,
0.2124771773815155,
0.046355605125427246,
-0.01371139194816351,
-0.03099970333278179,
-0.038527220487594604,
0.04318513348698616,
-0.028477387502789497,
0.0787554383277893,
0.10066966712474823,
0.07891213148832321,
0.06413779407739639,
0.009849604219198227,
-0.04414355382323265,
-0.11726648360490799,
-0.08386973291635513,
-0.04531266912817955,
0.12435328215360641,
0.037184134125709534,
0.05052722245454788,
0.15028348565101624,
-0.029083723202347755,
0.009923864156007767,
-0.05752842128276825,
-0.01019437238574028,
-0.17055989801883698,
-0.21660038828849792,
-0.0938776507973671,
-0.10736684501171112,
-0.020827289670705795,
-0.07911214232444763,
-0.0010861436603590846,
0.11179789155721664,
0.04446709156036377,
-0.061952654272317886,
-0.04771986976265907,
0.03775682672858238,
0.003904404118657112,
0.025310203433036804,
-0.01582220382988453,
-0.011777427047491074,
-0.000343171413987875,
-0.03059724159538746,
-0.06738303601741791,
-0.01696888729929924,
-0.03310570865869522,
0.037791505455970764,
0.004773354157805443,
0.09270191192626953,
-0.11477713286876678,
-0.08785588294267654,
-0.03195410966873169,
-0.0075392103753983974,
0.03082258068025112,
0.16079775989055634,
0.030512463301420212,
0.01691404916346073,
0.0907694473862648,
0.20559430122375488,
-0.03264369070529938,
-0.14971405267715454,
-0.03445136174559593,
0.14124837517738342,
0.022844232618808746,
0.049967627972364426,
-0.013527176342904568,
0.003966030664741993,
-0.06544696539640427,
0.19026914238929749,
0.26540252566337585,
-0.03140486031770706,
0.04485868290066719,
-0.058964282274246216,
-0.001264002756215632,
0.031351976096630096,
0.07823557406663895,
0.10914626717567444,
0.17812545597553253,
-0.06687017530202866,
-0.005456386599689722,
-0.07602094858884811,
0.0419095940887928,
-0.1563708782196045,
0.04974354803562164,
-0.01734304055571556,
-0.06620311737060547,
-0.03612294793128967,
0.07337763160467148,
-0.10302326083183289,
0.08594819158315659,
0.048723768442869186,
-0.13066284358501434,
-0.09165183454751968,
0.013273049145936966,
0.1904364973306656,
-0.0007997379871085286,
0.004177446477115154,
-0.04256977513432503,
-0.04159393906593323,
-0.003112673293799162,
-0.03531518578529358,
-0.1357811838388443,
-0.04084238409996033,
0.04404858499765396,
0.03774557635188103,
0.16480259597301483,
0.0011982027208432555,
0.06661088019609451,
0.10752831399440765,
0.029819896444678307,
-0.10174287110567093,
0.12669888138771057,
0.002635407028719783,
-0.04740220680832863,
0.04665013775229454,
-0.09556066244840622,
-0.023181362077593803,
-0.005693471059203148,
0.10762418061494827,
-0.048812560737133026,
0.015251524746418,
-0.010503946803510189,
-0.05951569974422455,
-0.018864305689930916,
0.03914494812488556,
-0.040739547461271286,
0.08012192696332932,
0.008694096468389034,
-0.05200501158833504,
-0.05175472050905228,
-0.04705476760864258,
0.005877370480448008,
-0.011458886787295341,
-0.13306820392608643,
-0.03198770061135292,
-0.05165901780128479,
-0.0024968415964394808,
0.056966423988342285,
0.07905115932226181,
-0.08648340404033661,
-0.017602669075131416,
-0.10815952718257904,
-0.015609056688845158,
-0.152584508061409,
0.028680169954895973,
0.1063011959195137,
-0.02363627962768078,
-0.03850589320063591,
-0.053876884281635284,
0.007938290014863014,
0.04658912867307663,
-0.07255688309669495,
-0.07717888802289963
] |
null | null | transformers | # Zero-shot SELECTRA: A zero-shot classifier based on SELECTRA
*Zero-shot SELECTRA* is a [SELECTRA model](https://huggingface.co/Recognai/selectra_small) fine-tuned on the Spanish portion of the [XNLI dataset](https://huggingface.co/datasets/xnli). You can use it with Hugging Face's [Zero-shot pipeline](https://huggingface.co/transformers/master/main_classes/pipelines.html#transformers.ZeroShotClassificationPipeline) to make [zero-shot classifications](https://joeddav.github.io/blog/2020/05/29/ZSL.html).
In comparison to our previous zero-shot classifier [based on BETO](https://huggingface.co/Recognai/bert-base-spanish-wwm-cased-xnli), zero-shot SELECTRA is **much more lightweight**. As shown in the *Metrics* section, the *small* version (5 times fewer parameters) performs slightly worse, while the *medium* version (3 times fewer parameters) **outperforms** the BETO-based zero-shot classifier.
## Usage
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
model="Recognai/zeroshot_selectra_medium")
classifier(
"El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
hypothesis_template="Este ejemplo es {}."
)
"""Output
{'sequence': 'El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo',
'labels': ['sociedad', 'cultura', 'salud', 'economia', 'deportes'],
'scores': [0.3711881935596466,
0.25650349259376526,
0.17355826497077942,
0.1641489565372467,
0.03460107371211052]}
"""
```
The `hypothesis_template` parameter is important and should be in Spanish. **In the widget on the right, this parameter is set to its default value: "This example is {}.", so different results are expected.**
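To illustrate that point, here is a hedged variant of the example above that omits `hypothesis_template`; the pipeline then falls back to its English default "This example is {}.", which is why the widget scores can differ from the Spanish-template call.

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="Recognai/zeroshot_selectra_medium")

# No hypothesis_template given: the pipeline uses its English default
# "This example is {}.", so the scores generally differ from (and tend to be
# worse than) the Spanish-template example shown above.
print(classifier(
    "El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
))
```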
## Metrics
| Model | Params | XNLI (acc) | \*MLSUM (acc) |
| --- | --- | --- | --- |
| [zs BETO](https://huggingface.co/Recognai/bert-base-spanish-wwm-cased-xnli) | 110M | 0.799 | 0.530 |
| [zs SELECTRA medium](https://huggingface.co/Recognai/zeroshot_selectra_medium) | 41M | **0.807** | **0.589** |
| zs SELECTRA small | **22M** | 0.795 | 0.446 |
\*evaluated with zero-shot learning (ZSL)
- **XNLI**: The stated accuracy refers to the test portion of the [XNLI dataset](https://huggingface.co/datasets/xnli), after finetuning the model on the training portion.
- **MLSUM**: For this accuracy we take the test set of the [MLSUM dataset](https://huggingface.co/datasets/mlsum) and classify the summaries of 5 selected labels; a rough sketch of this setup is shown after this list. For details, check out our [evaluation notebook](https://github.com/recognai/selectra/blob/main/zero-shot_classifier/evaluation.ipynb).
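The following is a minimal sketch of what such a zero-shot evaluation could look like. The label set, the `topic`/`summary` field names, and the 100-example slice are illustrative assumptions; the exact labels and procedure are those of the linked evaluation notebook.

```python
from datasets import load_dataset
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="Recognai/zeroshot_selectra_medium")

# Hypothetical label subset -- see the notebook for the labels actually used.
labels = ["sociedad", "cultura", "economia", "deportes", "salud"]

test_set = load_dataset("mlsum", "es", split="test")
correct, total = 0, 0
for example in test_set.select(range(100)):  # small slice to keep the sketch fast
    if example["topic"] not in labels:
        continue
    prediction = classifier(example["summary"],
                            candidate_labels=labels,
                            hypothesis_template="Este ejemplo es {}.")
    correct += int(prediction["labels"][0] == example["topic"])
    total += 1
if total:
    print(f"Zero-shot accuracy on {total} summaries: {correct / total:.3f}")
```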
## Training
Check out our [training notebook](https://github.com/recognai/selectra/blob/main/zero-shot_classifier/training.ipynb) for all the details.
## Authors
- David Fidalgo ([GitHub](https://github.com/dcfidalgo))
- Daniel Vila ([GitHub](https://github.com/dvsrepo))
- Francisco Aranda ([GitHub](https://github.com/frascuchon))
- Javier Lopez ([GitHub](https://github.com/javispp)) | {"language": "es", "license": "apache-2.0", "tags": ["zero-shot-classification", "nli", "pytorch"], "datasets": ["xnli"], "pipeline_tag": "zero-shot-classification", "widget": [{"text": "El autor se perfila, a los 50 a\u00f1os de su muerte, como uno de los grandes de su siglo", "candidate_labels": "cultura, sociedad, economia, salud, deportes"}]} | zero-shot-classification | Recognai/zeroshot_selectra_small | [
"transformers",
"pytorch",
"safetensors",
"electra",
"text-classification",
"zero-shot-classification",
"nli",
"es",
"dataset:xnli",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"es"
] | TAGS
#transformers #pytorch #safetensors #electra #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
| Zero-shot SELECTRA: A zero-shot classifier based on SELECTRA
============================================================
*Zero-shot SELECTRA* is a SELECTRA model fine-tuned on the Spanish portion of the XNLI dataset. You can use it with Hugging Face's Zero-shot pipeline to make zero-shot classifications.
In comparison to our previous zero-shot classifier based on BETO, zero-shot SELECTRA is much more lightweight. As shown in the *Metrics* section, the *small* version (5 times fewer parameters) performs slightly worse, while the *medium* version (3 times fewer parameters) outperforms the BETO-based zero-shot classifier.
Usage
-----
The 'hypothesis\_template' parameter is important and should be in Spanish. In the widget on the right, this parameter is set to its default value: "This example is {}.", so different results are expected.
Metrics
-------
\*evaluated with zero-shot learning (ZSL)
* XNLI: The stated accuracy refers to the test portion of the XNLI dataset, after finetuning the model on the training portion.
* MLSUM: For this accuracy we take the test set of the MLSUM dataset and classify the summaries of 5 selected labels. For details, check out our evaluation notebook
Training
--------
Check out our training notebook for all the details.
Authors
-------
* David Fidalgo (GitHub)
* Daniel Vila (GitHub)
* Francisco Aranda (GitHub)
* Javier Lopez (GitHub)
| [] | [
"TAGS\n#transformers #pytorch #safetensors #electra #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] | [
73
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #electra #text-classification #zero-shot-classification #nli #es #dataset-xnli #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n"
] | [
-0.05387501418590546,
0.2051544040441513,
-0.00483085447922349,
0.055047307163476944,
0.0808182954788208,
-0.0030656105373054743,
0.12892752885818481,
0.13347908854484558,
0.012299665249884129,
-0.009982259944081306,
0.12316092103719711,
0.2141873687505722,
-0.0033817640505731106,
0.14095810055732727,
-0.10167360305786133,
-0.15776467323303223,
0.10856467485427856,
0.00896915327757597,
-0.03308935463428497,
0.08221783488988876,
0.10039664059877396,
-0.058199193328619,
0.04815131798386574,
-0.036145515739917755,
-0.023452669382095337,
0.01341644860804081,
0.032168544828891754,
-0.13793887197971344,
0.09271339327096939,
0.02490883693099022,
0.06932948529720306,
0.053730934858322144,
-0.020417731255292892,
-0.18917424976825714,
0.027695879340171814,
0.05399112403392792,
-0.09920374304056168,
0.05038870871067047,
0.01667352393269539,
-0.0640612319111824,
0.06642314046621323,
-0.044843923300504684,
-0.0026113081257790327,
0.01924026384949684,
-0.06858736276626587,
-0.17290236055850983,
-0.05036090686917305,
0.08223102241754532,
0.04099070653319359,
0.07224003970623016,
0.01786341518163681,
0.1943810135126114,
-0.10901793837547302,
0.08501283079385757,
0.1289275735616684,
-0.2923934757709503,
-0.013372303918004036,
0.06725780665874481,
0.044805023819208145,
0.056548360735177994,
-0.030449993908405304,
0.053749989718198776,
0.06660572439432144,
-0.018491702154278755,
0.04921789839863777,
-0.0017087490996345878,
-0.11134959012269974,
0.041352562606334686,
-0.0724756196141243,
-0.058890201151371,
0.2199925035238266,
0.0023912840988487005,
0.04938957467675209,
-0.04807514324784279,
-0.06092192977666855,
-0.056066278368234634,
-0.0018757217330858111,
0.05511344596743584,
0.005622342228889465,
0.03821394219994545,
0.0818798616528511,
0.021563513204455376,
-0.11444800347089767,
0.02444734424352646,
-0.1575709730386734,
0.15398111939430237,
0.03317846357822418,
0.06753256916999817,
-0.1042047068476677,
0.02882174775004387,
0.06939657777547836,
-0.12458769977092743,
0.006445017177611589,
-0.05514828488230705,
0.0639912337064743,
0.03092057630419731,
-0.05416446551680565,
0.07382047921419144,
0.18225522339344025,
0.22654764354228973,
0.019290413707494736,
0.00334250763989985,
-0.02580098621547222,
0.10144972801208496,
0.018562957644462585,
0.04411749169230461,
0.0052710035815835,
-0.05835171043872833,
0.09607745707035065,
-0.04268626496195793,
0.10231954604387283,
-0.058556970208883286,
-0.12192071229219437,
-0.005641259718686342,
0.057711582630872726,
0.11254018545150757,
0.09472858160734177,
0.05062602832913399,
-0.0319637693464756,
0.026483852416276932,
0.10984866321086884,
-0.07183212041854858,
0.046744223684072495,
0.0007889235857874155,
0.021698549389839172,
0.017528098076581955,
0.012825333513319492,
0.00901935063302517,
-0.06432293355464935,
0.020031893625855446,
-0.04571821913123131,
-0.024803346022963524,
-0.023765072226524353,
-0.07294146716594696,
0.1014041155576706,
-0.12662312388420105,
0.04035454988479614,
-0.1848539113998413,
-0.11756844818592072,
0.028942139819264412,
0.05616402253508568,
0.0014613077510148287,
-0.0924699455499649,
0.025535734370350838,
-0.04568909853696823,
0.049996741116046906,
-0.07446977496147156,
-0.07167315483093262,
-0.1176653802394867,
0.06027376651763916,
-0.0724615529179573,
0.06779051572084427,
-0.13961559534072876,
0.032292384654283524,
-0.12227360904216766,
-0.016004731878638268,
0.010701027698814869,
-0.022095652297139168,
-0.10532747209072113,
0.11410234123468399,
-0.019208461046218872,
-0.0384284071624279,
0.04333748668432236,
0.01984732411801815,
-0.04034924507141113,
0.1028117686510086,
-0.11667707562446594,
-0.03750584274530411,
0.12774565815925598,
-0.16616831719875336,
-0.17083901166915894,
0.09031616896390915,
0.019282888621091843,
-0.08155328780412674,
0.057523299008607864,
0.182608500123024,
0.0673249363899231,
-0.05210547521710396,
-0.043045997619628906,
0.10624352097511292,
-0.06770379841327667,
-0.1568426489830017,
0.04081230238080025,
0.06824634224176407,
-0.08787418901920319,
0.04247799888253212,
-0.017534326761960983,
0.05380826070904732,
-0.015127927996218204,
-0.09590858966112137,
-0.07935740053653717,
-0.05607103928923607,
0.06279781460762024,
0.016514385119080544,
0.015683377161622047,
-0.09948640316724777,
-0.060357045382261276,
-0.04269873723387718,
0.06664444506168365,
0.016311267390847206,
0.0323026143014431,
-0.0879238173365593,
0.12641385197639465,
-0.024620329961180687,
0.018019529059529305,
-0.1083228588104248,
-0.08622710406780243,
-0.01929285377264023,
0.002695694798603654,
-0.0028995790053159,
0.0672355443239212,
0.007694492116570473,
-0.010154965333640575,
0.006573069375008345,
-0.046860337257385254,
0.11624444276094437,
0.06691790372133255,
-0.03951007500290871,
-0.1685899794101715,
0.02699747309088707,
-0.0571700744330883,
0.04928869381546974,
-0.11543795466423035,
0.03630289062857628,
0.007931409403681755,
0.08599365502595901,
-0.03358053043484688,
0.09148512035608292,
-0.03091312013566494,
0.025296859443187714,
-0.10211871564388275,
-0.0110582634806633,
0.0714760348200798,
0.03384307399392128,
-0.08029713481664658,
0.1262313574552536,
-0.12453003972768784,
0.2985159754753113,
0.19230958819389343,
-0.1471753567457199,
0.06125999242067337,
0.04392791539430618,
-0.007478398270905018,
0.0018695896724238992,
-0.024207070469856262,
0.06031055003404617,
-0.06907819211483002,
-0.0162312351167202,
0.1258750706911087,
-0.06186245009303093,
-0.02258562110364437,
0.0015311110764741898,
-0.07571520656347275,
-0.05332536995410919,
0.07874606549739838,
0.11612428724765778,
-0.16173937916755676,
0.21250735223293304,
0.26776841282844543,
-0.05603073164820671,
0.08982688188552856,
-0.04758753255009651,
0.0291100163012743,
0.0262384545058012,
-0.052832312881946564,
-0.01789047010242939,
0.040254320949316025,
-0.07470431178808212,
0.03735392913222313,
0.09488163888454437,
0.028160033747553825,
0.017549453303217888,
-0.13708096742630005,
-0.057931430637836456,
0.010609772987663746,
-0.01829387992620468,
-0.04916159436106682,
0.05666384845972061,
0.006660635583102703,
0.1361021101474762,
-0.05844726786017418,
-0.1452195942401886,
0.09168752282857895,
-0.002654516138136387,
-0.0467582605779171,
0.1553085893392563,
-0.14346827566623688,
-0.29105618596076965,
-0.07698105275630951,
-0.06900990754365921,
-0.05984288454055786,
-0.00024470494827255607,
0.112838014960289,
-0.06466604769229889,
-0.05072299391031265,
-0.06922569125890732,
-0.13689568638801575,
-0.004197167698293924,
0.04042985662817955,
-0.047158196568489075,
0.06484851986169815,
0.023823056370019913,
-0.11938446760177612,
-0.058206502348184586,
0.015729449689388275,
-0.049773700535297394,
0.12259557843208313,
-0.003964363597333431,
0.09029324352741241,
0.13738524913787842,
-0.024704497307538986,
0.010054362937808037,
-0.02367834560573101,
0.11882800608873367,
-0.055961720645427704,
0.016395581886172295,
0.20846116542816162,
0.0006962534389458597,
0.06841018050909042,
0.14461348950862885,
0.010537425056099892,
-0.0304256659001112,
0.0026574889197945595,
-0.01729617267847061,
-0.0860641673207283,
-0.22546535730361938,
-0.14449766278266907,
-0.07737710326910019,
0.06419537216424942,
0.05571671947836876,
0.1083080843091011,
0.13751158118247986,
0.06533849239349365,
-0.019661730155348778,
-0.034994855523109436,
-0.003309518564492464,
0.0688367709517479,
0.1852417290210724,
0.01890689879655838,
0.13807645440101624,
-0.10032256692647934,
-0.05574826896190643,
0.09095770865678787,
0.06585220247507095,
0.10916781425476074,
0.06871739029884338,
-0.008694874122738838,
0.07819855213165283,
0.17322000861167908,
0.08995424211025238,
0.07923958450555801,
0.04452433064579964,
-0.018316127359867096,
-0.02253008261322975,
-0.01801964081823826,
-0.02222607657313347,
0.031101251021027565,
-0.06272503733634949,
-0.07910194247961044,
-0.027635999023914337,
-0.08368758112192154,
0.10655859857797623,
0.11347353458404541,
0.08368255943059921,
-0.20087243616580963,
0.019113756716251373,
0.09309538453817368,
-0.015161210671067238,
-0.04017527028918266,
0.08674801886081696,
-0.007291694637387991,
-0.03091236762702465,
0.131361186504364,
-0.017362205311655998,
0.09785826504230499,
-0.02618219330906868,
0.034995365887880325,
-0.0454828254878521,
-0.09960324317216873,
0.034809816628694534,
0.12268029153347015,
-0.235741525888443,
0.18217355012893677,
-0.025826547294855118,
-0.015934353694319725,
-0.08642257004976273,
-0.021518051624298096,
0.055558085441589355,
0.1874791383743286,
0.09109900891780853,
0.02498580515384674,
-0.19098924100399017,
-0.09260353446006775,
-0.08350583165884018,
0.04432785511016846,
-0.00562624866142869,
0.021818099543452263,
0.0007767795468680561,
-0.05086824670433998,
0.008038898929953575,
0.016549181193113327,
0.07860379666090012,
-0.043320558965206146,
-0.1553129404783249,
0.03439081832766533,
0.1670696884393692,
0.0032107264269143343,
-0.046329304575920105,
-0.06387095898389816,
-0.14618614315986633,
0.16749754548072815,
-0.10119529068470001,
-0.06433525681495667,
-0.08636875450611115,
-0.04856099933385849,
0.024965336546301842,
-0.044319745153188705,
0.050167545676231384,
-0.04883979260921478,
0.06459218263626099,
-0.04930935055017471,
-0.1851077675819397,
0.066643126308918,
-0.13950300216674805,
-0.07410252839326859,
-0.032240498811006546,
0.06916991621255875,
-0.06168915331363678,
0.01984417252242565,
0.04832376912236214,
0.041968103498220444,
-0.08729823678731918,
-0.10266570001840591,
-0.0022441167384386063,
0.015978362411260605,
0.058933358639478683,
0.05156755447387695,
-0.04839467257261276,
-0.12781864404678345,
0.0334433950483799,
-0.04484596475958824,
0.20835748314857483,
0.22835122048854828,
-0.10353544354438782,
0.1154387816786766,
0.11864061653614044,
-0.08920987695455551,
-0.29834702610969543,
-0.15628403425216675,
-0.15621572732925415,
-0.06602215766906738,
0.0343952476978302,
-0.07979012280702591,
0.12380561232566833,
0.06675191223621368,
-0.09258049726486206,
0.09138654917478561,
-0.20647558569908142,
-0.044284380972385406,
0.16853079199790955,
-0.009301352314651012,
0.3033722937107086,
-0.1483723372220993,
-0.0298710148781538,
-0.07552536576986313,
-0.03345399722456932,
0.2037406712770462,
-0.10611547529697418,
0.05365181714296341,
-0.028156889602541924,
-0.03550909087061882,
0.010034219361841679,
-0.0451323539018631,
0.13084156811237335,
-0.04854574427008629,
0.07448498159646988,
-0.1264340579509735,
-0.019313571974635124,
0.07003936171531677,
-0.061566270887851715,
0.050436049699783325,
-0.20545895397663116,
0.015690166503190994,
-0.08741392940282822,
-0.009655201807618141,
-0.052993252873420715,
0.0838758572936058,
0.0038910876028239727,
-0.01598351262509823,
-0.0648893490433693,
0.008468953892588615,
0.06007709726691246,
0.0029417762998491526,
0.2838088572025299,
0.02947373501956463,
0.1580941081047058,
0.12177993357181549,
0.04391666129231453,
-0.10350888967514038,
-0.005896913353353739,
-0.06329900026321411,
-0.05653301998972893,
0.07355337589979172,
-0.13937260210514069,
0.07323857396841049,
0.09087736904621124,
-0.06082434579730034,
0.0598389096558094,
0.08211461454629898,
0.04226662963628769,
-0.04012467339634895,
0.1584198921918869,
-0.14504966139793396,
-0.020235873758792877,
0.02568059228360653,
0.09394381195306778,
0.05786997824907303,
0.10740111768245697,
0.13586153090000153,
-0.01764673925936222,
-0.025294138118624687,
-0.0019393697148188949,
0.05780666694045067,
-0.029445679858326912,
0.045567259192466736,
0.07889535278081894,
0.03715920448303223,
-0.11646364629268646,
0.09717410057783127,
0.05709799379110336,
-0.08910477161407471,
0.02386900596320629,
0.052645206451416016,
-0.09787368774414062,
-0.1578790247440338,
0.05873388424515724,
0.04727870970964432,
-0.09396649152040482,
-0.1339034140110016,
-0.05807822570204735,
-0.1250123679637909,
0.06465853005647659,
0.12107788771390915,
0.10292351990938187,
0.03910181298851967,
0.016958793625235558,
-0.07283122837543488,
-0.02085622400045395,
0.045391689985990524,
-0.04683440551161766,
0.02641572616994381,
-0.15923210978507996,
-0.016119934618473053,
0.019793907180428505,
0.07899875193834305,
-0.05902518704533577,
-0.012667568400502205,
-0.10925661027431488,
0.020630601793527603,
-0.10741657763719559,
0.0402078703045845,
-0.07595133781433105,
0.0029473265167325735,
0.011087726801633835,
-0.06423210352659225,
-0.04458608850836754,
-0.009449009783565998,
-0.09043131023645401,
-0.0037798669654875994,
-0.010284851305186749,
0.08090619742870331,
-0.12626534700393677,
-0.0763154923915863,
0.03194527328014374,
-0.017425505444407463,
0.09669745713472366,
0.021387342363595963,
-0.06762682646512985,
0.05094264820218086,
-0.17100462317466736,
-0.0682716891169548,
0.10416458547115326,
0.06115340068936348,
0.01763000898063183,
-0.026096293702721596,
0.03134908527135849,
0.1251322478055954,
-0.05542096868157387,
0.048317525535821915,
-0.02497178502380848,
-0.11660414934158325,
-0.017253993079066277,
-0.05446786433458328,
-0.11242019385099411,
0.003453245386481285,
-0.07465243339538574,
0.12468556314706802,
-0.032235924154520035,
0.20202061533927917,
-0.0421365462243557,
0.006434870418161154,
-0.08277270942926407,
0.012129372917115688,
-0.0353839285671711,
-0.1719907522201538,
-0.14834780991077423,
-0.02489444985985756,
-0.006926980335265398,
-0.02003660425543785,
0.24895113706588745,
0.06722486764192581,
-0.0641251876950264,
0.05169932171702385,
0.04294581711292267,
-0.006369367241859436,
0.024425704032182693,
0.2124771773815155,
0.046355605125427246,
-0.01371139194816351,
-0.03099970333278179,
-0.038527220487594604,
0.04318513348698616,
-0.028477387502789497,
0.0787554383277893,
0.10066966712474823,
0.07891213148832321,
0.06413779407739639,
0.009849604219198227,
-0.04414355382323265,
-0.11726648360490799,
-0.08386973291635513,
-0.04531266912817955,
0.12435328215360641,
0.037184134125709534,
0.05052722245454788,
0.15028348565101624,
-0.029083723202347755,
0.009923864156007767,
-0.05752842128276825,
-0.01019437238574028,
-0.17055989801883698,
-0.21660038828849792,
-0.0938776507973671,
-0.10736684501171112,
-0.020827289670705795,
-0.07911214232444763,
-0.0010861436603590846,
0.11179789155721664,
0.04446709156036377,
-0.061952654272317886,
-0.04771986976265907,
0.03775682672858238,
0.003904404118657112,
0.025310203433036804,
-0.01582220382988453,
-0.011777427047491074,
-0.000343171413987875,
-0.03059724159538746,
-0.06738303601741791,
-0.01696888729929924,
-0.03310570865869522,
0.037791505455970764,
0.004773354157805443,
0.09270191192626953,
-0.11477713286876678,
-0.08785588294267654,
-0.03195410966873169,
-0.0075392103753983974,
0.03082258068025112,
0.16079775989055634,
0.030512463301420212,
0.01691404916346073,
0.0907694473862648,
0.20559430122375488,
-0.03264369070529938,
-0.14971405267715454,
-0.03445136174559593,
0.14124837517738342,
0.022844232618808746,
0.049967627972364426,
-0.013527176342904568,
0.003966030664741993,
-0.06544696539640427,
0.19026914238929749,
0.26540252566337585,
-0.03140486031770706,
0.04485868290066719,
-0.058964282274246216,
-0.001264002756215632,
0.031351976096630096,
0.07823557406663895,
0.10914626717567444,
0.17812545597553253,
-0.06687017530202866,
-0.005456386599689722,
-0.07602094858884811,
0.0419095940887928,
-0.1563708782196045,
0.04974354803562164,
-0.01734304055571556,
-0.06620311737060547,
-0.03612294793128967,
0.07337763160467148,
-0.10302326083183289,
0.08594819158315659,
0.048723768442869186,
-0.13066284358501434,
-0.09165183454751968,
0.013273049145936966,
0.1904364973306656,
-0.0007997379871085286,
0.004177446477115154,
-0.04256977513432503,
-0.04159393906593323,
-0.003112673293799162,
-0.03531518578529358,
-0.1357811838388443,
-0.04084238409996033,
0.04404858499765396,
0.03774557635188103,
0.16480259597301483,
0.0011982027208432555,
0.06661088019609451,
0.10752831399440765,
0.029819896444678307,
-0.10174287110567093,
0.12669888138771057,
0.002635407028719783,
-0.04740220680832863,
0.04665013775229454,
-0.09556066244840622,
-0.023181362077593803,
-0.005693471059203148,
0.10762418061494827,
-0.048812560737133026,
0.015251524746418,
-0.010503946803510189,
-0.05951569974422455,
-0.018864305689930916,
0.03914494812488556,
-0.040739547461271286,
0.08012192696332932,
0.008694096468389034,
-0.05200501158833504,
-0.05175472050905228,
-0.04705476760864258,
0.005877370480448008,
-0.011458886787295341,
-0.13306820392608643,
-0.03198770061135292,
-0.05165901780128479,
-0.0024968415964394808,
0.056966423988342285,
0.07905115932226181,
-0.08648340404033661,
-0.017602669075131416,
-0.10815952718257904,
-0.015609056688845158,
-0.152584508061409,
0.028680169954895973,
0.1063011959195137,
-0.02363627962768078,
-0.03850589320063591,
-0.053876884281635284,
0.007938290014863014,
0.04658912867307663,
-0.07255688309669495,
-0.07717888802289963
] |
null | null | transformers |
## Swedish BERT model for Named Entity Recognition
[Recorded Future](https://www.recordedfuture.com/) together with [AI Sweden](https://www.ai.se/en) releases a Named Entity Recognition (NER) model for entity detection in Swedish. The model is based on [KB/bert-base-swedish-cased](https://huggingface.co/KB/bert-base-swedish-cased) and fine-tuned on data collected from various internet sources and forums.
The model has been trained on Swedish data and only supports inference of Swedish input texts. The model's inference metrics for non-Swedish inputs are not defined; such inputs are considered out-of-domain data.
The current models are supported with Transformers version >= 4.3.3 and Torch version 1.8.0; compatibility with older versions has not been verified.
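### Usage
A minimal usage sketch, assuming the model is loaded through the standard `transformers` token-classification pipeline; the example sentence is made up for illustration.
```python
from transformers import pipeline

# Load the NER model through the standard token-classification pipeline.
ner = pipeline("token-classification", model="RecordedFuture/Swedish-NER")

# Made-up Swedish example sentence.
print(ner("Stefan Löfven besökte Volvo i Göteborg."))
```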
### Available tags
* Location
* Organization
* Person
* Religion
* Title
### Evaluation metrics
The model had the following metrics when evaluated on test data originating from the same domain as the training data.
#### F1-score
| Loc | Org | Per | Nat | Rel | Tit | Total |
|------|------|------|------|------|------|-------|
| 0.91 | 0.88 | 0.96 | 0.95 | 0.91 | 0.84 | 0.92 |
| {"language": "sv", "license": "mit"} | token-classification | RecordedFuture/Swedish-NER | [
"transformers",
"pytorch",
"bert",
"token-classification",
"sv",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"sv"
] | TAGS
#transformers #pytorch #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
| Swedish BERT model for Named Entity Recognition
--------------------------------------------------------------
Recorded Future together with AI Sweden releases a Named Entity Recognition (NER) model for entity detection in Swedish. The model is based on KB/bert-base-swedish-cased and fine-tuned on data collected from various internet sources and forums.
The model has been trained on Swedish data and only supports inference of Swedish input texts. The model's inference metrics for non-Swedish inputs are not defined; such inputs are considered out-of-domain data.
The current models are supported with Transformers version >= 4.3.3 and Torch version 1.8.0; compatibility with older versions has not been verified.
### Available tags
* Location
* Organization
* Person
* Religion
* Title
### Evaluation metrics
The model had the following metrics when evaluated on test data originating from the same domain as the training data.
#### F1-score
| [
"### Available tags\n\n\n* Location\n* Organization\n* Person\n* Religion\n* Title",
"### Evaluation metrics\n\n\nThe model had the following metrics when evaluated on test data originating from the same domain as the training data.",
"#### F1-score"
] | [
"TAGS\n#transformers #pytorch #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### Available tags\n\n\n* Location\n* Organization\n* Person\n* Religion\n* Title",
"### Evaluation metrics\n\n\nThe model had the following metrics when evaluated on test data originating from the same domain as the training data.",
"#### F1-score"
] | [
48,
14,
30,
6
] | [
"passage: TAGS\n#transformers #pytorch #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Available tags\n\n\n* Location\n* Organization\n* Person\n* Religion\n* Title### Evaluation metrics\n\n\nThe model had the following metrics when evaluated on test data originating from the same domain as the training data.#### F1-score"
] | [
-0.1146531030535698,
0.11819420009851456,
-0.0013355686096474528,
0.06103421747684479,
0.1749604493379593,
0.04314052313566208,
0.13111746311187744,
0.08062232285737991,
0.09895928204059601,
-0.009877925738692284,
0.11914875358343124,
0.12460658699274063,
0.010232080705463886,
0.09801275283098221,
-0.03854379057884216,
-0.22372011840343475,
0.04065629839897156,
0.04847745597362518,
-0.10957590490579605,
0.15800897777080536,
0.10840144008398056,
-0.11192823946475983,
0.10102181136608124,
0.01589525304734707,
-0.15715232491493225,
0.009140136651694775,
0.02607022598385811,
-0.06180720776319504,
0.12807194888591766,
0.0033168531954288483,
0.10971750319004059,
0.05599478632211685,
0.04660961776971817,
-0.15345308184623718,
0.013576132245361805,
-0.01309300772845745,
-0.032137706875801086,
0.044127997010946274,
0.033753957599401474,
-0.019874360412359238,
0.10008567571640015,
0.10215123742818832,
0.040026597678661346,
0.04004613310098648,
-0.11291195452213287,
-0.21376730501651764,
-0.08289297670125961,
0.09951633960008621,
0.01330510899424553,
0.08809749782085419,
-0.019580699503421783,
0.18087135255336761,
-0.25117653608322144,
0.05987721309065819,
0.027848146855831146,
-0.24799960851669312,
-0.023781169205904007,
0.2693454325199127,
0.006763093173503876,
0.022034671157598495,
-0.12196608632802963,
0.03381580486893654,
0.05933373421430588,
0.05203614383935928,
0.03947104141116142,
-0.058139506727457047,
0.014188356697559357,
0.06501240283250809,
-0.1379866898059845,
-0.07333221286535263,
0.23259642720222473,
0.03045128658413887,
0.029070518910884857,
-0.04565449431538582,
0.03138501197099686,
-0.08419676125049591,
0.021319454535841942,
-0.09532850235700607,
0.02092563360929489,
-0.012836932204663754,
-0.01105678454041481,
0.08930729329586029,
-0.14119549095630646,
-0.09571138769388199,
-0.15930844843387604,
0.1274692267179489,
0.008342853747308254,
0.02122517302632332,
-0.14183788001537323,
0.13720959424972534,
-0.08274397253990173,
-0.12758907675743103,
-0.0481242910027504,
-0.08534298092126846,
0.0169928427785635,
-0.06868404150009155,
-0.04165887460112572,
-0.007858069613575935,
0.05654782056808472,
0.08613904565572739,
-0.04618362709879875,
0.013767511583864689,
0.055577654391527176,
0.034939512610435486,
0.01861676760017872,
0.24278080463409424,
-0.014008814468979836,
-0.04899999871850014,
-0.05371087044477463,
-0.012451494112610817,
-0.08937235176563263,
0.035226356238126755,
-0.09615433216094971,
-0.01026155799627304,
0.09170153737068176,
0.056638941168785095,
-0.08674932271242142,
0.11450047045946121,
-0.05516320466995239,
0.021048372611403465,
0.005685537587851286,
-0.06198413670063019,
0.027661852538585663,
0.016793547198176384,
-0.03742999956011772,
-0.020756617188453674,
-0.006759010720998049,
-0.029611390084028244,
0.025306226685643196,
0.07734348624944687,
-0.08301427215337753,
-0.014513351023197174,
-0.05217133089900017,
-0.09219130873680115,
0.03461451828479767,
-0.05763181671500206,
0.07388103008270264,
-0.09281744807958603,
-0.15349605679512024,
-0.06856592744588852,
0.05534917861223221,
-0.03204520046710968,
-0.01220185961574316,
-0.006569125689566135,
-0.03173720836639404,
0.01145662646740675,
-0.03347548469901085,
-0.014478374272584915,
-0.055844832211732864,
0.08099325746297836,
-0.047500401735305786,
0.021056432276964188,
0.010244550183415413,
0.08870048075914383,
-0.15884461998939514,
0.009091849438846111,
-0.12718947231769562,
0.009875696152448654,
-0.14480875432491302,
0.09364219009876251,
-0.027706440538167953,
-0.11036059260368347,
-0.07413031905889511,
0.043515030294656754,
0.003607391146942973,
0.17020654678344727,
-0.11764609813690186,
-0.09792222082614899,
0.127386674284935,
-0.10800312459468842,
-0.19549532234668732,
0.031024334952235222,
-0.04150470346212387,
0.19735084474086761,
0.0575316958129406,
0.22861680388450623,
0.1949692666530609,
-0.07427236437797546,
0.03741995617747307,
0.07474006712436676,
-0.06003957241773605,
-0.06136668100953102,
0.04251158982515335,
0.006246603559702635,
-0.03154689818620682,
0.05806417390704155,
-0.03634023293852806,
0.05935048684477806,
-0.09703212231397629,
-0.055365223437547684,
-0.012169485911726952,
-0.04130454733967781,
0.0755784809589386,
0.0916452407836914,
0.13505461812019348,
-0.05730302259325981,
-0.010357989929616451,
0.08192648738622665,
0.05382279306650162,
-0.005538129713386297,
-0.04203138127923012,
-0.04328828305006027,
0.12891855835914612,
-0.06174946948885918,
-0.020290667191147804,
-0.16832087934017181,
0.004571538884192705,
-0.0036741895601153374,
0.04123532772064209,
-0.019455203786492348,
0.188378244638443,
0.0873098373413086,
-0.08853661268949509,
-0.028306804597377777,
-0.018776364624500275,
0.056126177310943604,
0.04213837534189224,
-0.13250300288200378,
-0.11148454993963242,
0.00757132563740015,
-0.03285835310816765,
0.10770886391401291,
-0.17919018864631653,
0.04963032901287079,
0.09919746965169907,
0.10267984122037888,
0.0007239299011416733,
0.034906212240457535,
0.0011441836832091212,
0.03304637223482132,
-0.01964830793440342,
-0.024213727563619614,
0.010636982508003712,
-0.004892448429018259,
-0.05222467705607414,
0.11093252897262573,
0.01655326411128044,
0.29623934626579285,
0.15026380121707916,
-0.08697130531072617,
-0.05233977735042572,
-0.00374596961773932,
-0.04930487275123596,
0.026887958869338036,
-0.03144034370779991,
0.041398655623197556,
0.048593100160360336,
-0.033754318952560425,
0.08799287676811218,
-0.03703244775533676,
-0.04277338460087776,
0.022721504792571068,
-0.019003450870513916,
-0.03796421363949776,
0.14272218942642212,
0.07824923098087311,
-0.21989205479621887,
0.19673773646354675,
0.2235756814479828,
0.023711154237389565,
0.10161298513412476,
-0.06584049761295319,
-0.02037087082862854,
-0.014939132146537304,
-0.07978297024965286,
-0.0802154541015625,
0.14624060690402985,
-0.18207311630249023,
-0.009439749643206596,
0.04586326330900192,
0.0854160264134407,
0.03519067168235779,
-0.14111343026161194,
-0.07273070514202118,
0.0674256905913353,
-0.006551808677613735,
-0.12880560755729675,
0.10848812758922577,
-0.003012803616002202,
0.09283319115638733,
-0.05083085224032402,
-0.10168054699897766,
0.05457574874162674,
-0.00968476664274931,
-0.13755545020103455,
0.20220057666301727,
-0.03628729656338692,
-0.12785284221172333,
-0.12629924714565277,
0.0005311753484420478,
-0.022211849689483643,
0.013008767738938332,
0.09611206501722336,
-0.07500264048576355,
-0.02902469038963318,
-0.004146249499171972,
0.022339262068271637,
-0.017170246690511703,
0.030420156195759773,
-0.1080607920885086,
0.0001088425051420927,
-0.005850076675415039,
-0.10712455213069916,
-0.07796512544155121,
-0.05453503131866455,
0.013120091520249844,
0.09932927787303925,
-0.11606591194868088,
0.06865355372428894,
0.14395394921302795,
-0.023759715259075165,
0.055107612162828445,
-0.04396625980734825,
0.299613356590271,
-0.12889641523361206,
-0.0281717237085104,
0.12183859944343567,
0.039446961134672165,
0.02324480377137661,
0.17778749763965607,
0.03711501508951187,
-0.06156511604785919,
-0.0427866205573082,
-0.01180132757872343,
-0.058163125067949295,
-0.21387352049350739,
-0.07219227403402328,
-0.061929572373628616,
-0.07574502378702164,
-0.0022283028811216354,
0.03734639659523964,
0.04514740779995918,
0.08485706895589828,
0.0665656328201294,
-0.03491237014532089,
-0.05407289043068886,
0.0028675163630396128,
0.16338829696178436,
0.001371470745652914,
0.11451084166765213,
-0.06083524227142334,
-0.03624667227268219,
0.04857727140188217,
-0.019137902185320854,
0.22251632809638977,
0.034209609031677246,
-0.11604616791009903,
0.11811341345310211,
0.13049298524856567,
0.05822720378637314,
0.07908611744642258,
0.01156824640929699,
-0.060215581208467484,
0.008219143375754356,
-0.009505553171038628,
-0.04279433935880661,
0.03136931732296944,
0.02918877638876438,
-0.051973629742860794,
-0.15317660570144653,
-0.14624181389808655,
0.050745151937007904,
0.13607338070869446,
0.07543681561946869,
-0.22257541120052338,
-0.10393994301557541,
0.01690526120364666,
0.013887736015021801,
-0.03276218846440315,
0.03583119437098503,
0.06529910862445831,
-0.13384512066841125,
0.03778956085443497,
0.016969088464975357,
0.09238207340240479,
0.07041327655315399,
0.05651380494236946,
0.023164238780736923,
-0.1725418120622635,
-0.03249775618314743,
0.1417481154203415,
-0.2814761698246002,
0.3306952118873596,
-0.018734103068709373,
-0.009451999329030514,
-0.08148524165153503,
-0.04940177872776985,
0.0710735023021698,
0.272244393825531,
0.08621399104595184,
0.006800872273743153,
-0.0618954561650753,
-0.19259698688983917,
0.053681861609220505,
0.0325968936085701,
0.004856188781559467,
-0.07458756864070892,
0.03062175214290619,
-0.014754233881831169,
0.0325271338224411,
0.01741967722773552,
0.04065501317381859,
-0.05587758496403694,
0.013892077840864658,
-0.0075491308234632015,
0.0032608485780656338,
-0.011981619521975517,
-0.046448297798633575,
-0.09974225610494614,
-0.18034687638282776,
0.07437901943922043,
-0.027805794030427933,
-0.027212563902139664,
-0.10539790242910385,
0.005281643010675907,
0.0326412096619606,
-0.09885907173156738,
0.013010921888053417,
-0.0191578920930624,
0.017313793301582336,
0.05161753296852112,
-0.08130557835102081,
0.07199709117412567,
-0.07306474447250366,
-0.09330660104751587,
-0.031088102608919144,
0.09359609335660934,
-0.0016054115258157253,
0.08723790943622589,
0.031893689185380936,
-0.006826070137321949,
-0.05747625231742859,
-0.10690636187791824,
0.003068633610382676,
0.00874356273561716,
-0.011949705891311169,
-0.0019204533891752362,
-0.03370455279946327,
-0.034336939454078674,
-0.011859457939863205,
-0.0032985941506922245,
0.14559999108314514,
0.18965473771095276,
-0.11034820973873138,
0.04400311037898064,
0.06052759289741516,
-0.07490035891532898,
-0.29621586203575134,
0.0780896246433258,
-0.035478346049785614,
0.06887103617191315,
0.12909558415412903,
-0.05008549615740776,
0.10681286454200745,
0.041965533047914505,
-0.08375062793493271,
-0.00030299107311293483,
-0.20383977890014648,
-0.09842437505722046,
0.18549461662769318,
0.024247098714113235,
0.17405974864959717,
-0.08693832904100418,
-0.04330615699291229,
0.03293365612626076,
-0.2422410249710083,
0.03158365562558174,
-0.09227675944566727,
0.0660005733370781,
-0.03277093917131424,
0.07177554816007614,
0.028913425281643867,
-0.06867844611406326,
0.14741970598697662,
0.06365478783845901,
0.107621930539608,
-0.06872963160276413,
-0.10276006162166595,
0.09981513023376465,
-0.03687366843223572,
0.10693655908107758,
-0.01796450838446617,
0.0627979040145874,
-0.21176110208034515,
-0.01340978592634201,
-0.09873225539922714,
0.058467067778110504,
-0.035074785351753235,
-0.07936388999223709,
-0.08238385617733002,
0.07372793555259705,
0.04734012484550476,
-0.04796036332845688,
0.1096864864230156,
-0.09259205311536789,
0.1255180388689041,
0.11103443801403046,
0.0958208292722702,
-0.06834283471107483,
-0.02680741436779499,
-0.018563324585556984,
-0.04081200435757637,
0.06435982882976532,
-0.2277684509754181,
0.049604382365942,
0.15821081399917603,
-0.012563326396048069,
0.11615823209285736,
0.07978692650794983,
0.01557241939008236,
-0.015459000132977962,
0.10919524729251862,
-0.09117021411657333,
-0.09636519849300385,
-0.043708592653274536,
-0.12773370742797852,
-0.12327384948730469,
0.10766999423503876,
0.05777204781770706,
-0.027344780042767525,
0.01288653165102005,
-0.011318388395011425,
-0.029146168380975723,
-0.09054319560527802,
0.1621163785457611,
0.08383585512638092,
0.06345828622579575,
-0.09429934620857239,
-0.009142190217971802,
0.013578668236732483,
-0.02625722996890545,
-0.05477358400821686,
-0.0627073347568512,
-0.09994843602180481,
-0.09573526680469513,
0.04837075620889664,
0.19028444588184357,
-0.1215476244688034,
-0.05001485347747803,
-0.10237488895654678,
-0.16976100206375122,
0.05089060962200165,
0.15770596265792847,
0.17522530257701874,
0.08905824273824692,
-0.02111140824854374,
-0.07828280329704285,
-0.0123304333537817,
0.04210063815116882,
0.028482524678111076,
0.012496333569288254,
-0.18927080929279327,
0.05874595046043396,
-0.04655678570270538,
0.1097647100687027,
-0.09439767897129059,
-0.05618184804916382,
-0.15002340078353882,
0.05568995699286461,
-0.10700467228889465,
-0.07060785591602325,
-0.026485087350010872,
0.016642043367028236,
0.029583560302853584,
-0.1607639044523239,
-0.08981426805257797,
0.009099030867218971,
-0.15542000532150269,
0.0679716020822525,
0.016828758642077446,
0.13926680386066437,
-0.07209870219230652,
-0.04164455085992813,
0.06501441448926926,
-0.013990062288939953,
0.07522919774055481,
0.036181528121232986,
-0.02530127950012684,
0.0697561576962471,
-0.1975514441728592,
-0.0002520784910302609,
0.041874054819345474,
0.020786644890904427,
0.09030549973249435,
-0.023231178522109985,
0.025173133239150047,
0.050822533667087555,
-0.008335666730999947,
0.1106625646352768,
-0.055899377912282944,
-0.07940346002578735,
-0.0063276467844843864,
-0.030692074447870255,
-0.07081568986177444,
-0.008274131454527378,
0.01895962283015251,
0.11983619630336761,
0.034339789301157,
0.17803989350795746,
-0.032984185963869095,
0.02047034725546837,
-0.1917923241853714,
0.0002988779451698065,
-0.01863439939916134,
-0.1067948266863823,
-0.07496204972267151,
-0.08736048638820648,
0.03337480127811432,
-0.008125849068164825,
0.2644672095775604,
0.09385743737220764,
0.02814164198935032,
0.03473225608468056,
0.09338496625423431,
0.10211426019668579,
-0.003033047541975975,
0.13876815140247345,
0.06236843019723892,
-0.03317176178097725,
-0.0007474900339730084,
0.05769628286361694,
-0.044202737510204315,
-0.007667540106922388,
0.10420329868793488,
0.122642382979393,
-0.06292253732681274,
0.0607873909175396,
0.03396642208099365,
0.03413885459303856,
-0.10438914597034454,
-0.15900872647762299,
-0.05733417347073555,
0.04242900013923645,
-0.04532145336270332,
0.07464434206485748,
0.0498584620654583,
-0.15925492346286774,
0.05492427945137024,
-0.12645487487316132,
-0.03620615601539612,
-0.11590424180030823,
-0.11539269238710403,
-0.07853205502033234,
-0.15947553515434265,
0.03277824446558952,
-0.030214034020900726,
-0.026294494047760963,
0.13786359131336212,
0.05785130336880684,
-0.05955633893609047,
0.09150244295597076,
-0.10380040854215622,
0.02198713831603527,
0.03102235309779644,
-0.04517223685979843,
-0.03153786435723305,
-0.17756609618663788,
0.03553212434053421,
-0.03977729007601738,
-0.01991448737680912,
0.0011387786362320185,
-0.061659954488277435,
-0.08715231716632843,
-0.056548334658145905,
-0.07429496198892593,
-0.05999955162405968,
-0.009536173194646835,
0.02954285405576229,
-0.004182054195553064,
0.010495026595890522,
-0.005922358948737383,
0.07263157516717911,
0.007182107772678137,
0.19800105690956116,
-0.015512838028371334,
-0.07180242985486984,
-0.11965690553188324,
0.28164246678352356,
0.10491009801626205,
0.046957630664110184,
0.006665325257927179,
-0.0755222737789154,
0.031083768233656883,
0.2526550590991974,
0.265069842338562,
-0.0383916050195694,
0.04303636774420738,
0.022105496376752853,
0.004940706770867109,
0.03717474639415741,
0.04215860739350319,
0.018310001119971275,
0.11172061413526535,
-0.11069076508283615,
0.04122615233063698,
-0.04897556081414223,
-0.05802156403660774,
0.06639272719621658,
0.07879519462585449,
0.07928474247455597,
-0.05699586495757103,
-0.07160747051239014,
0.11640015989542007,
-0.07853198051452637,
0.0349910631775856,
0.1075534075498581,
-0.18436886370182037,
-0.13977748155593872,
-0.02343306876718998,
0.040196362882852554,
0.028454303741455078,
0.05117877200245857,
-0.0530572272837162,
-0.012691534124314785,
0.025597676634788513,
-0.002382892183959484,
-0.06895332783460617,
-0.1491134613752365,
0.15259915590286255,
0.044691044837236404,
0.18378810584545135,
-0.033477555960416794,
0.10298468172550201,
0.10175830870866776,
0.049176063388586044,
-0.04138786345720291,
0.048246584832668304,
0.05749720335006714,
-0.04675091430544853,
-0.019101843237876892,
-0.03585075959563255,
0.020559953525662422,
0.023334668949246407,
0.030170124024152756,
-0.15296845138072968,
0.09702782332897186,
-0.14682908356189728,
-0.05259551480412483,
-0.09968000650405884,
0.00975024327635765,
-0.004075609613209963,
0.09874775260686874,
0.15595659613609314,
-0.013680940493941307,
-0.01912878081202507,
-0.043469496071338654,
0.06899361312389374,
0.03795446828007698,
-0.07227922230958939,
-0.03150187060236931,
-0.13296248018741608,
0.03265029564499855,
0.007595859933644533,
-0.0282438937574625,
-0.1361066848039627,
-0.051301538944244385,
-0.01596156880259514,
-0.0288222823292017,
-0.0007311931694857776,
0.09933384507894516,
0.05913880839943886,
0.05524935945868492,
-0.02814059890806675,
-0.12638212740421295,
-0.02070733532309532,
0.11183053255081177,
-0.10454576462507248,
-0.08464203029870987
] |